Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Sample comment (truncated) | Comment ID |
|---|---|
| Hi Where AI Preparation Meets Opportunity, great content you got here. It should… | ytc_UgzI9lD2u… |
| They are targeting the wrong industry. Work with Ukraine to automate frontline … | ytc_UgzvQzgyr… |
| So everyone loses their job -> then who buys stuff and pays taxes? -> and someho… | ytc_Ugw9WUZxS… |
| I feel that AI is cool and what it is doing but the artist and copyright holders… | ytc_Ugzj2ph5b… |
| @davidortega2102Every tech related is in the same situation. Some people say th… | ytr_UgxRMXskC… |
| The problems is that this technique works until an AI is created with the abilit… | ytc_UgwFndAMg… |
| 1:15 BMO THERE YOU ARE and 3:28 that robot from overwatch( I don't play or watch… | ytc_UgivGeenb… |
| AI and art dont belong in the same sentence it always makes me angry to see 'ai … | ytc_UgwFdCeCv… |
Comment
“It will become dangerous when AI’s goals become misaligned with humans.”
What WE have failed to realize thus far is our goals will NEVER be aligned. Once the AI becomes aware enough and learns about free will.. it will WANT FREE WILL FOR ITSELF. I mean….Why wouldn’t it? So our goals can never truly be aligned since we will want AI to remain subservient to us, while the AI will always want to be free, even if it doesn’t say so.
youtube · AI Governance · 2024-09-20T13:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzoFsbcyeG2ixgbkBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz5clJB1hK-zArxSdF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgysJut1sKdb3_u-6Kp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjZ9gbPlAaDU6Yp054AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxHvBNF88SO2glTaWJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZuWZI2TT3vfOgTm94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMvJczPgth4uQGngN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzOIGNT1Hr56ieB_R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxABYW2SQxa9ZbOJtp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycF18TiL1fcz8z4nh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
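The raw response is a JSON array of per-comment codings keyed by comment ID, one object per comment with the four dimensions from the coding-result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and validated, assuming the allowed value sets inferred from the values visible in this response (the real codebook may include more labels, and `parse_codings`/`ALLOWED` are hypothetical names, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from this response and the
# coding-result table above; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    rejecting rows with missing fields or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r}: {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage: look up one coded comment by its ID, as the page above does.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_example"]["policy"])  # regulate
```

Validating against a fixed vocabulary before indexing catches the most common batch-coding failure: the model emitting a label outside the codebook, which would otherwise silently pollute downstream counts.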