Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzsNtaq2… — "It's simply npc type people repeating what tech grifters tell them to repeat to …"
- ytc_UgyJlpVCb… — "Mononoke San would hate Shad’s guts. If anyone gives a single thought about the …"
- ytc_UgzwTIQKm… — "If robots ever get smart enough to demand their own rights, I think it would pro…"
- ytc_UgwEJhOIN… — "Someone needs to get AI to copy Metallicas artwork and music. Lars Ulrich will g…"
- ytc_UgxzJHW4z… — "The irony with the OpenAI situation is that Elon was scared of Google monopolizi…"
- ytc_Ugz2d-u_L… — "It's fake , This Robot just a production robot they have not so much power to pr…"
- ytc_UgwYjiB4y… — "Maybe this was the best use of AI-art all along: a way to easily create \"story b…"
- ytc_Ugxe3i3-I… — "Would you consider that the robot troops depicted in the Star Wars prequels were…"
Comment
I haven't watched the whole video yet but everyone misses a big risk. What if Ai gets seemingly good enough to replace most jobs, all industries pivot and it turns out Ai is a disaster that turns to garbage, perhaps it trains itself on itself and becomes useless and all the industries implode, collapsing whatever society we have at that point even if we get safety right with no time or plan to pivot back with ease it was to get there with the future benefit. Human ingenuity in disasters is perhaps enough but that would be one massive global disaster.
| Source | Category | Posted |
|---|---|---|
| youtube | AI Governance | 2025-09-05T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwtMZ498dGVfo_bcHd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwn4LMAaKJFfknwwI54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQ_XSNGfoAHITRtKV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyUAZPnKQPOmFODa_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyyrIbaG3NGt7l-Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugysq7uIQRYlKcYFRO94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzQAc0zUuP3Vz60qzZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzKf2QovsHBfISLLsp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx9QPKaz0BPMgRvwT14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
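A batch response like the one above can be parsed and indexed by comment ID for lookup, with each row checked against the coding vocabulary. The sketch below is illustrative only: the `ALLOWED` value sets are inferred from the sample output shown here and the real codebook may permit additional codes; `index_by_id` is a hypothetical helper, not part of any tool shown above.

```python
import json

# Value sets inferred from the sample response above (assumption: the
# real codebook may allow more values per dimension).
ALLOWED = {
    "responsibility": {"none", "government", "user", "developer",
                       "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability",
               "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference"},
}

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    rejecting any code outside the allowed vocabulary."""
    indexed = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        indexed[row["id"]] = row
    return indexed

# A one-row payload in the same shape as the response above.
raw = ('[{"id":"ytc_Ugysq7uIQRYlKcYFRO94AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')

codes = index_by_id(raw)
print(codes["ytc_Ugysq7uIQRYlKcYFRO94AaABAg"]["policy"])  # liability
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse, then O(1) retrieval per comment.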