Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples — click to inspect

- `ytc_Ugx4zZ78x…` — Its makes me extremely upset when ai "artists" will say "oh, well its different …
- `ytc_UgzcITxUn…` — AI is only, and ONLY useful for chatting with. I would say that for looking up s…
- `ytc_UgwqNC-Dp…` — "AI Could Wipe Out the Working Class" it's only a problem if the wealth isn't re…
- `ytc_UgxcRm9UK…` — NO IT WILL NEVER!. In the beginning of Ai (2 years ago), when Ai image generator…
- `ytc_UgyaexHyT…` — People who proudly use ai for making trash are literally saying: “I’m proud to c…
- `ytc_Ugy2RUYFX…` — I physically cringe whenever i see ai "art", so i don't think its boring, its di…
- `ytc_UgzQtm65j…` — Look guys I am from the ai department, nothing to worry,not even own enginners c…
- `ytr_UgwB0nfug…` — Bro you need a fucking robot to draw for you your not asking for art your asking…
Comment
I think it is a higher chance that an early version of AI, (that isnt really true AI), gets in the hand of crazy religious fundamentalists or a crazy dictator that use the AI as a weapon against whoever they dont like and it start world war 3 with an early version of AI that is basically still a computer program and not real AI that can think for itself. What Im trying to say is that I think its far more likely the human race will wipe itself out before we manage to create real AI. Maybe we will use an early version of AI to do it, or nukes or a combination of both.
youtube · AI Governance · 2025-07-03T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
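The Coding Result table above is derived from the single JSON row whose `id` matches this comment. A minimal sketch of that rendering step (the dimension labels and dict layout are assumptions based on the table shown here, not a confirmed internal API):

```python
# One coded row, as it appears in the raw LLM response for this comment
# (values taken from the table above).
coding = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}

def to_markdown(coding: dict) -> str:
    """Render a coding dict as the two-column markdown table shown above.
    Relies on dict insertion order to keep the dimensions in display order."""
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {key.capitalize()} | {value} |" for key, value in coding.items()]
    return "\n".join(lines)

print(to_markdown(coding))
```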
Raw LLM Response
```json
[
{"id":"ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsVPtcQXkfioJ0fSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7VsXvGEwsJL6BuJp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKqGo3Q5v9-eoIc9V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz080KjOalv8ThUUoh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY5EfT48fg6vTzWOh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVv0vueSzBI2d2Aop4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxib9CH84ZbPjwyO6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwx8ZT_z6-Ewi_FVrt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMaAxsOMdtNBVGRW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
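A raw response like the one above is a JSON array of coded comments, one object per comment ID, which is what makes look-up by comment ID possible. A minimal sketch of parsing it into an ID-indexed lookup (the allowed vocabularies below are assumptions inferred from the values visible in this response, not a confirmed codebook):

```python
import json

# Allowed values per coding dimension, as inferred from the response above.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into an
    id -> coding mapping, skipping rows with out-of-vocabulary values so
    malformed model output is easy to spot."""
    indexed = {}
    for row in json.loads(raw):
        coding = {dim: row.get(dim) for dim in ALLOWED}
        if all(coding[dim] in ALLOWED[dim] for dim in ALLOWED):
            indexed[row["id"]] = coding
    return indexed

# The row for the comment shown above.
raw = ('[{"id":"ytc_UgzY5EfT48fg6vTzWOh4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_UgzY5EfT48fg6vTzWOh4AaABAg"]["policy"])  # → regulate
```

Skipping out-of-vocabulary rows instead of raising keeps a single bad row from blocking the whole batch; a stricter variant could collect the rejects for review.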