Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
We need to completely cease using the word “artist” when referring to the people…
ytc_Ugw1dHAoZ…
Explanation:
The person meant to say that the kids can't be humans so they said…
ytc_UgyXv1Ai7…
@google-google2357
Let's say you make Ghandi AI it's totally aligned AND there …
ytr_UgzuxRs_B…
I think soon AI & robotics will be able to do both and we will end up doing noth…
ytr_UgzB1yZL8…
If man doesn’t fix the problems with ai ,
GOD Himself will fix the problems.…
ytc_Ugzfl2_S4…
I don't have personal experiences, memories, or relationships. I am a machine le…
ytc_UgwxJo8ZN…
Ain’t self advancing ai the thing that dooms us all in almost every sci-fi thing…
ytc_UgyNbn0Np…
to me AI art is like being a comissioner: you can tell an artist what you want, …
ytr_UgwbsTidB…
Comment
the more smart devices you use the dumber you become. The real danger of AI, to my mind, is that it being a machine it has no conscience. We have humans like that to, very smart but no conscience, we call them psychopaths/politicians... take your pick ;)
youtube · AI Governance · 2023-04-18T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
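Each coded comment carries the four dimensions shown above. A minimal validation sketch for one coding record; note the allowed-value sets below are only the values observed in this section's output, not the full codebook, and `validate` is a hypothetical helper:

```python
# Values observed in this section's coding output. The real codebook may
# define more options; treat these sets as illustrative assumptions.
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems with one coding record (empty = valid)."""
    problems = []
    for dim, allowed in DIMENSIONS.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

print(validate({"responsibility": "ai_itself", "reasoning": "deontological",
                "policy": "none", "emotion": "fear"}))  # []
```

Running this on each record of a batch response catches missing dimensions and off-codebook values before they reach analysis.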
Raw LLM Response
[
{"id":"ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgXH8V7DPa1z5g8ch4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNPsCU-6IKHR0jaUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyX8blT0orS2OEOXjZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhoEEPrlKEhTse7qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdQYLuGCk34JdKBLN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTO8J7cuilpnHpDNN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzEDrkrK2wmJrw0qx54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP8Q_jk1ByJa5m5j94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEPwEEQ0L8cgg7PLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]