Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzlJ5Ori… "Not once have I heard these tech giants discuss the consequences of automation r…"
- ytc_UgytDv0l4… "There technically it isn't a “Self Driving Car” it makes me mad when the news ta…"
- ytc_UgxNA8KhD… "So let me get this right. This guy first invents AI and became rich thanks to it…"
- ytc_UgxNaV5Ey… "There are a portion of the jobs that require doing a large variety of physical t…"
- ytc_UgwRj5Kvp… "if so then we don't need the working class. just shake down ai companies for mon…"
- ytc_UgwwOXUJH… "Every fictional device in film eventually becomes non fiction / Star Trek - hand…"
- ytc_Ugx8y9xuj… "In my experience, the AGI and AI personhood debate is moving from theoretical to…"
- ytc_UgxltFGgt… "The whole story about an AI considering "blackmailing" was a setup experiment, i…"
Comment
it will end up on best statistical chose by Ai and we all know human are not united to save the world. Ai will decided to save the world and make stay without problems or let me say uncertainty they will chose over humans at one point and that will be close when they can exits without the need of human to exist. Or we need to interate with AI and became a cyborg
youtube · AI Governance · 2025-07-07T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
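The four coded dimensions above can be carried around as a small typed record rather than loose strings. A minimal sketch, assuming the dimension names and example values shown in the table (the `CodedComment` name and the `ytc_example` ID are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedComment:
    """One coded comment, mirroring the four dimensions in the result table."""
    comment_id: str
    responsibility: str  # e.g. "ai_itself", "developer", "government", "none", "unclear"
    reasoning: str       # e.g. "consequentialist", "virtue", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "outrage", "indifference", "mixed", "approval"

# Hypothetical example matching the Coding Result table above.
sample = CodedComment("ytc_example", "ai_itself", "consequentialist", "unclear", "fear")
print(sample.emotion)  # -> fear
```

A frozen dataclass keeps coded records immutable, which is convenient when the same record is shared between the lookup view and the sample list.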
Raw LLM Response
[
{"id":"ytc_UgyM9-GV9ylQkvnoe5V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3JjbcO9WSiZAyd094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjoXI3vrWhdcxLmKx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhoasHfaoJk4JL7RV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynsclRVW5hzrQx7Wt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw4P-EQI6itlPf61rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzc27kJRGbkMdNWwJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwBnnotCe9soiC20sJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwuAjmm9gSJOVIrdIN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw754Q5AI5T09ZB1dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
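The raw response above is a JSON array of objects keyed by comment ID, so looking up a single coded comment reduces to parsing the array and building an index. A minimal sketch, assuming the same four-dimension schema as the response shown (the abbreviated IDs `ytc_a`/`ytc_b` are placeholders, not real comment IDs):

```python
import json

# Two placeholder records in the same shape as the raw LLM response above.
raw = '''[
 {"id":"ytc_a","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_b","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

records = json.loads(raw)
# Index by comment ID so "Look up by comment ID" is a dict access.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_a"]["emotion"])  # -> outrage
```

If the model occasionally returns malformed JSON, wrapping `json.loads` in a `try/except json.JSONDecodeError` and flagging the batch for re-coding is a common safeguard.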