Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "The first part will be intercity between distribution yards on the edge of the u…" (ytc_UgyVQR7me…)
- "Perhaps the AI pointed out that hitting the survivors was a war crime, and it to…" (ytc_Ugzcqt6w6…)
- "Not a promo lol but Imagine bo has been insane for me. I made a full landing pag…" (ytc_UgysKsRzd…)
- "I've found that AI won't do all the work for us, it just assists us, forcing us …" (ytc_UgzCMN4ws…)
- "Please Step away from the vehicle you have 15 seconds to comply... Before we cal…" (ytc_UgwDtWYQE…)
- "a UBI would have to gradually be introduced, as more and more jobs get removed b…" (ytc_Ugy4ch0ld…)
- "BS! If your robot can't lift a small object autonomously, what the heck is this …" (ytc_UgwR_ll_2…)
- "Musk is obsessed with his mortality. He thinks AI can eventually be his ticket t…" (ytc_UgzGBSQPR…)
Comment

> Remember, AI has no emotions, it is not self conscious, it is just a machine that can transfer and read huge amounts of data and make "good calls" based on statistics. Every thing that appears to be emotions and self consciousness will always fall down to that the AI deems a statement to be the best response to look smart &/ human. He has worked on these things his whole life and sure enough he has good points in where things are heading, but man is he wrong about some things. Just like any person who has spent their life focusing intensely on mostly one thing. Very interesting interview nevertheless.

Source: youtube · AI Governance · 2025-06-26T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxmshTHxVh8OyVNEkB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWbeTk4dfcWqSb1ap4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzy2szSkyFDs05C5tx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzpltM8KFH4yh4z2Tl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyN19pErKRPGY9pUON4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywjlPSsa1qWlWgaZ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwbFbGrm5bYedF8hG14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz3pDP54ZzizBOd1dl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwCJITD6ih9my1Dde14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx2r40U4Nvvq7Q-CMt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
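A minimal sketch of how a raw batch response like the one above can be parsed into per-comment codes. The dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) match the JSON keys shown; how the surrounding tool actually ingests these responses is not shown here, so this is only an illustrative parser, not the pipeline's implementation.

```python
import json

# A single entry copied from the raw response above (the one that matches
# the "Coding Result" table: responsibility=ai_itself, etc.).
RAW = """[
  {"id": "ytc_UgwbFbGrm5bYedF8hG14AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "deontological",
   "policy": "unclear",
   "emotion": "indifference"}
]"""

# The four coded dimensions, as they appear in the JSON keys.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Map comment ID -> coded dimensions, skipping entries that are
    missing any dimension (LLM output is not guaranteed well-formed)."""
    out = {}
    for entry in json.loads(raw):
        if "id" in entry and all(d in entry for d in DIMENSIONS):
            out[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return out

codes = parse_codes(RAW)
print(codes["ytc_UgwbFbGrm5bYedF8hG14AaABAg"]["responsibility"])  # ai_itself
```

Guarding with `all(d in entry ...)` lets a partially malformed batch still yield the valid entries instead of failing outright, which matters when the response comes straight from a model.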