Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yeah I was gonna say, not only can they not stop governments using this tech, in…" (rdc_ekte4ux)
- "12:27 you would need to ask the same iteration of ChatGPT to get a confirmation…" (ytc_UgypXSwHF…)
- "Go debate Matt Dillahunty ~ as a secular humanist myself, I made Chatgpt confirm…" (ytc_UgxaYWi8F…)
- "@josephpublico2337 exactly nukes came regardless of what people wanted so will AI…" (ytr_Ugy3Y5VCH…)
- "It is a bit sad, but right now it is very true. Yet nobody cares what will happe…" (ytc_UgwHa2OqL…)
- "I don’t understand how Ai could apparently get so smart that it figures out all …" (ytc_UgzHi0fOG…)
- "AI is the enemy of humanity because it will replace human jobs. No more AI, stop…" (ytc_UgyIYD9Kt…)
- "We are all experimentalists. We generate a lot of bullshit (one word at a time) …" (ytr_Ugz9JDd_x…)
Comment
This is a nightmare. I don't trust AI driven trucks not to avoid an accident.
If AI takes all our jobs, what are we supposed to do?! If none of us have jobs, how are we supposed to buy anything? These billionaires are so shortsighted.
What happens when someone hacks the system to tell trucks to go to a remote location and turn off all their cameras so they can rob it?
youtube · AI Jobs · 2025-12-12T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw-8YuNsTPBLC4edit4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzd88YejqqdqM7vgud4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwxhEm8AKDNAPGWntB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyz52SwFy9ACDoarXB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwHBCxhaA261dkn2p14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyG80U31DnGI2JOrJd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzUQt8Ds0HLkpXwnc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwU2ZTLFaO9Y6q-LlZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFfZ0-cO5RyJ4XSkh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8PCV5ZI0GKHIF4aF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
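The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above. The "look up by comment ID" step can be sketched roughly as follows; this is a minimal illustration, not the tool's actual code — `index_codes` is a hypothetical helper, and the two sample rows are shortened copies of entries from the response above.

```python
import json

# Shortened copy of the raw batch response shown above (two entries only).
raw_response = """
[
  {"id": "ytc_Ugw-8YuNsTPBLC4edit4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzd88YejqqdqM7vgud4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch response and index each row by its comment ID,
    skipping malformed rows that lack any of the coding dimensions."""
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row[dim] for dim in DIMENSIONS}
        for row in rows
        if "id" in row and all(dim in row for dim in DIMENSIONS)
    }

codes = index_codes(raw_response)
print(codes["ytc_Ugzd88YejqqdqM7vgud4AaABAg"]["policy"])  # → regulate
```

Indexing by ID like this is what lets the inspector jump from a clicked sample straight to that comment's coded values, without rescanning the whole batch response.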