# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "All the disenfranchised and unemployed will organize to set up a second economy,…" (`ytc_UgyPM7vnU…`)
- "That robot is creepy and the fact that they know how to shoot a gun is even cree…" (`ytc_UgxC1Uczr…`)
- "Bullshit, 99% will be replacing the military with killer robots run by an rogue …" (`ytc_UgxkKJ12o…`)
- "The first one said “mr bombastic side eye bombastic criminals side eye offensive…" (`ytc_UgzH8ytPO…`)
- "What everyone doesn't see is that Ai can't advance because people don't want peo…" (`ytc_UgzR_aXDx…`)
- "LLMs have exhausted the entirety of the internet and any written works accessibl…" (`rdc_nxpsx4y`)
- "The fact that AI is created by people like this interviewer is why you should be…" (`ytc_UgzKnxSXL…`)
- "AI is absolute trash🤦♂️ all they did was make technology more addictive, especi…" (`ytc_Ugxe9D3dA…`)
## Comment

> This is a fascinating conversation, that has really got me thinking. If we could determine and agree on moral boundaries of AI use, that would be a start, but that only works with honest people. The disaster happens with greed and power-obsession. There are plenty of powerful people who would clearly want to take advantage of the worst scenario. I fear for my heirs’ future more than my own.

Source: youtube · Topic: AI Governance · Posted: 2025-09-05T14:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugx6gGG7FzPhOAlXoK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBRXPJ8LUMzuym8MJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyiJoxDWDUT03Yfuo14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwvwvXPzBCNK2No5uZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYVYrd6IzUbrdsaDJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy-ZkoADoJRVCBxf9h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz3gmEyCQ6_dGsxHJ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgygpZ1ETacGeO0Q75N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_UgzOYM-l3ccsmedjnh54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybUXgbCiC3ZssaPKh4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
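A raw response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal, hypothetical validator, not the tool's actual pipeline: the four dimension names and `id` field come from the response shown here, but the full label vocabularies are an assumption extrapolated from the values visible in this sample.

```python
import json

# Label vocabularies per dimension. These sets only cover values observed
# in the sample response above; the real coding scheme may define more.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "virtue", "contractualist"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "mixed", "fear", "outrage",
                "sadness", "resignation"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return only the rows that have an id and in-vocabulary labels."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than crash the batch
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage with a single well-formed row (hypothetical id):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]')
print(len(validate_coded_batch(raw)))  # 1 row passes validation
```

Filtering out-of-vocabulary rows instead of raising makes the batch robust to occasional hallucinated labels, at the cost of silently dropping those rows; logging the rejects would be a natural next step.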