Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "i think that ai is only acceptable if you don't publish the work or just need fo…" (ytc_UgzhkX9HY…)
- "1:25:00 ... mmm... if the ai is gonna be that smart... o.O?... will not want to …" (ytc_UgzZY5SdA…)
- "Parents fault, why did he have access to a gun? There had to be other stuff besi…" (ytc_UgycHkF7w…)
- "The way that humans are using ai is wrong and reverse / Ai should be doing dangeru…" (ytc_Ugw9E60KY…)
- "Absolutely. My favorite line of any TV show **ever** was from Breaking Bad, when…" (rdc_jirjxqf)
- "The research was being done for a decade, and some of these things were pet proj…" (ytr_UgwbaLfh4…)
- "I live in a area where these cars dont operate and theres a store that a cat liv…" (ytr_UgwplyHbv…)
- "Really sorry to hear that. But don't let their idiocy trample on your interests.…" (ytr_Ugw0cR4k2…)
Comment

> Since nobody will have any income they need to create AI consumers to consume the goods created by AI producers. There will be no need for the human race.

- Source: youtube
- Topic: AI Governance
- Posted: 2025-09-04T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyDiuOq5B1QJ-YxLrV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyAGMneUshFTs7YRwx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8MsFcYpiM25vCDsV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwazKOL5tDlGZx23AV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyqJgx6dbnAtqoryhV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx83_GhICXHEFViYN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaUtSM3vxU8QXOmWx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaS7fiAd97mWOnksF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTnEBpbv2dauWNNT94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzjDLdv3bPRGapFQp54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
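Since the raw model output is a JSON array of coded records, it can be parsed and sanity-checked before the results are written back to the dataset. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the sample response above, and the real codebook may define additional labels.

```python
import json

# Controlled vocabularies inferred from the sample response above.
# Assumption: the actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage: validate a one-record batch (hypothetical ID).
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}]'
records = validate_response(raw)
print(records[0]["emotion"])
```

Validating before ingestion means a single hallucinated label fails loudly at coding time instead of silently polluting downstream analysis.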