## Raw LLM Responses

Inspect the exact model output for any coded comment: look one up by comment ID, or pick a random sample below to inspect it.
- "What comes to mind: 'The abomination that causes desolation'. From the bible wh…" (ytc_Ugzka0ORu…)
- "No they don't. It's just marketing. There is a good reason that you only show me…" (ytc_Ugx5-_J06…)
- "Seems super AI might want to stop humans from doing a nuclear armageddon, which …" (ytc_Ugxp927l8…)
- "in my opinion, if AI is extremely intelligent , that is because of the mistakes …" (ytc_UgyL797_M…)
- "Use AI for Art and look cool ❌ Use AI art and give artists an idea what to draw…" (ytc_UgxEygOlO…)
- "As someone that was a woodworker until AI taught me to develop software, I’d lik…" (ytc_UgzCgzyi_…)
- "Honestly it’s ridiculous to me how people can so strongly defend AI art when the…" (ytc_Ugxufs5L7…)
- "Ai was supposed to enhance human, just like computers and automobiles did. Inste…" (ytc_UgyRDrNUC…)
## Comment

> Okay, there's some massive, MASSIVE ethical implications that folks are just handwaving when it comes to AI (and other related tech). That issue is that there is no precedent for dealing with a liable AI, and it's not just law, humans have a natural, intuitive understanding of liability that is entirely incompatible with whatever AI is. It's not human, so you can't treat it as such, but it's not natural either.

Source: youtube | AI Harm Incident | 2025-11-25T04:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgwM8Rf2bzAk21_X9D54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzCO37ZjvwkvkveZ0J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwV3qKWo_l4c8N_RfB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyd2TXCC3rqtE9EsLR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyzKsvnGidaaYX036d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzs8bW_8EMCwD26mdJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwwdvtPskdAIdsM6m14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyJxC8A_azgz32WgYd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzpdGnq5Hm6kJDr-QJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
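Since the raw response is just a JSON array, looking a record up by comment ID (as the inspector above does) amounts to parsing the array, validating each row, and indexing by `id`. A minimal Python sketch, assuming the dimension vocabularies match the values seen in this sample (the `ALLOWED` sets and function names are illustrative, not the pipeline's actual codebook):

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- an assumption; the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference",
                "approval", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, dropping rows that fail validation."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        if not cid.startswith("ytc_"):
            continue  # skip rows without a usable YouTube comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the sample response above:
raw = ('[{"id":"ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg",'
       '"responsibility":"distributed","reasoning":"contractualist",'
       '"policy":"liability","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg"]["policy"])  # liability
```

Indexing by ID also makes it easy to spot comments the model skipped or coded with out-of-vocabulary values, which silently disappear from `coded`.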