Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- Ai art doesn't create anything new. It doesn't "learn", it just collects data to… (ytc_UgwDNX_ec…)
- ChatGPT is dangerous not because it’s smart, but because it’s actually so dumb a… (ytc_Ugw-7HfR3…)
- Ai can only do what it’s it is trained to do. Not new original things. / But t… (ytc_UgzTDblOM…)
- Oh yeah, the idea that AI chatbots that can and have been programmed/trained for… (rdc_mtosdf5)
- "AI is accessible, unlike real art" / My brother in christ, you have google. / If yo… (ytc_UgwIPsOF_…)
- This isn't autonomous driving problem, it is a jay walking problem. Outcome wou… (ytc_UgxWbQwzY…)
- Hey Stephen! / Thanks for stopping by Reddit for an AMA! / In recent interviews y… (rdc_cthnptq)
- @ that’s a pretty good test to see how much it really poisons the AI. This, how… (ytr_UgxrrcPAO…)
Comment (youtube · AI Harm Incident · 2025-12-13T17:0…)

It’ll either be a Utopia with everyone getting a UBI and having total freedom to pursue their passions while A.I. robots do all the menial work… OR no UBI and millions are plunged into poverty. The 1% will keep all the money generated by A.I. and use A.I. robots to first control the masses and then ultimately eliminate them after 99% of humanity are deemed a burden. Then a handful of Trillionaires will be the only ones on the planet left while A.I. does all the production and caters to their every need. And if Karma is real, then A.I. will turn on those rich humans and either enslave them or wipe them out as well. Have a great day! 😂
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxb7tJ-sMdSAwrZqwJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwTQckiTnX6OJxNcNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVMTmOIo0vcpZplaN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUiPqCOIe2sSbPNxh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw3qo9C62pT2A4Hghh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwT6NTJiLipNHxCl214AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzWto9Q6iKgd6Q33ch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7kW1kLTtxnX6Y-Vd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9-slKWjKtYLiqouJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFV4vXfWtNHnpb0pp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
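The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table. The ID lookup this page offers can be sketched in a few lines of Python (a minimal sketch; `index_by_id` and `REQUIRED_FIELDS` are hypothetical names, but the field names and comment IDs are taken verbatim from the response above):

```python
import json

# Two rows copied from the raw batch response shown above.
raw_response = """[
 {"id":"ytc_UgyVMTmOIo0vcpZplaN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugy9-slKWjKtYLiqouJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]"""

# Every coded row must carry the ID plus the four coding dimensions.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index each coding by comment ID."""
    index = {}
    for row in json.loads(raw):
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} is missing fields: {missing}")
        index[row["id"]] = row
    return index

codings = index_by_id(raw_response)
print(codings["ytc_UgyVMTmOIo0vcpZplaN4AaABAg"]["policy"])  # regulate
```

Validating the required fields up front means a truncated or malformed model response fails loudly at parse time rather than surfacing later as a missing dimension in the coding table.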