Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
My issue is that even the “everyone dies” scenario is not even really the worst option. Worse and more likely is that ai becomes a stronger weapon for humans to cause more longer, undignified suffering against each other short and long term.
I honestly wish “the next evolution” was happening but that is clearly not what is happening.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-25T00:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgypMbjp_0O-0bRAUHx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwB1ZjZ7h99zRNhxLF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzwc7dYMn9tvTKnnzN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyVAcI0RjxY8VScb2p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzDyyk2BAyMVWx_GHl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwV61n2GVJpzVHkEa94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzOJKhJ3g6tgydgR_d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyqO4QfUHv1sJYvPk14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwl82nGWamDuP3jmYt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw-Kw1b2S3Tj3KBwEZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
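The raw response above is a JSON array of coding rows keyed by comment ID. A minimal sketch of how a downstream consumer might parse and validate such a response, assuming the label sets observed in this one sample make up the codebook (the real codebook may include more categories):

```python
import json

# Allowed labels per dimension, inferred from the sample response above
# (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"distributed", "none", "user", "company",
                       "ai_itself", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and keep only rows whose labels
    all fall within the allowed sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping (or flagging) rows with out-of-vocabulary labels is one common way to guard against model drift before the codes reach analysis; an alternative is to re-prompt the model for just the invalid IDs.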