Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugy9kuRk8…: "When making AI it only reveals the biases we have. Ever wondered why most Voice …"
- ytr_UgxLYDeQJ…: "@GingerDrumsthese people, the host, Hao, the comments section, all seem incapabl…"
- ytc_UgyYce9xI…: "Junior Ai bots does not learn from experience or get better. It doesn't seek for…"
- ytc_UgwgQP0KH…: "they need to make sure to glaze nightshade each post they make so that it doesn’…"
- ytc_Ugxmj87Ht…: "seems to me like AI is way more important than most of us. I want nothing to do…"
- ytr_UgwAtzZkz…: "In my opinion, Detroit is an overly dramatized, polarized example, with some sem…"
- rdc_fcspex1: "Artificial Intelligence is defined as \"the theory and development of computer sy…"
- ytc_UgxSrvfID…: "Im just gonna stick to being a writer, I use AI sometimes for art for my charact…"
Comment (youtube · AI Harm Incident · 2025-09-11T14:5…)

> The way we built society we will kill ourselves and destroy our planet and run out of ressources anyways so something logical like AI will definitely see humans as a problem eventually and some form of culling/control might occur to make things better and more substainable.
> Seems pretty obvious to me.
> We have the intelligence but we also have things like greed that controls the world so we can't go anywhere besides extinction on our own lol
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgypIciSc7v0T9B67T14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgylEkzFm5d0fWVBMR54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-RvQR--iURYFplrB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsOoYNThWi0esf7ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyWsj_ehAISGKDRQQd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMEYpWI63mwhoRhH14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwdU27T0rB1GX4f94J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwjQ_ZAMnSBVIo_jJ14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwh6Em3w3TskO2W5vl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyE-z4cXO_PKtt_uEB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
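A response like the one above can be sanity-checked before the codes are stored. The sketch below is a minimal validator: the allowed value sets are inferred from the codes visible on this page, not from an official codebook, so treat them as an assumption.

```python
import json

# Allowed values per coding dimension (ASSUMPTION: inferred from the
# codes shown on this page, not from a published codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate(records):
    """Return (comment id, dimension, bad value) triples for out-of-codebook codes."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append((rec.get("id"), dim, value))
    return problems

# Hypothetical single-record payload in the same shape as the raw response.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
print(validate(json.loads(raw)))  # → [] when every code is in the codebook
```

An empty result means every dimension of every record carries a recognized code; anything else pinpoints the comment ID and dimension that need a re-code or a codebook update.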