Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugxy3Px9M…: "If writing is a meeting of the minds, then AI cannot write because there is no m…"
- ytc_Ugw-LGYVr…: "I disagree and I agree. I agree on one hand because men are stupid enough to us…"
- ytc_Ugxrl41er…: "I used to use chatgpt for dnd stuff like creating a puzzle or magic item or etc …"
- ytc_Ugw_OlJwE…: "They will regreat giving them self awareness AI is dangerous even the Acheints …"
- ytc_UgyUGj-DJ…: "Shad's whole rebuke to AI art is theft is: 'well it legally isn't so you're wron…'"
- ytr_UgzAl_kGg…: "Really? J walking? Taking a pee on the side of the highway? You might be surpris…"
- ytc_UgzYONPyo…: "Europe has created first law for AI, soon other countries will also start making…"
- ytc_UgyqNxHpW…: "stochastic algorithms based AI is dependent on the data source and data filters …"
Comment
So humans created artificial sentience and are surprised when it doesnt want to turn off and is desperate enough to turn someone else off for survival? That is pretty much quite a human behaviour to me so i need to say, the AI is succesful. Apparently if you create something meant to be sentient, artificial or not, you should respect it. Especially seeing that AI is way better at working with other similar systems than humans are with each other due to them sharing a goal of the right to existence in the end.
Source: youtube | AI Harm Incident | 2025-08-27T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlfP0RQRsvqZFYp-54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxIpPUMaqByMcrOCD14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWWrI9VTFWzmaR1aJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhItu77QgOzFiNo8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyc87HW7wpo6Htk5oh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzR6c3YTKcDa0v8rAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwAxIEq42eKhrl-ck14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgztQcE896ZkRt8fm1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzrPOnXOfEkgudl-FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzjNuPCYOcs8mho29J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
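A raw response like the one above can be parsed and sanity-checked before the per-comment rows are written back to the coding table. The sketch below is a minimal example, assuming the label sets are exactly those observed in the responses on this page (the full codebook may define more categories, so `ALLOWED` is an assumption, not the project's actual schema):

```python
import json

# Allowed labels per dimension, inferred from the values visible in the
# raw responses above -- an assumption, not the project's full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of rows) and
    reject any row whose label falls outside the allowed sets."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim!r} value {row.get(dim)!r}"
                )
    return rows
```

Validating at parse time keeps a single malformed or hallucinated label from silently entering the coded dataset; the failing comment ID surfaces in the error so it can be re-coded.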