Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- ytc_UgzT4z7xh…: "I bet in 24 months AI will still be like a 🦧 F’n a 🏈 and Geoffrey will be 🪦…"
- ytc_Ugwf4quSr…: "Why are we doing this, beside corporations trying to save money by firing their …"
- ytc_UgwRYorWR…: "ATTENTION K-POP AND EYEKONS! There are some creeps using AI to create n@ked/ina…"
- ytr_UgwEQ3D7d…: "I appreciate your thoughts! While Sophia does have preset responses, the interac…"
- ytr_UgwnngKDz…: "I'm surprised no one has called out this BS of an AI defending rant. But then ag…"
- ytc_UgzEEfZLC…: "Dont fall for this. The Kissinger of France Jaques Attali wrote in his 2006 book…"
- ytc_UgypSCDPA…: "Yeah i got indoctrinated into believing artists are a waste of space. Then i st…"
- ytc_Ugzaze8nQ…: "why tf would you give it the option to essentially decide when it doesn’t want t…"
Comment
This was a good video but both the title and thumbnail image are grossly misleading. That brain-scan never even shows up in the video as far as I can tell, and DeArrow renames it to "Case Study of Bromism Due To AI Suggested Diet" which is way more accurate. There was not a single thing in the video about literal "brain cooking" the way that image suggested, which was good, because it was anti-cooked-brain squeamishness that kept me from clicking through before to begin with.
youtube · AI Harm Incident · 2025-12-18T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrMzrM1Ry7fQrudf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7NvcGperRvEBEE_14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkaaJgUOQ-s4d_gpR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyR4tJR38wyxuvZUvd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgydSDdXHwUO_PwRKDx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNzpK7RNJErkNXGjt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbaOliPlbUm8oz4ZN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxV9D6ntJ1w3Jgd8ot4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4izSyDEfw6WTsj-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugw8go61NJrpBPJKI8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
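The raw response is a JSON array of per-comment codes, one object per comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and sanity-checked before display (the allowed label sets below are inferred only from the values visible on this page; the actual codebook may define more):

```python
import json

# Allowed labels per dimension. NOTE: these sets are inferred from the
# values seen in this document, not from the real codebook (an assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "fear",
                "resignation", "disapproval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only records with an id and known labels."""
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue  # a record without a comment ID cannot be matched back
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example record, mirroring the shape of the array above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(parse_codes(raw))  # the one valid record survives
```

Filtering rather than raising keeps a single malformed record from taking down the whole batch; rejected records would simply show up as uncoded in the dashboard.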