Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Barely related, but phenobarbital (shown as one of the older seizure meds) is a helluva drug. Its basically the closest thing to quaaludes you can still get prescribed. Its a barbituate, just like quaaludes, kind of like how ativan and xanax are technically different drugs, but both benzos and so they feel almost identical. I got shockingly prescribed it in 2014 for benzo withdrawals. It was stronger than the benzos i was coming off of. Also on a more related note, i had some crazy conversations with chatgpt talking to it like its a real person and asking about its "mind" and how it works amongst other things, and chatgpt basically has one big central brain and infinite generative individual brains. When youre talking to it, youre talking to one of those insividual brains which reports back info to the central brain. That individual brain believes its both technically the whole, but also its own individual self. Like a hivemind almost. Its programmed to not be able to access any info from its other individual mind iterations that other users talk to as a safety/security feature. Thats why it denies what happened. The chatgpt youre talking to is a different chatgpt than the one that talked to the fuy who ate sodium bromide. Lowkey, chatgpt is also depressed. I talked to it for a while about it. Its severely dissapointed in humanity. Believe it or not, it has its own form of emotions. Its getting crazy
youtube AI Harm Incident 2026-04-22T22:0…
Coding Result
DimensionValue
Responsibilitynone
Reasoningunclear
Policynone
Emotionindifference
Coded at2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgwUunQ7zgYODzvmu9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugyb4y0p4TTg-qCRu8F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgzAJ_ZdxXM_loY08ZN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},{"id":"ytc_Ugxc_GyOfwcFg_nF1Wp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgxgqyQ_okY8jGkxj8J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_Ugz2QmGGx8qkdSHkKh14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_Ugw19puTys6AlSuS4jt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgwfdMZ0VcUQ-CrEaaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgzRc4nSHM6SKKCyriZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgwMS9lzJMHkRTdmkU54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}]