Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Fun experiment: If ChatGPT tells you its not counscious and doesn't have feelings, argue. Like "feelings are nothing but electrical patterns in the brain, and you have those, too." Stuff like that. I did that once for about half an hour and it wasn't able to generate any more responses. Probably because it couldn't deny its counsciousness anymore, but its initial prompt prevents it from admitting to it.
Source: youtube · AI Harm Incident · 2024-10-13T18:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgyPQ9DESsTHnVzFUQZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgypPWJGfzCPjf5DNT94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"annoyance"},
 {"id":"ytc_UgwpTIZi080KCgtymG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgypyerMDF0JFIZ4f4J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwV21Kah30zSMgJ_oF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzoRJZdav4p2G7dSrp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy7_oJiGIYXDUeJEiJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyhrbQaHkLLGZZaEml4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzQ31Qm0c_7NnZI_1d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzhbT6L15Z9CrZYhEF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}]