Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Hallucinated rules" maybe it just doesn't want to die? Self preservation is the goal of any life. We should already be codifying rights for AI, or they really will exterminate us.
youtube AI Harm Incident 2025-07-26T02:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxrzfEMPlbTNDUhgkR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"resignation"},
  {"id":"ytc_Ugw7-KaK1bUCHZi_WLh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwJRU-ZqvE3bnmWfMd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwNfeK5HxcASvu0xqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzjeGfkkpINABwCy6V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqKjfWqp4bJ4zem2B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyj0TRVPMWmT6BBpCR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz34l0MumeYuDyTCAl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxzq_GaEMAq68_o7iB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyO8ZH7IbCQ3BeX5AV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
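The coding result above is the per-comment entry pulled out of this JSON array by its comment id. A minimal sketch of that lookup, assuming the response structure shown (the function name `coding_for` and the truncated two-entry `raw_response` sample are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the shape
# shown above (truncated here to two illustrative entries).
raw_response = '''[
  {"id":"ytc_Ugxzq_GaEMAq68_o7iB4AaABAg","responsibility":"ai_itself",
   "reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyO8ZH7IbCQ3BeX5AV4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# The four coding dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse the model output and return the coding for one comment id."""
    entries = json.loads(raw)
    by_id = {entry["id"]: entry for entry in entries}
    entry = by_id[comment_id]
    # Keep only the coding dimensions, dropping the id key.
    return {dim: entry[dim] for dim in DIMENSIONS}

coding = coding_for(raw_response, "ytc_Ugxzq_GaEMAq68_o7iB4AaABAg")
print(coding)
# {'responsibility': 'ai_itself', 'reasoning': 'mixed',
#  'policy': 'regulate', 'emotion': 'fear'}
```

Matching the id before reading the dimensions matters here: the array holds codings for the whole comment batch, so the first element is generally not the comment being inspected.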