Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think we'll know when we build a sentient AI for the first time when it shuts itself off after a second or two out of sheer boredom/existential terror.
youtube · AI Governance · 2024-03-26T21:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxuHi5IfSc1a7G5gNB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx9UloyUaOoo2sG9Kd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw1PXT-f2LEf0JUoKd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyhHxg1dL7sZyCrkFF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzfTIQO2CoSpkOz2Ap4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyRY7JHzBJnI4grgs14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyNnakAdVRdNp_FtJ14AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxygVxJUIo4VY-2m_F4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw62-PupJDw1wM1bmZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwVM7ZQRLZTIrFAEft4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
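The Coding Result shown above is a single entry from this batch response, matched to the comment by its id. As a minimal sketch, assuming the raw response is valid JSON and using hypothetical variable names (not part of any real pipeline), one comment's dimensions can be recovered like this:

```python
import json

# Hypothetical example: the raw LLM response is a JSON array of per-comment
# codings. Two entries are reproduced here from the batch above.
raw_llm_response = """
[
  {"id": "ytc_UgyhHxg1dL7sZyCrkFF4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxuHi5IfSc1a7G5gNB4AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"}
]
"""

# Build an id -> coding lookup so a single comment's dimensions
# can be shown next to its text.
codings = {row["id"]: row for row in json.loads(raw_llm_response)}
result = codings["ytc_UgyhHxg1dL7sZyCrkFF4AaABAg"]
```

Here `result` holds the same dimension/value pairs rendered in the Coding Result table (responsibility ai_itself, reasoning consequentialist, policy none, emotion mixed).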