Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humans rarely ask themselves "should we?" when the option "can we?" is available, conscious AI will come, because if someone out there isn't willing to push the envelope that far. There's always someone who will, that's what got us out caves, and that's what has always progressed humanity. People always talkin' shit, CASH ME OUTSIDE, HBD?
YouTube · AI Moral Status · 2017-02-23T16:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugi9oKcY5syPlngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugipm9QoHHtAZngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgjCG-Y5si0xkXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggE2jjroha-C3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgipWevt7j_kCngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UghGeSiPL9jIjngCoAEC", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugg4wUNlmwDRengCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiBkJI_0TVCGHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UggfHiAyN5W04HgCoAEC", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgjouNuW5UDvnHgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "approval"}
]
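A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the project's actual pipeline: the allowed values per dimension are inferred only from the codes visible in this batch, and the real codebook may include others. The two sample records are taken verbatim from the response above.

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codes
# (two records copied from the batch shown above).
raw = '''[
  {"id": "ytc_Ugi9oKcY5syPlngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiBkJI_0TVCGHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]'''

# Allowed values per dimension, inferred from this batch only;
# the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate(records):
    """Split records into those whose codes fall within ALLOWED and error messages."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            errors.append(f"{rec.get('id')}: invalid {', '.join(bad)}")
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(raw)
valid, errors = validate(records)
```

Rejecting out-of-vocabulary codes at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it silently enter the dataset.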