Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would you need to program an AI to feel pain? It'll learn to feel pain on its own (by chance, it's not guaranteed of course). Also, it may learn something else that we don't feel but is "bad" to it.
Source: YouTube · AI Moral Status · 2017-02-24T02:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugh4xkVi4MfetHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggLQKwVGkmGH3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggAnOn8fXWe_XgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgglPt9FSMOxZHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgijavkW4w4I8HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugg4Od1C-VYHqHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggUKbdXKJJrMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggEZyTQU4SE3ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugiata-MDSuPkHgCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghrBLrWi9JmwHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
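To cross-check the coding table against the raw batch response, the JSON array can be parsed and indexed by comment id. A minimal sketch (using two entries from the response above; it assumes the raw response is valid JSON, which may not hold for every model output):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = '''[
  {"id":"ytc_UgglPt9FSMOxZHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggEZyTQU4SE3ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]'''

# Index the codes by comment id for quick lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the comment shown in the coding result table above.
code = codes["ytc_UgglPt9FSMOxZHgCoAEC"]
print(code["responsibility"], code["reasoning"], code["emotion"])
# ai_itself consequentialist indifference
```

A lookup like this makes it easy to confirm that the values displayed in the coding result (responsibility, reasoning, policy, emotion) match the entry the model actually returned for that comment id.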