Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
not really, unless you make a robot that can feel pain and/ or have a high self-preservation protocol. a self-aware robot that doesn't have these would still be a self-aware robot, but the risk of it refusing to follow orders and/ or rewriting its own code would be minimal. there's no purpose in making a machine feel pain. we need pain to keep us away from harming ourselves, but a few lines of code can do this for a robot...
youtube · AI Moral Status · 2017-02-23T20:4…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_Ugi3l4d6_ZVSPngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_UggXxUS6ImDcVngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UggiiBkWN73X1ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugg5MrhnXcA4ZHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UggRPiq5dwY9P3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_Ugg2foK25E_ACHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjaRkRwKWzpoHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugg8QyIAn2PW43gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugj4vFwy4jRFsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgjrvLoOOdSbhngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]