Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No, because you don't have to program/teach it to feel. Why would you program a toaster to feel? Why would you program any robot to feel or have its own desires?
YouTube · AI Moral Status · 2017-03-31T02:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UggdK4-kj4fZ8XgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggKRY8uFcIsqXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgjnW1J3ViyfrngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiC1N3DmtnvpHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggNuoc-I2DP_XgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UggOCJjsINSZxXgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjWrIZdXZ2JAngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjBTO2oKelJlngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggYU4Qkt_4CP3gCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugje-B8BgTNcmHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
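The raw response is a JSON array of per-comment codings, so matching a comment back to its coded dimensions is a matter of parsing the array and indexing it by `id`. The sketch below shows one way to do this, assuming the batch-response shape shown above; the function name `index_codings` is hypothetical, not part of the tool, and the array is truncated to two records for brevity.

```python
import json

# Two records from the raw LLM response above (truncated for brevity).
raw_response = """
[
  {"id": "ytc_UggdK4-kj4fZ8XgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiC1N3DmtnvpHgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_codings(response_text):
    """Parse a batch coding response and index each record by comment id."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

# Look up the coding for the comment shown in this section.
codings = index_codings(raw_response)
coding = codings["ytc_UgiC1N3DmtnvpHgCoAEC"]
print(coding["responsibility"], coding["reasoning"])  # developer consequentialist
```

This makes it easy to verify that the per-comment "Coding Result" table matches the record the model actually emitted for that `id` in the raw response.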