Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How about we just don't program super-intelligent AIs at all? Would probably be a lot better than potentially having to deal with angry robot takeover.
Source: youtube · AI Moral Status · 2017-02-27T06:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugg3xHoUtx6gWngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UggE8ZCLy_Y7-XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghT7BJ_Jkv_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghD0PHvZSddz3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghzdlAYEf702XgCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi0wlK0xxTZ3XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UggL_n6lQWteeXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggLBNtGHpEtHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiX-KqMqEVNV3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgivRlFZ-T5UaXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
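The raw response is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of how the coding for a single comment could be pulled out of such a batch response; `lookup_coding` is a hypothetical helper, not part of any tool shown here, and the array below is abbreviated to two of the records above:

```python
import json

# Abbreviated copy of the raw batch response: one object per coded comment.
raw_response = """[
  {"id":"ytc_UghD0PHvZSddz3gCoAEC","responsibility":"developer",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugg3xHoUtx6gWngCoAEC","responsibility":"ai_itself",
   "reasoning":"deontological","policy":"none","emotion":"fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw LLM batch response and return the record for one comment ID,
    or None if the model omitted that comment."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UghD0PHvZSddz3gCoAEC")
print(coding["policy"])  # → ban
```

Looking the record up by `id` rather than by array position guards against the model reordering or dropping comments in its output, which is why each object carries the comment's ID.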