Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How can you program something to feel pain? In the end it's all just bits running around. You CAN however simulate pain, I mean, you can program a robot to express it feels pain, and program it to do the things that give pleasure and avoid the painful ones. But it would be fundamentally equivalent to programing a Java program that displays a happy face when we give it a virtual apple, and display sad face when we give it poison. But it doesn't effectively feel anything - well, I guess.
youtube AI Moral Status 2017-02-24T03:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UghyXzu2XC_913gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgivGeenbgAVsHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj_4LAWchwUNHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggnIwBEucuEIngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Uggf7zVJ7GJbHHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiC4plFAWxImHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggzbpDGUt7ibHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggFuDC5x01ktHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjFdWWtlSXv_XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
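A raw response in this shape can be inspected programmatically rather than by eye. Below is a minimal sketch, assuming the model output is a valid JSON array of per-comment codes with the keys shown above; the two entries are copied from the response, and the helper name `codes_by_id` is illustrative, not part of any tool shown here.

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment id.
# (Two entries copied verbatim from the response above for illustration.)
raw = """
[
  {"id":"ytc_UghyXzu2XC_913gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
"""

# Index the codes by comment id so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up the dimensions coded for one comment.
entry = codes_by_id["ytc_UgjOZFi2KQgtF3gCoAEC"]
print(entry["reasoning"], entry["emotion"])
```

Indexing by `id` makes it easy to cross-check a comment's stored Coding Result against what the model actually emitted in the raw response.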