Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You will always have the dilemma of 3:05: an automated car may take the minimum-harm road, while a car from a security company or a bodyguard will try to protect its passengers at any cost. This is not new. Humans will also take the second option; the self-preservation instinct in a split-second reaction is too strong.
youtube · AI Harm Incident · 2015-12-08T17:5… · ♥ 2
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgizunohajILCHgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgjjWOUDi8MzcHgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgjQvTuYsrqOtngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgghF14lWrWg93gCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UghSobsLJzKwTngCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UghTsPIeRMcNT3gCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgiGpAhmNNMkf3gCoAEC", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgjlPAxVCSrTmHgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugju10Xr0tXdF3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_Ugjc3KGPZNZyqngCoAEC", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "outrage"}
]
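A raw response like the one above is a JSON array with one object per comment, so recovering the coding for a specific comment is a matter of parsing the array and indexing by the `id` field. The sketch below shows one minimal way to do that; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, while the variable names and the truncated sample data are illustrative.

```python
import json

# Truncated sample of a raw LLM response (same shape as the full array above).
raw_response = """
[
  {"id": "ytc_UgjQvTuYsrqOtngCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgiGpAhmNNMkf3gCoAEC", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Parse the array and index the coding rows by comment id,
# so any coded comment's dimensions can be looked up directly.
codings = json.loads(raw_response)
by_id = {row["id"]: row for row in codings}

row = by_id["ytc_UgjQvTuYsrqOtngCoAEC"]
print(row["responsibility"], row["emotion"])  # ai_itself mixed
```

Indexing by `id` rather than by array position makes the lookup robust to the model returning comments in a different order than they were submitted.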