Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is no ethics to figure out, the car should just stop if there is no way around. Automated cars are aware of everything around them, everything, this scenario is not possible unless the people are behind something solid n without looking or on purpose jump in front of the car.
youtube AI Harm Incident 2014-05-25T17:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgiWcB2yghriXngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjGpMIElKUcbHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiQ91bEWOUkuHgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiKVLZeD_H-iHgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UggCYfK-9jRh7HgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugj-SSej1PEkcHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgimsOSI-0reJ3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugh4WDO7or0sBHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Uggp2zaQuKmd7ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjcFgyqVoluwHgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]
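The raw response is a JSON array with one record per comment, so the coding result shown above can be recovered by parsing the array and looking up the comment's id. A minimal sketch (the array is abbreviated here to the single record that corresponds to this comment; the full response would be loaded the same way):

```python
import json

# Abbreviated raw LLM response: one coding record per comment id.
raw_response = '''[
  {"id": "ytc_UgiQ91bEWOUkuHgCoAEC",
   "responsibility": "ai_itself",
   "reasoning": "deontological",
   "policy": "none",
   "emotion": "indifference"}
]'''

# Parse the array and index records by comment id for O(1) lookup.
records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}

# Retrieve the coding record for this comment.
record = by_id["ytc_UgiQ91bEWOUkuHgCoAEC"]
print(record["responsibility"])  # ai_itself
print(record["emotion"])         # indifference
```

Indexing by id rather than scanning the list makes it straightforward to cross-check each displayed Coding Result against the batch response it came from.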