Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
+Steve C The example is flawed. Just program the AI not to let itself get boxed in, and not to follow a truck closely enough that something falling off the back would be a serious hazard. Done. Next problem? See this is the issue, we can just program the AI to deal with these situations without causing accidents. ALL the examples of this dilemma I've seen are ones a self driving car will 100% of the time avoid in the first place. The kinds of accidents they will have is when another car swerves into them for literally no reason, or the brakes fail. ie. situations so rare that it almost doesn't matter how they react. Besides swerving is innately dangerous. You're almost always better of braking in a straight line. I think the people coming up with these dilemmas aren't transport engineers. These are problems humans have, not AIs.
youtube AI Harm Incident 2015-12-10T01:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "mixed"},
  {"id": "ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
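The raw response is a JSON array with one coding record per comment, keyed by `id` across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response might be parsed and a single comment's coding looked up, assuming the model returned valid JSON (the two-record sample below is truncated from the full response above for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Ids and values are copied from the response shown above.
raw_response = '''
[
  {"id": "ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]
'''

# Index the batch by comment id for O(1) lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding for the comment shown in this section; its record
# matches the Coding Result table above.
record = codings["ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE"]
print(record["responsibility"], record["policy"])  # developer industry_self
```

Indexing by `id` rather than array position keeps the lookup robust if the model reorders or drops records in its response.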