Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In my opinion, I think that the car would not be in that situation in the first place. It would know that you need to be set distance from a lorry. But to answer the question, I think the car would break hard! The self driving car knows to never hit another car. It would do its very best not to.
youtube · AI Harm Incident · 2016-12-20T08:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgiDRHNP6Ll3F3gCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgiIYchWvUGckHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjiR5ifVgu5L3gCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgiJaxBMly9MvXgCoAEC", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UghSuhCsL9iAHXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghAA7dcebmab3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UghAsDUNhcPf4XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugi_4HU5JSF7SngCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugiufh1PTT6cmXgCoAEC", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghXJhtXibHvFXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
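A raw batch response like the one above has to be parsed and validated before the codings can be stored. Below is a minimal sketch of that step, assuming the dimension names and the set of allowed values are exactly those visible in this response (the authoritative codebook may allow additional labels); the function `parse_batch` and the `SCHEMA` table are illustrative, not part of the actual pipeline.

```python
import json

# Allowed labels per coding dimension, inferred only from the values
# that appear in the raw response shown above; the real codebook may
# include further labels (this SCHEMA is an assumption).
SCHEMA = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed codings."""
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return []  # malformed model output: code nothing rather than guess
    valid = []
    for item in items:
        if not isinstance(item, dict) or "id" not in item:
            continue
        # Every dimension must be present and hold an allowed label.
        if all(item.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(item)
    return valid

raw = ('[{"id":"ytc_UghSuhCsL9iAHXgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_batch(raw))
```

Dropping malformed or out-of-vocabulary items (rather than repairing them) keeps the stored codings auditable: every row in the result table can be traced back to a literal entry in the raw response.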