Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
If the cars are self-driving, then it should only go as close to the veichle infront of them realtive to the speed it's going, so it always will have time to make a full stop before even getting to the veichle infornt of it. And yes, breaks can fail, but that is an mechanical error not an programable error, if it was a real person driving the car, then the possebility of break failure whould be as big, as an AI driving the car.
YouTube · AI Harm Incident · 2017-07-25T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[ {"id":"ytc_UgzLfa4wDAxEE-DnOk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw8vnsRdhUYoVEeC7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx_3fyLhrMbWTkim-l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugyo2hSnzg9Y8b7i16h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxJuGhYth23xfAg25V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgitX4hSzKK4wXgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"}, {"id":"ytc_UghXkzlL2wwLPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjW7cd-m5pz9HgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgihoGq_oAtLbXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]