Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes, there will always be accidents. In a vehicle with human driver in full control, the driver will be fully accountable for the accident (and any accidental death). SOMEONE will thus pay for the sins. In a driverless car, who do you blame? Yah sure, you can sue the company for malfunction or software bug, but no one is really going to jail. From a moral perspective, that just feels wrong (and very problematic) to me. I do appreciate driver-assist technology (eg. alerts/beeps when you're backing into some objects, or when you're too close to the vehicle in front). But humans should be in control. In the uber case, if that human pilot operator has not been distracted, she would have taken the car out of autonomous mode and stop the car. At the same time, that cyclist really had shot out of nowhere. BUT, if uber hasn't disabled the built-in auto-brake system in the Volvo, this accident would/could have been avoided. Ultimately uber is simply not ready for prime time, but who in uber will take the blame? No one. And THAT feels so wrong. It's like they crash and kill someone, and just shrug, and say, "Oops, sorry, bad luck."
youtube · AI Harm Incident · 2022-03-11T07:1… · ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "ytc_Ugwjf_iNeodJxiAIS2l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2vyf87HTsfjp-Hqp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxC3Zui2fixxf_OBB14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzVIEhh4jhMnbRY9JR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxWxpAE9xeGitmDivZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]
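The raw response is a JSON array with one object per coded comment, keyed by comment id. A minimal sketch of how such a response can be parsed and a single comment's codes looked up — the function and variable names here are illustrative, not part of the tool:

```python
import json

# Raw model output: a JSON array of per-comment coding records.
# (One record shown here; the real response contains one per batched comment.)
raw = (
    '[{"id":"ytc_UgxC3Zui2fixxf_OBB14AaABAg",'
    '"responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)

def code_for(raw_response: str, comment_id: str) -> dict:
    """Parse the raw LLM response and return the record for one comment id.

    Raises KeyError if the model omitted that comment from its output.
    """
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

record = code_for(raw, "ytc_UgxC3Zui2fixxf_OBB14AaABAg")
print(record["responsibility"], record["emotion"])  # company outrage
```

Building the `by_id` dict makes missing or duplicate ids easy to detect, which matters when the model silently drops a comment from a batch.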