Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great story! I would've liked more statistics, especially number of deaths and crashes in proportion to active Teslas with auto pilot vs. proportion deaths / crashes of some other comparable manufacturers. Because this is what all pro autonomous vehicle people is screaming: people still make more mistakes! Which of course on its own is not a sound argument: if a random selection of cars of brand A had brakes that could just, by construction, suddenly completely stop working for no reason, and you could prove that, they would not be allowed on the streets. Of course issuing a warning would not suffice. This is where you would build the case against Tesla: They are obviously promoting a technology which we are encouraged to hand over our safety to, while still claiming in the details that it isn't ready for that. Calling something that is SAE 2 'auto pilot' and promoting it this hard by constantly claiming it will save lives etc is straight up dangerous. I mean what is the point if you should always keep your hands on the wheel and feet on the pedals and be prepared at all times? Of course people will act the way the technology permits them to, not what Tesla writes in the manual. They know this, and this is why I actually think they will have to pay up enormous sums pretty soon. There's going to be whole law firms working on nothing but these kinds of cases very soon.
youtube AI Harm Incident 2024-12-23T23:3…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugylawk4Wwo2HaZN6Gd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygC_qYcmGjSMiJmA94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3VkzIh25fPT5W7Gh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzTcfQFE4lU3kA-jFl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgygOgEYmoGpSAABP_F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxo0KySK5XL5aODMIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugym8uQepetHZvqvByt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK4hC1-MQssaF_xpV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzftlsHe8yLGgZJjS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzCPvU_oF2h9AS6a_d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
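The raw response is a JSON array with one record per comment, coding each of the four dimensions shown in the table. A minimal sketch of how such a response could be parsed and tallied per dimension (Python; the two-record excerpt below is taken from the response above, and the variable names are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above (two of the ten records, for brevity).
raw = '''
[
  {"id": "ytc_Ugylawk4Wwo2HaZN6Gd4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgygC_qYcmGjSMiJmA94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

codes = json.loads(raw)

# Count how often each value appears in each coding dimension.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(record[dim] for record in codes) for dim in dimensions}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Running this over the full ten-record array instead of the excerpt would give the aggregate distribution behind summaries like the table above (e.g. how many comments were coded `responsibility: company`).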