Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
6:40 While I'm willing to trust you if you gave medium-quality evidence that Tesla's cars are overall more dangerous than humans, your "two cyclists are dead" argument is a huge no-no. If 10,000 drivers each kill with a 3% chance and 10,000 AIs each kill with a 2% chance, that's still 300 and 200 kills respectively. One is clearly better than the other. Such statements without a comparison (a.k.a. a baseline) are highly irritating to me. By demonstrating that you do not consider ratios, you weaken your whole argument and your persona. Hospitals kill many people every day; it's one too many; let's remove hospitals! >:( You get my point... Don't make bad arguments, especially not when you cover them with a cream of emotional manipulation; it makes you look bad. I personally believe that Teslas might be worse than humans, but I have neither the will nor the time to check that out. I have better things to do. I wish your video had enlightened me on that point and given evidence.
youtube AI Harm Incident 2022-10-02T22:5… ♥ 1
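For reference, the expected-fatalities arithmetic behind the commenter's baseline argument, worked through explicitly. The rates and cohort size are the commenter's hypotheticals, not measured data:

$$
\mathbb{E}[\text{human-caused deaths}] = 0.03 \times 10{,}000 = 300,
\qquad
\mathbb{E}[\text{AI-caused deaths}] = 0.02 \times 10{,}000 = 200
$$

The absolute counts differ, but the comparison between the two rates is what carries the argument, which is the commenter's point about needing a baseline.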
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgwnBYHLKEqXh_e-0mB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgwpSsLV-ro0_E5CHLt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugxo4o9yo2hLxj-msV94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwxZIr0zAr7vk6nDop4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgwnldreHGeXyT2ZQKh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgzLw2H82mWnX4OXMqJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwBPxs3sJxGmbQlV3t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugyls7BwIgpDs0M93ih4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwN8uZBJjWtqseMS114AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}, {"id":"ytc_UgwlPDycl5TdNN0xkEl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"} ]