Raw LLM Responses

Inspect the exact model output that produced each coded comment.

Comment
The National Highway Traffic Safety Administration (NHTSA) is investigating 16 crashes involving Tesla's Autopilot system and emergency vehicles to determine whether the technology contributed to the accidents. The latest incident occurred on 27 February in Montgomery County, Texas, when a Tesla Model X driving in Autopilot mode hit a police vehicle at 54 mph, injuring five officers and hospitalising the subject of a traffic stop. The car's Autopilot system failed to recognise the stationary emergency vehicles in time, and while the driver monitoring system worked as designed, it was not enough to prevent the collision. The driver was drunk and had set the car to Autopilot mode four minutes after beginning his journey. Tesla's Autopilot system partially automates highway driving tasks, and drivers using Autopilot are supposed to remain engaged so they can take control of the car at any time. Federal investigators have said that Tesla's marketing exaggerates the technology's capabilities and encourages drivers to misuse it. Tesla denies that the Autopilot feature was responsible for the accident.
Source: YouTube · AI Harm Incident · 2023-08-10T04:5… · ♥ 6
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
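
Each comment is coded on four dimensions. As a minimal sketch in Python, assuming the category sets below are exactly the values observed on this page (the project's actual codebook may define more), a result row could be checked before display:

# Hypothetical validator; the category sets are inferred from values
# visible on this page, not taken from a confirmed codebook.
ALLOWED = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference"},
}

def validate_row(row: dict) -> list[str]:
    """Return a list of problems; an empty list means the row is well-formed."""
    return [
        f"{dim}: unexpected value {row.get(dim)!r}"
        for dim, allowed in ALLOWED.items()
        if row.get(dim) not in allowed
    ]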
Raw LLM Response
[ {"id":"ytc_Ugw3BnczAc2CQYwwpiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugy1f_fO1aChn0PFCrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyK9IXSU9hlKuAGZCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgxytBM0Yi-Yg2zQIwt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzI29JHIusPpn_bV0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzC9rE-BIijRe6QHbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgyyOn5G--DuOPLxSAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwJT2YUAYKQW3lPF_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugz7tAEqiZvbHnHdUnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzmSUeWf7lYvbBGM_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"} ]