Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
For goodness' sake, people need to understand the difference(s) between Tesla and their "systems" and companies like Waymo, Zoox, and (the former) Cruise. These companies operate solely on roads heavily mapped by their technologies to (not being hyperbolic) the millimeter scale. Then they run those routes through simulators and, in the case of at least Cruise and Waymo, they run them with *_professional_* drivers behind the wheel who know when and how to take over, mark the problem issue or area, and pass that information on via various routes to engineers who can work on and fix the problem. Even then, they're not perfect, which is why then the default response to an unknown or potentially unsafe situation is to stop (not going to comment on Cruise on that) and wait for a human being to remotely assist the vehicle through the situation. This is in combination with the _heavy_ mapping, the radar _and_ lidar along with the cameras, and the very powerful on-board computers. Tesla sends out cars with some cameras, a computer that doesn't quite measure up, and the only human who can take over the vehicle is the person who paid for the privilege to be a tester. Basically the crash test dummy. Without the (to use the word again) hyperbolic leadership of Tesla, the immense amounts of money involved, and government entanglements, nobody would dare risk calling this "auto pilot". And I'm a pilot myself. When a pilot flies with auto pilot engaged, they are *always* prepared to take over manually and do not take attention away, even having a backup pilot for things like bathroom breaks. But political power is everything these days I guess
youtube AI Harm Incident 2025-08-15T23:2… ♥ 3
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzv4pN3rnMuKkLRBmp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy3iBPyXBIaO7ebPvF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzWiPQid7Tkxl0Vdi54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyRapN3t_DnaAwySMB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx4_At8f_oM50EVYtN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxVXLlzSTkEBR0r_a94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxg_YGN6bpA4zNC3Vp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy4c9PUwA-g08GmLgF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw9n-CBa91CgGzzz6V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxU5Yx9muDSkreeqJ94AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
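A raw response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal Python validator; the allowed category values are inferred from the responses shown on this page (the full codebook may permit others), and the `validate_codings` helper name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the codings shown
# above -- an assumption, since the full codebook is not reproduced here.
ALLOWED = {
    "responsibility": {"company", "user", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference"},
}

def validate_codings(raw):
    """Parse a raw LLM response and keep only well-formed codings."""
    valid = []
    for item in json.loads(raw):
        # A coding is kept only if it has a comment id and every
        # dimension holds a value from the allowed set.
        ok = "id" in item and all(
            item.get(dim) in values for dim, values in ALLOWED.items()
        )
        if ok:
            valid.append(item)
    return valid

raw = ('[{"id":"ytc_Ugxg_YGN6bpA4zNC3Vp4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
print(validate_codings(raw))  # the single coding passes validation
```

Dropping malformed items rather than raising keeps a batch of ten codings usable even when the model garbles one entry; a stricter pipeline might instead log or re-query the failures.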