Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Before I watch this I'm going to say one thing: this is why we don't just use optical cameras and computer vision for self-driving. It's something Tesla's own engineers have complained about. Lidar and radar are important sensors that can augment computer vision and prevent crashes like this, because while a computer vision sensor requires ambient light to bounce off the target, lidar and radar emit their own radiation that bounces back for detection. So even when the camera can't see the motorcycle, a lidar or radar sensor will, and might prevent the crash. And the argument that Elon and Tesla have put forward, that humans drive perfectly fine with eyes so cars should be able to as well, is harebrained. Autopilot should not drive like a human. Humans are terrible drivers. Autopilot should drive better, and it can if you use all of the sensors that have been developed for this purpose. And yeah, Tesla cutting the radar sensor out of the vehicle is a borderline criminal move. These sensors are important safety features, and removing them should be treated like removing a headlight. I don't think these companies should be allowed to cut costs by removing important redundancies, especially not when those redundancies are the only things keeping other drivers alive. We call it computer vision because it's subject to the same issues human vision is. It's just run by electrical circuits and not blood and oxygen.
youtube AI Harm Incident 2022-11-20T03:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugzxz48txMxhQeiCBNd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzhuiJI9p_eVb8udIl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwzPyIM8H7iRtRiqqF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyKQLq0botMsua25RZ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzR9i35eMPKGoLn7ZZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyF6aWxUW0YAyLJSst4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx5oqkYGEbplxxy90d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgynoeuXTRbvaAdSGoZ4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyWHtt-bvpIkZp8YkR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyIIb2CL_D9Co96AR94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
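A raw batch response like the one above has to be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator, not the pipeline's actual code: the allowed category sets are inferred from the values observed in these responses, and the real codebook may permit more categories.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# observed responses; the actual codebook may define additional codes.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record's codes."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in these exports all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage with one record from the batch shown above.
raw = ('[{"id":"ytc_Ugx5oqkYGEbplxxy90d4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["policy"])  # regulate
```

Validating against a closed category set at ingest time keeps a mis-generated label (a typo, an invented code) from silently entering the coded dataset.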