Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Radar would not be good at detecting motorcycles (and even less able to pick out pedestrians) because of their small cross-section and limited amount of radar-reflecting materials. LIDAR is not feasible for the reasons mentioned. Vision is what humans use, and cameras can be better than human vision in adverse lighting conditions (dark, glare, etc.). Just like with humans, multiple cameras can be used to accurately determine distance -- at least within a prudent safety margin. Humans stop at least 5 to 10 feet before an obstacle. So, it is not a problem if an autonomous vehicle stops at 8' instead of 10' because it somewhat misjudged the distance. Likely the problem that occurred in these crashes was a failure to discern these were motorcycles. As a software engineer, I can attest we rarely get things right the first time. I'll grant you that when lives are at stake that's a poor excuse. However, human invention has been a constant cycle of try, recognize a problem, fix the problem, and retry. I'm not saying it's OK to suffer a few deaths on the way to some cool new technology. However, if Tesla cars are already holding their own versus humans in terms of accidents per miles driven, and problems are fixed as they are discovered, then without any extra burden on humanity the final outcome will be fewer accidents.
youtube AI Harm Incident 2025-05-28T18:3…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugz4AY59q36IgaWwHCN4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgyuafevFmBiC7Ztv1R4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugwn1VYzk_xDwR89xdp4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgxT6YA_aZ3VqIJrTMt4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgwBZe5bn3i76vQbDgh4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwLvhYllrtXrwbSWuN4AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy6yUZ7KedAeeVvDSt4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "resignation"},
  {"id": "ytc_UgypzZdekfbePGFKeQ14AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxVxi5UFlSwO74tm_d4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwoIdAlgEJvWNNnS_94AaABAg", "responsibility": "unclear",   "reasoning": "mixed",            "policy": "unclear",   "emotion": "indifference"}
]
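A minimal sketch of how the coded dimensions for one comment can be pulled out of this raw response, assuming the response is exactly the JSON array shown (the single-entry `raw` string below is copied from one entry of that array; the variable names are illustrative):

```python
import json

# One entry copied verbatim from the raw LLM response above; in practice the
# full array string would be parsed instead.
raw = '''[
  {"id": "ytc_UgxVxi5UFlSwO74tm_d4AaABAg",
   "responsibility": "unclear",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "indifference"}
]'''

# Index the per-comment codes by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

code = codes["ytc_UgxVxi5UFlSwO74tm_d4AaABAg"]
print(code["reasoning"], code["emotion"])  # consequentialist indifference
```

The coding-result values shown above (responsibility unclear, reasoning consequentialist, policy unclear, emotion indifference) match this entry of the batch response, which suggests this is the entry the tool surfaced for the quoted comment.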