Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The video is not real! Any attentive human driver could have prevented the woman's death. It is simple math: low-beam headlights have a range of at least 150 feet (60 m), and there is also stray light from street lamps. The woman was in the middle of the road when she became visible, which means that at a speed of 38 mph (17 m/s) the time from her becoming visible to the impact is approximately 3.5 s. The car's stopping distance at 38 mph, including the "thinking time" used in The Highway Code, is 110 feet. In the video this duration is just over a second. Note also the stark contrast in the video: due to the data compression, shades (transitions from bright to dark) are not visible; the human eye would see a lot more. Conclusion: 1. With a human driver, the lady would be alive. 2. The video is manipulated, or of such low quality that it actually hides the most important details. 3. Uber's automated car completely failed, and its safety is worse than that of a human-driven car. We can only speculate whether the video was purposely edited or is simply of bad quality. Why is there no better video available from such a high-tech car? Why did the police officer make such a bold statement, blaming the victim, based on such weak evidence?
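The commenter's arithmetic can be checked with a quick unit conversion. This is a standalone sketch; the 60 m visibility figure and the 38 mph speed are taken directly from the comment, not independently verified:

```python
# Check the commenter's figures: at 38 mph, how long does it take
# to cover the claimed 60 m low-beam visibility range?

MPH_TO_MPS = 0.44704  # exact conversion factor, miles/hour -> metres/second

speed_mph = 38.0
visibility_m = 60.0  # low-beam range claimed in the comment

speed_mps = speed_mph * MPH_TO_MPS         # ~16.99 m/s, matching the comment's "17 m/s"
time_to_impact = visibility_m / speed_mps  # ~3.53 s, matching "approximately 3.5 s"

print(f"{speed_mps:.2f} m/s, {time_to_impact:.2f} s")
```

The internal arithmetic of the comment is consistent, although the video duration it contrasts against ("just over a second") is the commenter's own estimate.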
youtube · AI Harm Incident · 2018-03-25T13:1… · ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwZGJQVFhhfAoxyYbp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyruTP4wNiUY9POttJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx61-v8wPFo15TLnm54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw7nL9sDBF_EsRuC4J4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwUQZPD0JvYj892xkZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwmHjPsc_hvVK0MCah4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxI0QdUij0_B8-T4Cp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxIgOSX75DZyc4h6wR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyqtI8Flut6myfr0jV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzq2JwLhXDXAuDZghl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
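A raw response like the one above can be parsed and validated before the per-comment codings are displayed. The sketch below is a minimal, hypothetical example: the field names come from the response itself, but the allowed-value vocabulary is inferred from the codings shown here and the actual codebook may include other values:

```python
import json

# Allowed values per dimension, inferred from the coded examples above
# (this vocabulary is an assumption; the real codebook may differ).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values fit the schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# First object from the raw response above, as a one-row example.
raw = ('[{"id":"ytc_UgwZGJQVFhhfAoxyYbp4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(parse_codings(raw)[0]["responsibility"])  # -> user
```

Filtering rather than raising keeps one malformed row (like a model hallucinating an off-schema label) from discarding the whole batch; rows such as the all-"unclear" final entry still pass, since "unclear" is itself a valid code.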