Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I get that statistically they get in fewer accidents, but programming bugs happen all the time for all kinds of reasons, even after systems have worked fine for years. I would rather take my chances driving myself than trust some obscure software bug not to show up, and this is a perfect example. For something with so many variables, I really do not think we should rely on full self driving. I am a programmer too, and I would never get into a fully self driving car. It is one thing to use driver assist with "your hands on the wheel" and be ready to take over, but at that point I honestly question the point of calling it self driving. If I see a car blazing toward a child or a school bus, I want to be the one in control, not sitting there unable to do anything.
Source: reddit · AI Harm Incident 1765229313.0 · Score: -1
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nt078e5", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_nszorut", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_j446br5", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_j3wvoeg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j3y8nqt", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
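The raw response is a JSON array covering a whole batch of comments, while the coding result above shows only the record for this one comment. A minimal sketch of that lookup step, assuming the pipeline matches records to comments by the `id` field (the function name `coding_for` is hypothetical, not part of the actual pipeline):

```python
import json

# Assumed shape of a raw batch response: a JSON array of records,
# one per coded comment, keyed by a per-comment "id".
raw_response = '''[
  {"id": "rdc_nt078e5", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_nszorut", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def coding_for(response_text: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id from a batch response."""
    records = json.loads(response_text)
    by_id = {record["id"]: record for record in records}
    return by_id[comment_id]

result = coding_for(raw_response, "rdc_nt078e5")
print(result["responsibility"], result["emotion"])  # developer fear
```

A dict keyed by `id` keeps the lookup robust even if the model returns records in a different order than the comments were submitted.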