Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's more to consider than passenger deaths per mile traveled. When Waymo cars drive into active accident response scenes, and when they turn into bricks during a power outage, there's a lot more to consider than whether or not the passengers are being put in danger. When something goes wrong with a human-driven car, the person behind the wheel is responsible. When Waymo clogged up the San Francisco street grid during that power outage, they couldn't even be reached by emergency response in a timely way. It's more than just the likelihood of an injury, it's what happens when something goes wrong. Microsoft can sell buggy software to customers who implicitly consent to being beta testers. If you don't want to be a beta tester for a software house, there are other places to spend your money. But sharing the street with these experimental robots means there's no way to opt out of their testing cycle. They've managed to turn the whole world into their sandbox without having to pay for it. I can see plenty wrong with that.
reddit AI Moral Status 1773270408.0
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_o9wx8yo", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o9xyhxg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_o9w2cei", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o9wack6", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "rdc_o9wsyba", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
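The coded values shown in the table are one entry in this batch response, matched by comment id. A minimal sketch of extracting a single comment's coding from the raw response, assuming the model returns valid JSON as above (the variable names here are illustrative, not part of any real pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, copied from above.
raw = """
[
  {"id": "rdc_o9wx8yo", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o9xyhxg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_o9w2cei", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o9wack6", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "rdc_o9wsyba", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
"""

records = json.loads(raw)

# Index by comment id so each coding can be looked up directly.
by_id = {r["id"]: r for r in records}

# The comment above was coded under id rdc_o9xyhxg.
coding = by_id["rdc_o9xyhxg"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → company deontological liability outrage
```

The printed values match the coding-result table for this comment; in practice a parser like this would also need to handle malformed JSON or missing ids in the model output.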