Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think Tesla's biggest fault was that they kept letting a user with many automatic disable events use the feature. You could argue that McGee should never have been allowed to use AutoPilot after so many strikes. Back when FSD was first being released to Model 3 owners, you had to opt into Tesla's Safety Score system to even get a chance at access. If I remember correctly, you needed around a 90%+ safety score to have a real likelihood of getting it. That system sounds like it would've helped here, but it also had the problem of being incredibly restrictive for people who paid for FSD. I paid for it on my old Model 3, and it took me a while to get access. I think I averaged around 85%, and I have zero accidents and zero tickets in over 20 years of driving. The problem was that if you braked or accelerated harder than what Tesla deemed acceptable, the system would ding you for it. It was so easy to get hit for things that were somewhat out of your control that people were gaming the system: they'd sign up and go out in the middle of the night to drive on barren highways. The idea was that you could run AutoPilot for around 100 miles, and any hard braking caused by AutoPilot would *not* count against you. So you just drove enough for the system to gather enough information about you, and you'd likely end up with a perfect 100% score.
Source: YouTube · Incident: AI Harm Incident · Posted: 2025-08-15T18:5… · ♥ 46
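
The gaming strategy this comment describes hinges on one detail: AutoPilot miles still count toward the mileage the score is computed over, while hard-braking events that happen under AutoPilot are excluded from the penalty. A toy illustration in Python with an entirely made-up scoring rule (Tesla's actual Safety Score formula is proprietary and not reproduced here), just to show why padding your mileage with overnight AutoPilot driving pushes the score toward 100%:

    def toy_safety_score(manual_miles: float, hard_brakes: int,
                         autopilot_miles: float = 0.0) -> float:
        """Made-up per-mile penalty rule, NOT Tesla's real formula.

        Hard braking while on AutoPilot is not counted (per the comment
        above), so AutoPilot miles only ever dilute the penalty rate.
        """
        total_miles = manual_miles + autopilot_miles
        penalty = 100 * hard_brakes / total_miles
        return max(0.0, 100.0 - penalty)

    # Same three manual hard-braking events, with and without padding:
    print(toy_safety_score(50, 3))        # 94.0
    print(toy_safety_score(50, 3, 100))   # 98.0

Under any rule of this shape, disengagement-free AutoPilot miles grow the denominator without growing the numerator, so the score can only improve.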
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           industry_self
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytr_Ugxv1KqFmv4nmdmGM5F4AaABAg.ALrCgWHmHROALrHOVnSTJr","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgwK9BAnYJiStXntL0l4AaABAg.ALrCKN4hqx7ALrK2GZsLio","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_Ugy-wc-6U2A1Jl0St4F4AaABAg.ALrCFYlp7hUALreadi6edv","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrF1Tmd7mm","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"}, {"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrFPHs7Wh3","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}, {"id":"ytr_UgwRryc8nBP0waW8VAV4AaABAg.ALrBo0rFly3ALrLfGT7u4s","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrCspgJHwT","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrGLXizNfN","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrDznthT9u","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrEntAcpZY","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"} ]