Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wow. Tesla auto-steer assisted driving must be really really good. More Perfect Union has to highlight self-driving flaws that were fixed years ago because there isn't enough wrong with the current release to make a ten minute video. Autopilot/Full Self Driving is now called "auto-steer". That satisfied the demands of the courts. Oh no! Tesla will be shut out of California. Somebody didn't read the lawsuit even once. Maybe they should have asked AI to summarize it. Self-driving can't be done without Lidar? Humans use nothing but their eyes. Yet we think a computer with eight eyes can't outperform humans? Hey, whatever makes you feel better about yourselves. 14 deaths! Ooooooooooooo, how outrageous! If every drunk driver had self-driving, we'd have saved thousands of lives. There's an utter failure to apply statistics to the prioritization here. Every time a human kills someone, the system (humanity) doesn't get a lick better at driving. When a Tesla car kills someone, the software engineers focus on the flaw that caused the death, and implements improvements to not let that problem happen again. The system gets better. Think about it like AI image generation. Six fingers on a hand! Ha! Artists will never be replaced! Three months later, it was difficult to generate an accidental image of a six fingered person. Self-driving is the same. 14 deaths is actually shockingly close to zero. The Luddite mindset is strong with this one.
youtube 2026-03-01T15:5…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | none                       |
| Reasoning      | unclear                    |
| Policy         | unclear                    |
| Emotion        | indifference               |
| Coded at       | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_UgyoZ0LyD7csfwHar6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwYr5b79PW0d65mBJJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzfq2DLzjjYbzl5kOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxjlMpkFJiSq2Gnix4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz3H-QUuENUqHEdWM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzeXb-bZvf1bvok8sl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzIcFoix72lPgzZwUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzhYcY58bEaHC8oyFF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz6V6QvByYcJp2VS594AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy0oCNPLtNa2ZzuPxJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
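A raw response like the one above can be validated before the per-comment codings are accepted. The sketch below is illustrative only: the allowed category values are inferred solely from the examples shown on this page (the real codebook may define more), and the function name `parse_codings` is hypothetical.

```python
import json

# Allowed values per dimension — assumed from the codings visible above;
# the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings.

    An item is kept when it has an "id" and every dimension holds a
    value from the allowed set; malformed items are silently dropped.
    """
    items = json.loads(raw)
    valid = []
    for item in items:
        if not isinstance(item, dict) or "id" not in item:
            continue
        if all(item.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(item)
    return valid

raw = ('[{"id":"ytc_UgyoZ0LyD7csfwHar6B4AaABAg",'
       '"responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"},'
       '{"id":"bad_item","responsibility":"nobody"}]')
codings = parse_codings(raw)
print(len(codings))  # 1 — the item with an unknown value is dropped
```

Dropping rather than repairing malformed items keeps the coded dataset conservative: only rows the model produced in the expected schema reach the results table.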