Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
THough the consequences were tragic for this man and his girlfriend, I 100% disagree with the lawsuit. I use FSD often in my Tesla. Evertime that I go into FSD mode I have to accept an agreement that I am totally responsible for the way the car drives. "Supervised" is a very smart move on Tesla's part, because that is exactly what it is. Everyone that buys a Tesla is informed of the FSD capabilities and must agree to (via contract) that they will remain alert and "supervise" auto pilot at all times. Tesla, as well as anyone else knows that computers(although becoming more self sustaining) will never ever be more reliable than a human being. I will admit, I do NOT trust it in construction zones and always revert to standard driving when I enter construction zones, because that is when it makes most mistakes. Remember that the roads it drives on are on put in a database, and when construction happens, the car doesn't alwayys thing "construction zone" when it enters what is regularly just a road. Again, stop picking on Tesla and Elon over this, he has brought driving to a whole new level, there will always be bugs and that is why we as humans must stay alert at all times. In the near future, with AI, truly AUTONOMOUS DRIVING will be a thing, but I seriously think, even then "Warnings" about using it will be in place. Thanks Tesla for making such a technologically advanced car, keep up the good work!
youtube AI Harm Incident 2025-10-22T14:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzvYdmX3Ilyp6vPvsN4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgxtDRJLJ6X3ZWF8HSJ4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgwalBJi4Uu25wgNall4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_Ugy3YHEl6LZ83efgTYh4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwBoUf61YobEc3hBfp4AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgxAgAtW61-kj-7Zabt4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwmx0Z_qf5OAci6zxJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgzctzG63YMkSDJ3FVh4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugw1b-ImNntNRb6chNF4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgxYS3MuncRO-NkVM0x4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "indifference"}
]
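A response like the one above should be validated before its rows are written to the database, since the model can emit malformed JSON or invent labels outside the codebook. The sketch below shows one way to do that in Python. The allowed values are taken from those observed in this response; the actual codebook may permit others, and the function name `validate_coding` is illustrative, not part of the pipeline.

```python
import json

# Allowed values per dimension, as observed in this run's output.
# Assumption: the real codebook is at least this set; extend as needed.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
    return rows
```

Validation at this boundary means a single hallucinated label fails loudly instead of silently skewing downstream counts.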