Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
I’m a rider, and Tesla owner. I’m a bit perplexed by the use of the term Autopilot to reference the AI driving system know as FULL SELF DRIVING (FSD). Granted, the terminology used by Tesla may be confusing to some, but your video is clearly trying to infer that there’s a danger with the AI driving. It seems like the “Autopilot” that you refer to as the unreliable AI is actually similar to the enhanced cruise control that many new automakers put into their vehicles. As such, Noone should be surprised by the possibility of a rear ending on a straightaway in the darkness by an inattentive driver. I think it’s doing everyone a great disservice to conflate the two. When i use my FSD, which is in limited beta release, the actual AI has trained the car to look at everything on the road, and respond appropriately. Granted, it is not perfect, but I can tell you that it does see bicyclists and motorcyclists from a far distance, even in low light. I can see people using the enhanced cruise control doing this, but I really have a hard time believing that the FSD could have rammed into a motorcyclist from behind. Just my experience, all I’m saying. I think Tesla should do away with standard enhanced cruise control known as “Autopilot” and create a new term such as “AutoCruise” for AI assisted cruise control. It should use the FSD capability to discern highway objects to help it cruise. Tesla should make this standard equipped without additional cost or subscriptions, as that would be the right and ethical thing to do to make their cars safer. They can always sell the FSD for city driving or Robotaxis for profit. This would be the right thing for humanity, in my humble opinion.
Source: youtube | AI Harm Incident | 2022-09-17T08:1…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        mixed
Policy           industry_self
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzEzw4ccp8J7Arzvdp4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate",      "emotion": "indifference"},
  {"id": "ytc_UgwBXV0RxeUpen6ZGbd4AaABAg", "responsibility": "company",     "reasoning": "mixed",            "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyoK1cpLNqaZV80fHN4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgzRyADljOnn2vgpll14AaABAg", "responsibility": "company",     "reasoning": "virtue",           "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgzEY4gO8WBaJjMQscB4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_UgzywqUx_ffVWDIcZmJ4AaABAg", "responsibility": "unclear",     "reasoning": "mixed",            "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_Ugz0ohamZ0X3Wpqd5zR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "indifference"},
  {"id": "ytc_Ugy-qZSzVScDwQXFVAl4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgwlUkkEdenbwVNiqg14AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "liability",     "emotion": "mixed"},
  {"id": "ytc_Ugy9nu5XiXK84cthkE14AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]