Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The troubling thing about this video, to me, is that the journalist didn't ask the questions "How do we know that Autopilot was engaged?" and "Is Autopilot the same as FSD?" In the video clip from inside the Tesla, the screen was blurry, but it was clear enough to make out the image of the steering wheel near the top. It appears to be a grey circle; when Autopilot is engaged, it is blue.

That crash was likely adjacent to the topic. Based on the visualization, the car was not in FSD mode but in a mode called Autosteer, which uses a separate software algorithm from FSD. Owners who have had a Tesla since before FSD mode was available tend to refer to Autosteer autopilot as "autopilot" and FSD autopilot as "FSD." This reflects differences in the software over time: at some point, FSD changed from being a setting to a separately selectable mode with its own distinct settings.

Long story short: in that portion of the video, the car was in Autosteer autopilot. It likely disengaged because the user put too much torque on the steering wheel. Autosteer forces the user to keep slight torque on the wheel, while FSD mostly tracks the face and eyes and only occasionally requires torque. Autosteer also has a cruise-control step-down, so if you disengage by turning the wheel, cruise control stays active. Since the steering wheel icon was grey and the car didn't appear to slow down, I'm confident they turned the wheel in Autosteer mode just prior to the clip, taking the car out of basic autopilot and into cruise control. They didn't notice or recognize the sound it makes when disengaging, thought the car was still steering itself, and so it went straight instead of steering around the curve.

All that to say: it makes me wonder how much skepticism they applied to the other examples. I agree with a lot of what they reported, but to prevent a "boy who cried wolf" situation they need to do better at keeping it balanced.
youtube 2026-01-05T14:3… ♥ 4
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx8qnwenVyLFcDifPJ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwfuBeeS9a8OfwSbVx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyvpT0MYX1teMVCYB94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxaMs9Pz-oJ6Bms9Zh4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwMfQwd7l7_VDrYiaB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy6dsR-Eq7Ee_uF4lp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzI5wSJ1LZzxyHXAH14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxTwpBYQDxtu-Xd1BF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxMfIjd5ynydGgZblp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxOTbLaMGHJAnEUW4Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]