Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's only two possible explanations, either someone/something pulled on the steering wheel, or the data is fake. I don't know if there is a servomotor connected to the steering wheel, but if there is and it had a fault, it could have turned the wheel. Usually the simplest explanation is the correct one though. Regardless of who/what is at fault, there is definitely a flaw in the logic of the FSD design: When overridden, it should first assess the environment and override the driver actions if there is a 100% certainty that the input would lead to a crash. If the driver fails to turn off the FSD before the desired cross street or exit, it's better to skip the abrupt turn, the same way any driver would ignore a passenger telling him to do something similar. I've had my fair share of sketchy quirks happen to me a year ago with a Tesla 3 rental where the car decided to accelerate into stopped traffic as I was releasing the accelerator, and I had to press extremely hard on the brake pedal to force the car to stop while the motor was still trying to accelerate. That was really unnerving! Also on the 5 freeway, the then experimental self driving mode would systematically drift to the right when the painted line on the right shoulder would disappear as a new lane from an entrance would merge with my lane, indicating that it was giving more weight to the right side than the left side in its tracking algorithm, even though the road was perfectly straight. The one time I had to intervene, it took so much force to override the steering that it violently veered into the left lane. After that incident I turned off that feature for the rest of the trip. Since then, I'd rather walk in the cold rain than hop in a robo-casket.
youtube 2025-06-01T13:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzrWJstsJEmFO71eMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHEAKhAO7x8sTXnEF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw77GzS8Ksbma-vm_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwe6cOnlNzpecTXGHZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxMrn7d5ofrk9IS5pJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmFMrPlU3lDSzydQB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxOjh51CmaJsTLlpxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwJLaUt8S2J2BKzTi54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOoqoNlYCMCCvsWq14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxUxe-Sls7XYXTDvEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
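A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the codebook consists only of the category values visible in the responses above (the real codebook may define more); the function name `parse_raw_response` is illustrative, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the raw responses
# shown above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation",
                "indifference"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only entries whose values
    all fall inside the assumed codebook."""
    entries = json.loads(raw)
    return [
        entry for entry in entries
        if all(entry.get(dim) in allowed for dim, allowed in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_UgxUxe-Sls7XYXTDvEN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
coded = parse_raw_response(raw)
print(coded[0]["responsibility"])  # ai_itself
```

Filtering rather than raising keeps one malformed entry from discarding an entire batch; rejected entries can be logged and re-coded separately.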