Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples
- `ytc_UgyoQ6r07…`: "Even a female robot is made with makeup. Why? She is hardly trying to make someo…"
- `ytc_UgxqRvqM2…`: "Similar to you, I've got 40+ years as a software developer, beginning in Basic a…"
- `ytc_UgyYi8sud…`: "I once had a not dissimilar conversation with Chat GPT. It took me less time to…"
- `ytc_UgxW3ayjL…`: "This is disgusting. Stop these innovations which will harm the society. Corporat…"
- `ytr_UgytsbYJQ…`: "difference is one is a line of work that people want to do while the other is on…"
- `ytc_UgxeDeKHQ…`: "Politicians aren't interested in changing anything until it effects them or a bi…"
- `ytr_UgymL5dhL…`: "@OldShadeOfBlue fyi, those historical examples are automated tools. AI is automa…"
- `ytc_UgzBS24JX…`: "Wait, people still use standard ChatGPT without humanizing? Bold move in 2026 lo…"
Comment
There's only two possible explanations, either someone/something pulled on the steering wheel, or the data is fake. I don't know if there is a servomotor connected to the steering wheel, but if there is and it had a fault, it could have turned the wheel. Usually the simplest explanation is the correct one though.
Regardless of who/what is at fault, there is definitely a flaw in the logic of the FSD design: When overridden, it should first assess the environment and override the driver actions if there is a 100% certainty that the input would lead to a crash. If the driver fails to turn off the FSD before the desired cross street or exit, it's better to skip the abrupt turn, the same way any driver would ignore a passenger telling him to do something similar.
I've had my fair share of sketchy quirks happen to me a year ago with a Tesla 3 rental where the car decided to accelerate into stopped traffic as I was releasing the accelerator, and I had to press extremely hard on the brake pedal to force the car to stop while the motor was still trying to accelerate. That was really unnerving!
Also on the 5 freeway, the then experimental self driving mode would systematically drift to the right when the painted line on the right shoulder would disappear as a new lane from an entrance would merge with my lane, indicating that it was giving more weight to the right side than the left side in its tracking algorithm, even though the road was perfectly straight. The one time I had to intervene, it took so much force to override the steering that it violently veered into the left lane. After that incident I turned off that feature for the rest of the trip. Since then, I'd rather walk in the cold rain than hop in a robo-casket.
Source: youtube · 2025-06-01T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzrWJstsJEmFO71eMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHEAKhAO7x8sTXnEF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw77GzS8Ksbma-vm_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwe6cOnlNzpecTXGHZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxMrn7d5ofrk9IS5pJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmFMrPlU3lDSzydQB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxOjh51CmaJsTLlpxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwJLaUt8S2J2BKzTi54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOoqoNlYCMCCvsWq14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxUxe-Sls7XYXTDvEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
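The lookup-by-comment-ID view above can be reproduced offline. Below is a minimal sketch, assuming the batch format shown in the raw response: a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions. The helper name `index_by_comment_id` and the two inlined sample records are illustrative, not part of any tool shown here.

```python
import json

# A raw LLM response batch in the shape shown above (two sample records
# copied from the array; real batches carry one record per comment).
raw_response = '''[
  {"id": "ytc_UgxUxe-Sls7XYXTDvEN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwmFMrPlU3lDSzydQB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index its records by comment ID.

    Records missing an "id" or any coding dimension are skipped, since a
    model can drop or rename keys in an individual record.
    """
    index = {}
    for rec in json.loads(raw):
        if isinstance(rec, dict) and "id" in rec and all(d in rec for d in DIMENSIONS):
            index[rec["id"]] = rec
    return index

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxUxe-Sls7XYXTDvEN4AaABAg"]["responsibility"])  # → ai_itself
```

Indexing by ID rather than array position keeps the lookup stable even when the model returns records out of order or drops a malformed one.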