Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):
- `ytc_UgyQgfthY…`: Minorities have less well known morbidity and pathophysiology compared to the ma…
- `ytr_UgyqdGNyz…`: Thanks for commenting on this. The artist has apparently been called out for "sw…
- `ytc_UgzLh6OoY…`: He retired from google because he was 75 and struggling to remember certain thin…
- `rdc_mthe1ae`: The only way you can tell it's AI is that it's so vapid and banal it's like a co…
- `ytc_UgwqBYERw…`: i don't understand why copyright laws do nothing against Ai. it's like law appli…
- `ytc_UgyQpnA1D…`: Ai to help the parasites stay in power 💯 if you can't see that buck up and study…
- `ytc_Ugx-gLVAH…`: I mean if he wants to compare AI to cameras. I can walk up to a drawing and take…
- `ytc_Ugya-vEbg…`: The weirdest thing I've noticed with AI, especially Sora, is the amount of anima…
Comment
How much is known about exactly how the steering control etc is set up in this particular model?
I have no idea, but logically (and especially as Tesla don't like to use any unnecessary parts), I would have thought the steering column would be arranged something along the lines of (in this order):
- Steering rack
- Motor/actuator controlled by the car (i.e. what's used by Autosteer/FSD, and probably the power assisted steering too)
- Torque sensor (as I imagine these work by measuring the torque forces between two sides of a particular point on a shaft, but I don't know)
- Steering wheel
If this is the case, then surely resistance from Autosteer/FSD to the user turning left wouldn't show as steering torque to the right; it would show as extra torque to the left, since the sensor measures how hard the user is resisting whatever the steering rack/the car's actuator is trying to do. But it also means that any leftward steering action caused by the car wouldn't register as torque to the left at this sensor either, so a leftward torque reading would prove it wasn't the fault of the car but some sort of user-side steering input that turned it left.
On the other hand, if the car does have a second motor on the steering wheel side of the torque sensor to resist against user movement etc (as you seemed to mention), this would fit with the Autosteer resistance showing as a bit of rightward torque after the initial torque left. But it would also mean that it's possible that some sort of fault of that motor could have been what forced it to the left, as that would show as torque to the left even if the user was trying (but failing) to overpower it. It still wouldn't necessarily be a fault of the FSD system (i.e. the neural network etc), more a hardware fault at a different level of the car that was outside of FSD's control.
So I just feel like we really need a lot more information about how things are arranged specifically in this model of Tesla, in order to make proper sense of these sensor readings and determine possible causes. I know that on my 2022 UK (made in China) Model Y at least, on Autopilot on a motorway, I can rapidly wiggle the steering wheel from side to side within the zone that won't deactivate Autopilot and it's enough to make the car rock from side to side, so I feel like Autopilot is only controlling the actual steering rack rather than being able to resist my movements more directly before they get that far down the steering column. But I don't know whether this car might be different (I assume it would be a "Highland" Model 3 built in the US?)
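The commenter's sign-convention argument about the two hypothesised layouts can be sketched as a toy model. Everything below is an illustrative assumption about an unknown arrangement, not Tesla's actual EPS design; the convention "positive torque = leftward" and the function name are invented for the sketch.

```python
# Toy model of torque-sensor readings under the comment's two hypothesised
# layouts. Convention: positive torque = leftward. Purely illustrative.

def sensor_reading(driver_torque: float, wheel_side_motor_torque: float = 0.0) -> float:
    """The sensor sits between the steering wheel and the rack-side actuator,
    so it only sees torques applied on the wheel side of it: the driver's
    input plus (in layout B) a hypothetical wheel-side resistance motor."""
    return driver_torque + wheel_side_motor_torque

# Layout A: a single rack-side actuator, below the sensor and thus invisible
# to it. The sensor reads pure driver torque.
assert sensor_reading(driver_torque=3.0) == 3.0   # driver pushing left shows as left torque
assert sensor_reading(driver_torque=0.0) == 0.0   # car steering itself shows no torque at all

# Layout B: an extra wheel-side motor above the sensor. A faulting motor
# pulling left can dominate the reading even if the driver fights it rightward.
assert sensor_reading(driver_torque=-2.0, wheel_side_motor_torque=5.0) == 3.0
```

This captures the comment's point: under layout A a leftward sensor reading can only come from the driver, while under layout B it could also come from a wheel-side hardware fault.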
youtube
2025-06-05T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzsM780CqQLmpRoU7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzQp13q6if54rP5-a94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxZ3YCfm1AZNIs92Wh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwa3dcOME_BsioaxSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTJab6fN31mOYW03p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz_joCEi-bCuZslGSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw9I5R9nTtoKy1pLrF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxwk3qd3qYjAUBJsV54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyqToj03HOVG9b0j6x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxIMr1WjbVKKkh4Xnp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]