Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Thank you for your comment! In our live broadcasts on AITube, we delve into vari… (ytr_UgxyDFzBt…)
- when you say "ai stans" is that a referenece to stan from eminem, cuz if so your… (ytc_UgynZQ1Ry…)
- 13:16 It says that because as far as the chatbot is concerned, it's existence be… (ytc_UgxMgnxLP…)
- Robots are being created to replace us..not to help us...if you cant see that yo… (ytc_UgyNfGveE…)
- I lost a lot of respect for Alex in this discussion. Every time a point was made… (ytc_UgyS_zR4O…)
- She ready stuck in that body and thinking help me people I'm not a robot vacuum … (ytc_UgzbO8ErI…)
- That's a shame. First place I ever scuba dived was Belize 16 years ago and I've … (rdc_dsbcn8j)
- Me when someone sees my chatgpt history / 😃 I'm gonna look / 😏 this will be so … (ytc_UgwnPlAGo…)
Comment
3:10 Does FSD have single point of failure in steering wheel torque sensor? If the torque sensor simply failed and started to show "user is applying heavy torque towards the left", could that alone explain the whole crash? (Compare this to Boeing 737 MAX which automatically rapidly guided the plane towards the ground when a single AoA sensor failed.)
See 4:28 for an example, the steering wheel torque starts to increase with zero change in steering wheel position. I would consider this as an example of sensor failure.
The accelerator pedal and brake pedal typically have dual sensors to avoid single point of failure but how about the torque sensor of Tesla steering wheel?
Platform: youtube · Posted: 2025-06-02T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyavX6Egk-rS_jacl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxtV3hpG-wjCaYofSh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyvDgzX8rmfrFKW6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugybfoa_WO7lTOUiWBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxID07JbIE3t7bK5Ql4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyIESFDiXfLMKPRh894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKaGEFLWJUP7DRQkx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzykjV9gVzkg5rGoG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
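The raw response above is a JSON array of per-comment codes. A minimal sketch of how such an array might be parsed into an ID-keyed index (to support "Look up by comment ID"), assuming value sets inferred only from the outputs visible on this page; the actual codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above
# (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of row objects)
    into a dict keyed by comment ID, rejecting unknown dimension values."""
    index = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row.get(dim)!r}")
        index[comment_id] = {dim: row[dim] for dim in SCHEMA}
    return index

# Usage: one row copied from the raw response above.
raw = ('[{"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes["ytc_Ugz8l773WT9wDfrBuLR4AaABAg"]["policy"])  # → regulate
```

Validating against an explicit schema at parse time catches malformed or hallucinated labels before they reach the coded-results table.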