Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_ibdd9sn`: Wow... this comment section is full of people going all white-saviour and rambli…
- `ytr_Ugw-t75aO…`: @DrDroney Someone unplug that robot, put it out of it’s misery if it only exists …
- `ytc_Ugx4GSCuW…`: An AI shouldn't have the same rights of a human no matter how "fair" it is which…
- `ytc_UgwZtXrK6…`: uh huh. and I'm sure none of those podcasts were ai generated and the ones that …
- `ytc_UgwzbDtPg…`: Everyone needs to know about anti facial recognition glasses. They're spendy but…
- `ytc_UgxCKkPQr…`: If all will done by AI, people have no aim or meaning left for do anything or …
- `ytc_Ugy-XPhxF…`: Thanks, Jeff & Steven. We’re testing “Reinforcement Learning from Maternal Feedb…
- `ytc_UgxsNhgPv…`: i feel like ai art will become the sort of fast food of the art community, and h…
Comment
Autopilot(AP) and Full Self Driving (FSD) are different features. AP is like smart cruise control, you need to be looking forward and grabbing the wheel— it really only works well for freeways and straight to slight curve streets. FSD is a different beast and regularly drives me across a whole major city thought lots of different terrain. FSD you dont hold your wheel and just need to supervise (look forward). It sounds like this was about AP but you repeatedly showed FSD and spoke about the examples and issues like it was the latter. This feels entirely like user error to me. If you push on the gas with AP the screen literally says something along the lines of "Acceleration pressed, autopilot will not break". The driver is going to end up taking full blame on this as he should.
Platform: youtube
Incident type: AI Harm Incident
Posted: 2025-09-05T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqoIP7u2GSwBi5rz54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxa0-c3sxiuSGNPvEV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzpGzAvcuyZt4SwF494AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwNQlaSWmYVe_Asz1N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynzuDLzhF8xH0C6K94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyHPstaoYLAPfBa20d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxQKJJoStx2AYnSHit4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzC_47EDVA5k-PojkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwBKeHwkxcw-79UqpN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzqTAV7UHm32dM-OtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
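The raw response is one JSON array per batch, with one coding record per comment. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID follows; the function name and the two-record sample payload are illustrative, not part of the actual pipeline.

```python
import json

# Illustrative raw LLM batch response: a JSON array of coding records,
# mirroring the structure shown above (two records only, for brevity).
raw_response = '''
[
  {"id": "ytc_UgzqoIP7u2GSwBi5rz54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgynzuDLzhF8xH0C6K94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
'''

def index_by_comment_id(response_text):
    """Parse a raw LLM batch response and index its records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)

# Look up one coded comment by its ID, as the inspection page does.
record = codings["ytc_UgynzuDLzhF8xH0C6K94AaABAg"]
print(record["policy"], record["emotion"])  # ban outrage
```

Indexing into a dict keeps the by-ID lookup O(1), which matters when a coding run spans thousands of comments across many batch responses.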