Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yes, things have gotten hectic and all the dumb stuff has been automated away.. …" — rdc_dt8myzf
- "7:29 Bold of you to assume that AI bros value the opinions or livelihoods of art…" — ytc_UgzjSo_Vj…
- "Then you're perpetuating the real problem, which is not AI or deepfakes, and she…" — ytr_Ugz3iU01_…
- "The future is really going to accept the excuse of the "freedom" as a reason for…" — rdc_degf19b
- "As someone trained as hypnotherapist, this happened due to the environment which…" — ytc_UgxgZ1TPD…
- "To be clear, this doesn't prevent AI from being involved in copywritten media. (…" — ytc_UgzzT80-Z…
- "I gotta say, I think I would enjoy an occasional sarcastic barb: "what! you've n…" — rdc_jigbqgk
- "AI replacing coders is a movement that is inevitable. I don't foresee a future, …" — ytc_Ugx2uABaH…
Comment
"Autopilot" is not the same as "Full Self-Driving". They are totally different software. People often don't distinguish the two. A 2019 Tesla Model X - which is the subject of this incident - is obsolete and it's capabilities are far interior to a Tesla of 2025. The car is a computer and becomes obsolete in a couple years. The camera also seems blurred by dirty glass and gross negligence by the driver. This is driver error, not equipment error. The driver should be liable as it is the driver who's in control of and has custody of the car. The car did not on its own, start driving absent the behest of the driver.
youtube
AI Harm Incident
2025-08-05T03:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxndwlX-msSGFFXV9N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpgKed5H-sGVoGpxJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzawCv9AZ4i5_fSyb14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYUMyAYfbnSIeTdqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzmPvF4stYa7OsCz054AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhcvIY7aLONXPvLIJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwlgP59r_Q14rAVqcR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0cWM6Do8Do3_PW0J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWmr4QRheAtxVRic94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4t6_xl72BjqctnjB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
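The coding result above pairs each comment ID with a value on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how the raw LLM response could be parsed and validated — the allowed value sets below are inferred from this one sample response, not from a real codebook, and the function name is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping malformed rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in SCHEMA}
        # Keep a row only if it has an ID and every dimension holds an allowed value.
        if cid and all(codes[dim] in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = codes
    return coded
```

Validating each row before storing it means one hallucinated or malformed entry in the model output silently drops out instead of corrupting the coded dataset.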