Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The value of those homes and houses will sink tremendously, because NOBODY will…" (ytc_Ugx6WuUvI…)
- "AI isn't life, it doesn't experience pain or emotions, they can only simulate th…" (ytc_UgzQaxjMd…)
- "„i cant draw so i just use AI to make oc‘s“ *throws gachalife and picrew at you…" (ytc_UgwccMxAX…)
- "@blackvulture7999 it's funny, one of the first things i tried asking chatgpt was…" (ytr_UgwSNE47Y…)
- "The AI's concession that it makes ethical choices seems unnecessary to me. There…" (ytc_UgyKjDp0n…)
- "Umm. . . . i hate to say this, but Neil is NOT on the INSIDE information in thes…" (ytc_Ugzl_Hg-_…)
- "What's hilarious is that people wanna bitch an moan about AI art, but they are l…" (ytc_UgxqBte6v…)
- "Twitter users when someone posts an AI drawing [He had \"AI artists\" in his bio (…" (ytc_UgxeQcR-m…)
Comment
Everybody talks about the driver being at fault (which is true), but I just want to point out that Teslas use cameras when every other self-driving technology is based on Lidar. Lidar can sense the depth of obstacles, cameras can't, it can only guess obstacles based on photo/video taken.
So why didn't Tesla use Lidar? Because it is cheaper. Tesla remains the only company that uses cameras for it's L2 capabilities, and uses misleading marketing to call them "autopilot" when in fact they aren't.
That is how these crashes happen. Lies. Greed. An intention to mislead users and push blame on drivers. A misinformed driver who wouldn't drive intoxicated otherwise, broke every rule because he was misled to believe, by Tesla, that it was safe to let Autopilot do the driving for him.
Source: youtube · AI Harm Incident · 2025-02-22T05:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyAFQBeSTNdDFCNLCd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzFJUGkjhTnVZaO2cd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz8Kq0kkDafXWeuKAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyJw0lBpwlB8t0uM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6BdvFC32STL9goAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVwdsHIRRfPHIeXlh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwisFB6Iwt5bwYPyDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFsr2csxlbD3r8UiN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgypHeJg7YS7YKruSSd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxh50gLsl7n_pjy-594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
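The raw response above is a JSON array with one object per comment, each carrying the same five fields shown in the Coding Result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed, validated, and indexed by comment ID for lookup; the function and variable names are illustrative, not part of any actual pipeline:

```python
import json

# A trimmed example of a raw LLM response, matching the structure above.
raw_response = """
[
  {"id": "ytc_UgyAFQBeSTNdDFCNLCd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzFJUGkjhTnVZaO2cd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# The five dimensions every coded record is expected to contain.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Records missing any required field are skipped rather than indexed,
    so a partially malformed response still yields usable lookups.
    """
    records = json.loads(raw)
    return {
        rec["id"]: rec
        for rec in records
        if REQUIRED_FIELDS.issubset(rec)
    }

coded = parse_codings(raw_response)
print(coded["ytc_UgyAFQBeSTNdDFCNLCd4AaABAg"]["policy"])  # liability
```

Indexing by `id` makes the "look up by comment ID" view a dictionary access rather than a scan over the array; skipping incomplete records keeps one malformed entry from blocking the rest of the batch.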