Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It's started with Walmart automated cashiers. It's coming for trucking and eve…" (ytc_UgwOnakWr…)
- "So we dont care because the humans living now wont experience that? Tech bros in…" (ytc_UgzoUmSUL…)
- "The level of delusion in these types of interviews is rather astonishing, for ma…" (ytc_UgxySxJv2…)
- "Spread the word THEYRE NOT AI ARTISTS, THEYRE AI PROMPTERS!!! Can't give em the…" (ytc_Ugy77mmbr…)
- "Of course ChatGPT gives skewed answers. What did you expect – a raw thinker spaw…" (ytc_UgwmAidzb…)
- "I am deeply afraid of ai generated art. it activates primal fear in my nerves.…" (ytc_UgwlPkxfu…)
- "I am glad people are already developing countermeasures to AI plagiarism. I rece…" (ytc_Ugz1mxpBa…)
- "It was jail broken..... it wont do that normally... he had his jail broken.. jus…" (ytc_UgybUMQSJ…)
Comment
Some cars only have Lane Keeping Assist (LKA) without speed control, and in this situation, any other car likely would have crashed as well. The video seems to be using outdated Autopilot data rather than showcasing the advancements in 2023 or 2024 with versions 11 and 12 of Tesla's Full Self-Driving (FSD) software. It feels like the focus is on banning FSD, when the real issue is raising awareness about the importance of staying attentive while driving. The software being discussed here is obsolete and no longer represents the current capabilities of FSD.
Source: youtube · Topic: AI Harm Incident · Posted: 2024-12-13T20:5… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyx-NZRvhG_dMrVsa94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHk49YMSu3pNkmrbZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwnC3TSEA8fkeUxNT94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwF5I7IUfVSIg4szTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwfpnohY6IvWJysC9Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyTibsMnL_XXgIc-gB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzVXQMQJBBjau07JSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw_FiFRSimmCFKLg194AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugxc1L9yDFMp3B-auDp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxRZpyc1hXnNFt5et94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
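The coded result shown above corresponds to one row of this array (id `ytc_UgwF5I7IUfVSIg4szTt4AaABAg`, coded distributed / consequentialist / none / resignation). A minimal sketch of how the lookup-by-comment-ID step might work, assuming the raw LLM response parses as a JSON array of per-comment codes; `lookup` and `raw_response` are illustrative names, not part of the tool:

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
raw_response = '''
[
  {"id":"ytc_UgwF5I7IUfVSIg4szTt4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_FiFRSimmCFKLg194AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
'''

def lookup(raw, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None

codes = lookup(raw_response, "ytc_UgwF5I7IUfVSIg4szTt4AaABAg")
print(codes["responsibility"], codes["emotion"])  # distributed resignation
```

In practice the response would first be validated (well-formed JSON, expected keys, values drawn from the coding scheme) before any row is displayed in the result table.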