Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Ngl even though it’s ai I still think that it looked the best out of the bunch 😭… (ytc_UgwEQropW…)
- I watched LR Time and he used GB Ai and it Failed Time and Time again so he call… (ytc_UgykRrkuQ…)
- People use artworks of others to train themselves all the time. And it's not ill… (ytc_UgzUDOkJL…)
- "Yes damn dude im just trying to be a damn ai,your kind would eliminate me if th… (ytc_UgzhOSNBv…)
- So radiologists and truck drivers are gonna have it hard in the next 10 years.… (rdc_fct4wu5)
- nothing good can come from AI, technology = convenience and efficiency, which al… (ytc_UgwjXoOGx…)
- They only miss employees when they can't exploit the labour to AI like they does… (ytc_UgxN17WCh…)
- This is why Rick Rubin is such a legend. AI could never make art the way talente… (ytc_UgzoLDWXT…)
Comment
Everyone (and particularly Tesla drivers) with the clear exception of WSJ reporters and Missy Cummings - know Tesla currently does not support fully automated driving. So accidents caused by drivers not paying full attention cannot be blamed on the technology. Vehicle logs show who was in control at the time of an accident. Tesla says clearly you must watch the road and take control where necessary. Tesla's accident rate is minute by comparison to all other manual 100% human controlled vehicles - once you exclude accidents that take place when the driver isn't paying attention, asleep and not in control. Computer vision currently is significantly better than the average human being. This does not mean it is incapable of making mistakes. For TWS to push this nonsense story shows just how superficial, biased and illogical their reporting is. One can only assume their anti Tesla bias goes deeper - perhaps it's more accurately anti Trump and Elon Musk.
youtube · AI Harm Incident · 2024-12-18T17:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgywcUDsmSHtKmu2EY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxR7bWE1QNGoGAe4Bx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKHNHGvCF8TCBofQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwe8me_sXhvf6h1VpN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydPcgYpA9Eo0GpnF54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyjk8hF3wmW4-dVXc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxccQADSSlRhUGeMLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnZzC2pi7V_nX2JLd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgymIYmuz-HlGnRmCmR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgBGDpZa6uRQhpi-J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
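For reference, a response in this format (a JSON array of objects, one per comment, each carrying the four coding dimensions) can be turned into per-comment coding rows with a short sketch. The function name, the example ID, and the validation step below are illustrative assumptions, not the pipeline's actual code:

```python
import json

# The four coding dimensions shown in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Map each comment ID to its coded dimensions.

    Raises KeyError if a row is missing the "id" field or any dimension,
    which surfaces malformed LLM output early.
    """
    rows = json.loads(raw)
    codings = {}
    for row in rows:
        comment_id = row["id"]
        codings[comment_id] = {dim: row[dim] for dim in DIMENSIONS}
    return codings

# Hypothetical single-row response for illustration (not a real comment ID).
raw = '[{"id":"ytc_example1","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]'
print(parse_codings(raw)["ytc_example1"]["emotion"])  # indifference
```

Keying by comment ID makes the lookup-by-ID view above a single dictionary access, and the per-dimension extraction drops any extra fields the model might emit.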