Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytr_UgxPgLpt5…`: @thunder_nuggets Youtube's algorithm punishes small creators at every step of th…
- `ytc_UgyEjOvy-…`: The "blue blood" of artistic talent??? What does that even mean lmao, why is it …
- `ytc_Ugxsq1ztA…`: Probably the most honest and informative interview on AI I have seen, but the ma…
- `rdc_dftz5j7`: That, and the liability issue. The question of who is at fault if a self-drivin…
- `ytc_Ugz6ME4iy…`: Ok everyone knows that they are programed by a human. He intelligence lies with…
- `ytc_Ugz97-BFx…`: Poor FSD AI. If AI is still in its infancy then you are scaring a toddler……
- `ytc_UgxvqISaQ…`: As an innocent character AI user, This is relatable as hell because I was chatti…
- `ytr_UgxJx1zpT…`: @laurentiuvladutmanea uh, have you ever actually used Ai before? Both of those t…
Comment

> I understand the issue, but the blame here shouldn't be put on Tesla, at least not entirely. Both crashes could have been avoided, only if the drivers were paying attention. The issue is people not taking the warnings and precautions seriously. We need more emphasis on the fact that IT'S NOT DONE, not autonomous and won't be for a good while. There are some issues with the way Tesla presents autopilot, such as the name, it does make you think it's autonomous, though it's not.

- Platform: youtube
- Category: AI Harm Incident
- Posted: 2022-09-03T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyjO7jU7C9_3dXuZMh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx-qsSCuJt0RzRmIXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7w8Aw361ruube79x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxTlwzwIfpB9C6WUex4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxhkbLDptzQXqM_pgh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxI0TlqLfvFDclRmwJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz_ILKcfWNf4hu_VVt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygVskiDzdHMoyyDBl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxNmV0D8WkYPuNVKiR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzoB34e3dW3eIJCvdZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
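A batch response like the one above has to be parsed and checked before the codings are stored, since the model can emit malformed rows or out-of-vocabulary labels. The sketch below shows one way to do that; the allowed values per dimension are inferred from the examples on this page (`validate_batch` and `SCHEMA` are hypothetical names, and the real codebook may define more categories).

```python
import json

# Allowed values per coding dimension -- assumed from the sample output
# above; the full codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"approval", "disapproval", "outrage", "fear",
                "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows with no comment ID to attach the coding to
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"disapproval"}]')
print(len(validate_batch(raw)))  # → 1
```

Rows that fail validation would typically be queued for re-coding rather than silently dropped, so every comment ID eventually gets a coding result.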