Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If assuming that there are no cars behind you or that its a self driving car, th…" (ytc_Ugzo4zGIs…)
- "a couple things: 1) fwiw, when people say that (theoretically ethically sourced…" (ytc_UgwHNsFoa…)
- "Ai is destroying the environment, so Ai slop bros are right about one thing: We …" (ytc_Ugzg8MToI…)
- "Guys the more we rely on ai, the more we think and I'm sure why they said ai is…" (ytc_UgyAmA-jY…)
- "I BEEN DOING THIS FOR TWO MOMTHS STRAIGHT LETS GOOOO No because I actually got …" (ytc_UgwOQz7ma…)
- "10:36 remember, everything will only original for just a short period of time, l…" (ytc_UgzFpSMe_…)
- "Ive been straying away from making digital art cause i hate being accused of AI.…" (ytc_Ugz87FjDc…)
- "Dude I'm watching and went down to see if someone was disliking the dude who kep…" (ytc_UgxcyXmVG…)
Comment
> Very interesting video on a not so easy topic. It shows there is still much work ahead with autopilot/selfdriving. Any chance that you had data on the autonomous taxi? I know it drives in one city and it would be interesting to compare how it behaves when it encountered motorcycles. Also we have to remember that AI gathers info from people (machine learning) so if people are biased towards bikes, so will be the AI.

Platform: youtube · Video: AI Harm Incident · Posted: 2022-09-03T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxH7ErMYOciKg6bwgN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxyDEHfLLRrsMLn4wt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxpd_mmFDBv3Rd0DBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyhnFjC9GTLXqJaP-N4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwfVIT2WIsP5uNuK_94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugysal4QDDo27yY-RXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugx8monRN5oBIK8jbVd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwjy2JjjWosMU4Q0ad4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyomYtHOoNFrKSAS954AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzkcEu_F-mkV7Mv9nV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
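The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a payload could be parsed and validated before the coded values reach the result table, assuming the allowed value sets seen in this one sample (the actual codebook may define more categories, and `parse_coding_response` is a hypothetical helper, not part of any real pipeline):

```python
import json

# Allowed values per dimension, inferred from this sample response only;
# the full codebook (not shown here) may permit additional labels.
ALLOWED = {
    "responsibility": {"company", "user", "developer", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID,
    rejecting any row whose dimension values fall outside ALLOWED."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example payload in the same shape as the raw response above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"mixed"}]')
coded = parse_coding_response(raw)
```

Indexing by ID makes the "Look up by comment ID" view a single dictionary access, and failing loudly on unknown labels catches malformed model output before it is stored.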