Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
and AI has only got 10X bigger since this video lol..youtube cares about hits an…
ytc_UgzVIkpxu…
Though I may disagree with his conclusion, the conversation is interesting becau…
ytc_UgxxHXRxJ…
These videos make me laugh so much. Ten months later AI won. Now the artist wil…
ytc_UgwJZTF9M…
people who defend ai don't know what their talking about, they are also the peop…
ytc_UgyOhakx6…
@HeeeeeeyHowAreYou government can increase tax for robots and ai use, and decre…
ytr_UgxihDeCB…
AI produced imagery is already becoming fatiguing. I can spot AI thumbnails a mi…
ytc_UgxWuvCKE…
Im with Hayao on this I am generally disgusted with Ai art because there’s no cr…
ytc_UgxLf4HbA…
Good LLMs need multiple 40GB nvidia cards to run at any usable speed. You're loo…
rdc_jg87h7y
Comment
Look, I despise Musk, and anyone being permanently injured and/or killed on the road is a tragedy. I also don't own a "self-driving car" nor have I ever owned a Tesla. That said, if we compare the accidents humans cause versus the number of accidents self-driving cars have caused, the difference is astronomical, even factoring out the percentages of said vehicles on the road. Not only this, but a self-driving car never drives drunk, never gets into road wars with other cars, etc. Driving to work, you can see dozens of people on their cell phones. People can't seem to put down the phone, so I think we need self-driving cars......
youtube
AI Harm Incident
2025-08-16T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzY9pmELx64ghMC_wV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLoINY9kFe03x5mZl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyN1EBnNyibJ7O40DB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_jWCDbuSaRQBAPBh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwkZkk1SpBCdpTvToZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"ban","emotion":"indifference"},
  {"id":"ytc_Ugz6chf2timwKrLqckJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRD7zfqtNLq9FR8ux4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz8dm3JWe9ATDoddep4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzHwNntf2AVGseRYtp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRV6RhMObFzKd5x5t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
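The "Look up by comment ID" feature above amounts to scanning this JSON array for a matching `id` and rendering its four dimensions. A minimal sketch of that lookup, assuming the raw model output is stored as a JSON string (the `lookup` helper and `raw_response` variable are illustrative, not the app's actual code):

```python
import json

# Raw model output as logged above, abridged to two records for the sketch.
raw_response = '''[
  {"id":"ytc_UgzY9pmELx64ghMC_wV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLoINY9kFe03x5mZl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

def lookup(raw, comment_id):
    """Return the coding record for comment_id, or None if it is absent."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            return record
    return None

rec = lookup(raw_response, "ytc_UgzY9pmELx64ghMC_wV4AaABAg")
print(rec["responsibility"], rec["emotion"])  # → none indifference
```

The first record here is the one shown in the Coding Result table (responsibility `none`, consequentialist reasoning, no policy, indifference), so the lookup reproduces that row.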