Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytr_UgyILbpK0…` · bro what. it's not inspiration it's anger. They showed the ai "artist" that real…
- `ytc_UgyoDglZN…` · Nobody worry about a thing, I used an Ai to help me solve this mass unemployment…
- `ytc_Ugz0WFBUh…` · I hate genAI but I hope you read parts of the article where they explained it wa…
- `ytc_UgxvTW0iz…` · My dad has always been pro-AI, and he said people who get replaced need to just …
- `ytc_UgxtD9QIc…` · Those old drawings are actually so cute it's like my art style when I draw human…
- `ytc_Ugzwv0FxA…` · I support AI, because when applied it will make your process as a patient, or so…
- `ytc_Ugy6ciJ9G…` · I write to the ai like I write to a machine, because it is a machine…
- `ytr_UgxsOUmRI…` · He was most likely shot by cops because of that ai claiming he was violent…
Comment
> 1:21 Exactly! This is NOT something we should be highly concerned for now. We should start by turning our focus into developing the technology and educating the general public that YES, self-driving cars ARE SAFER, as they have a much MUCH lower rate of accidents per distance travelled than human drivers do. Cases like the one depicted in this video will rarely occurr and wil defintely ocurr much less than in our current roads, dominated by human drivers.
>
> In my opinion, videos like this one are just damaging the general public's perception of self-driving cars and pushing it's development even further....
youtube · AI Harm Incident · 2022-01-05T14:2… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzzec3Twn63agGPyDB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz8wJCpFoQ2L1TPwT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDg--Hfm2lG0jR6Ut4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWoJcDFo_ekiyvEmt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8hBTPSf8XBnRxR9t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw4jM93_9cAtGe9wgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCoTNgNzS8ucWLuet4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyKUDGVaTLJ7c09rdd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxIAyCois5Y25HZHYZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
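A raw batch response like the one above has to be parsed and sanity-checked before the per-comment coding results can be displayed. Below is a minimal Python sketch of that step. The allowed value sets are an assumption inferred from the examples on this page, not a documented schema, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Assumed dimension vocabularies, inferred from the coded examples above;
# the real coding scheme may include values not seen on this page.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw):
    """Parse a raw JSON array of coded comments and index records by
    comment ID, raising on missing keys or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]  # KeyError here means a malformed record
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["emotion"])  # mixed
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once the batch is validated, each inspected comment's dimensions are a single dictionary lookup.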