Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I hate to be the one to say this, but you learn nothing by getting ai to write y…" (ytr_UgzRNb_Kl…)
- "Who will do the check on AI? I think on each layer there were still human needed…" (ytc_Ugw6T6aPU…)
- "thanks mam now i understand basic concept of ai and mc , and i am now motivated …" (ytc_UgzDioRm_…)
- "Self driving cars are inevitable. What is kind of scary is human drivers sharing…" (ytc_UgzuINoqH…)
- "Uhm...since the public are shown these, imagine what the realistic ones can do i…" (ytc_UgwgA_LXd…)
- "AI art can’t be copyrighted, so I assume there can’t be legal commission money m…" (ytc_Ugx7ARs6X…)
- "I understand that it is a win for your cause of AI. However please be aware that…" (ytc_Ugx6UTwAv…)
- "Government support means government owning a % of liability and blame. Doesn’t f…" (ytc_UgxtlV0aT…)
Comment

> Would have not mattered if it was a human driving. Honestly as tragic and terrible as this is. I have to honestly put more blame on that woman for jaywalking in the middle of pitch black night, dark clothes, no lights or reflective gear. Plus the headlights were on so she didn't look both ways. Honestly this is just a terrible thing to happen but 110% preventable, mostly on the victim side. Yes, the "driver" wasn't paying attention but what's the point of self driving cars if you can't take your eye of the road here and there

youtube · AI Harm Incident · 2018-03-22T11:3… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
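Each coded record should only ever contain values from the codebook. As a minimal sketch of that check — with the allowed value sets inferred solely from the codes visible on this page, so the real codebook may well contain more categories:

```python
# Allowed values per dimension, inferred from the codes visible on this page.
# Assumption: the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"outrage", "resignation", "indifference", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result shown in the table above passes cleanly:
print(validate({
    "responsibility": "user",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}))  # → []
```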
Raw LLM Response
```json
[
{"id":"ytc_UgxzxBdh4ic0G0rVpBt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzH4QZI5SK5ZDudo5N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzeR2MAoXpzAeoJuSt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxaPEnnVPVJKXCaAH94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzENXYaM8mBbW0okPB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTx0_sm-U6lOTFXgZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzEyiLflYKd8fQR_zh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyclY637vkvrbuvSPF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwCv5H0mcTPDuELTat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyGFiRB7ETNFOrLDy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
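The raw response is a JSON array of per-comment records keyed by `id`, which is what makes the look-up-by-comment-ID view possible. A minimal sketch of parsing such a batch and indexing it by ID — the string below is shortened to two entries taken from the array above:

```python
import json

# Two entries copied from the raw LLM response above (shortened for illustration).
raw = """
[
{"id":"ytc_UgzEyiLflYKd8fQR_zh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGFiRB7ETNFOrLDy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Index the batch by comment ID so a single coded comment can be pulled up directly.
coded = {rec["id"]: rec for rec in json.loads(raw)}

record = coded["ytc_UgzEyiLflYKd8fQR_zh4AaABAg"]
print(record["emotion"])  # → indifference
```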