Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I'm calling it now: One day there's gonna be an animated film released that was …
ytc_Ugy5e7enJ…
i hate how pro ai ”art” people use disabilities and disabled people as a shield …
ytc_UgxxEolU-…
I like to use AI generated art as concept art when I'm trying to put something t…
ytc_UgyzoTRuf…
Being an artist can mean a lot of different things and isn't some kind of super …
ytc_UgwK7BOI5…
Thank you for your insights. The AI world is coming on way to fast without human…
ytc_UgwEFTJol…
Hollywood Writers: “AI will take our jobs!”
Fans: “You say that like it’s a bad …
ytc_UgxBNCyRD…
This video misses the long term goal of AI. The goal is to replace as many jobs …
ytc_UgztYM1l2…
I have to get this off my chest.
I am jealous of people who can do things I ca…
ytc_UgwJTyFFy…
Comment
People who say it's the pedestrian's fault, and the dumbass should have looked before she crossed ....blah blah blah ....they're just totally not understanding the issue here.
The issue is that this type of scenario is extremely common. Any competent autonomous car should have detected the pedestrian. So what if it was dark..? I would have thought they're supposed to have lidar & radar and shit like that for seeing in the dark.
What gives?
It would appear there has been a critical failure in either the sensors ...or the software simply failed to respond correctly.
And either scenario is simply unacceptable. This goes to show just how shit the technology still is ...and how much more work still needs to be done.
YouTube
AI Harm Incident
2018-04-28T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwT6CzwEalsc0GRvk94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx81CE_mpNZMIFasoB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw37tso1jWIqITp3op4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwn08bQGUe5XQClK7N4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwi7VOngjBC1BEMqD54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyzUGCjuZwGNqNx4FJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyOyYIaRH_XV46nRiZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwLTVoZzUN7QFqSVMp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw4p04_FSUAO5gGf-14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzKcm7DeUQgw_CM4Mt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
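The raw response above is a JSON array with one row per comment ID and one code per dimension. A minimal validation sketch follows, assuming the allowed codes are exactly those visible in the samples and table above (the real codebook may contain additional values); `SCHEMA` and `validate_codes` are illustrative names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above.
# Assumption: the actual codebook may define more codes than these.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing or unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset are prefixed "ytc_".
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} code {row.get(dim)!r}")
    return rows

# Hypothetical single-row response in the same shape as the output above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
rows = validate_codes(raw)
```

A check like this catches the common LLM failure modes for structured coding: truncated JSON, invented code labels, and dropped dimensions, before rows reach the database.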