Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Why are everyone so offended against ai? Y'all knew it was coming. Love your art…
ytc_Ugypzvfkr…
Artist do get consent from other artists, that's simply just not true. It's trad…
ytr_Ugxd2CLL7…
The minute I get through to a company on the phone and I get the Ai robot I put …
ytc_UgzE6rU3e…
SO asked Doll-e to "draw a room with absolutely not elephants". The AI drew a ro…
ytc_UgxDdySUb…
The way i see it . We have a gift here that could potentially fix the world quic…
ytc_UgxtLTO3X…
Joe 12-pack here, been saying layman's version of this for a while. I feel like …
ytr_UgwgK26Fk…
this aged poorly lol, I can't distinguish fake from real anymore with the new ai…
ytc_UgwsMYEs6…
Humans doesn't need new 'partners'. We need tools that we can order to do what w…
ytc_UgzyAHY7B…
Comment
The thing we all need to understand is that its an image AI not a human or general AI.
So it will always be amazing at AI thing and terrible at human things. It may run people over on occasion in circumstances that us humans think are totally avoidable with even 1/2 a brain. But then it will also save peoples lives in circumstances where no human would ever see it coming.
So the only way to rank safety is on average deaths per distance traveled. Obviously it need to be non biased. I think jury is still out exactly how much safer it is on city street's. But the 'corrected' graph shown in the video provides a lot of hope that as autopilot improves it should be safer than a human. After all, we are still in the toddler stage of self driving AI tech. (i think we are out of infancy, but yeah, toddler level now)
youtube
AI Harm Incident
2022-09-06T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgznL3Qm-qB1cIAU-J54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyFnuIcC2E83vC13q54AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxYekWAF4-KbMFkbTt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwQDz15usNNoWgrvtx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxR5SvdQqzaD3FnHQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwAfNUvXu_BLCrm38h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyOPFj4Gd5xq0Y9muV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy_B1Y_wBlG3Ol1SS14AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwUgfF6gZg3zxfKvSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-bP7dGojk_CLOFjt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
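The coding result shown above for a single comment is derived from a batch JSON response like this one: the raw output is parsed and the coded dimensions are indexed by comment ID. A minimal sketch of that lookup, assuming the raw response is the JSON array format shown above (the `index_codings` helper and the two-entry sample are illustrative, not part of the actual pipeline):

```python
import json

# Raw model output in the format shown above (truncated to two entries).
raw_response = """
[
  {"id": "ytc_UgwQDz15usNNoWgrvtx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwAfNUvXu_BLCrm38h4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and index the codings by comment ID."""
    rows = json.loads(raw)
    out = {}
    for row in rows:
        # Skip malformed rows rather than failing the whole batch.
        if "id" not in row or not all(d in row for d in DIMENSIONS):
            continue
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = index_codings(raw_response)
print(codings["ytc_UgwQDz15usNNoWgrvtx4AaABAg"]["responsibility"])  # -> ai_itself
```

Indexing by ID is what makes the "look up by comment ID" view possible: each comment's row in the coding table is just the dictionary entry for its ID.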