Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If you're replacing employees with shitty AI products, you deserve to go out of …
ytc_UgxPF_D6Z…
Hi there! In the video, the presenter asks the robot about the meaning of the na…
ytr_UgwBZOzF0…
OK, I’m a disabled and I can’t believe I have to make this point over and over t…
ytc_UgzwDTUW5…
AI is trash and the people who use it to steal other people's art are trash.…
ytc_UgxpGo2x6…
It is always easier to build rather than fix. Execs just don't understand that c…
rdc_ocp33xq
"I don't consume the method, I consume the product" is such an amazing argument.…
ytc_UgxXMgVDc…
Here's a scary and true fact: We, humans, built filters directly into AI program…
ytc_Ugwf_E1ec…
ACLU configures facial recognition software for 80% confidence threshold then wa…
rdc_ewsmh6i
Comment
If those robots are real, and not CGI, then that was a very dangerous thing to do. To give the robots live ammunition and not be wearing a bulletproof vest. Also not having the robots in a secure place. What was to stop the robot turning the gun on the man, due to a malfunction.
youtube
AI Harm Incident
2024-04-11T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwILSFFpqwGfeCtyOJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJ2_XcoFXgmZGKBbp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxUTH-x1M3twxs3agl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3S8195aWS9rAHqZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyX-QVBEIq7SjpPwRp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwAWurUgEJMDu4FQDZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmnqSeNYvJsIHb3tN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRlZiJmoWLDaiDAfR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoXxH7sL5vWMikddV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDT1m4wd9LIWmtvCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
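The "Look up by comment ID" feature above can be sketched as a small parse-and-validate step over a raw batch response like the one shown. This is a minimal sketch, not the tool's actual implementation; the allowed value sets below are inferred only from the labels visible on this page (the full codebook is not shown), and `validate_batch` is a hypothetical helper name.

```python
import json

# Dimension vocabularies inferred from the visible samples only;
# the full codebook is not shown, so these sets are assumptions.
ALLOWED = {
    "responsibility": {"user", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "fear", "resignation", "outrage",
                "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid rows by comment ID.

    Rows missing a required field, or carrying a label outside the
    assumed vocabulary, are dropped rather than silently kept.
    """
    coded = {}
    for row in json.loads(raw):
        if not {"id", *ALLOWED} <= row.keys():
            continue  # skip rows missing "id" or a dimension
        if all(row[dim] in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response indexed this way, rendering the "Coding Result" table for one comment is just a dictionary lookup by its ID.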