Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “Interesting how after Trump was elected US president, /r/Futurology/ was swamped…” (rdc_deni72e)
- “As long as Trump isn’t in office right? Because if it’s done during his administ…” (ytc_UgzH2AttD…)
- “I've been trying to explain this very point to a lot of my friends. We don't und…” (ytc_UgxaJbaZF…)
- “While it is true AI models infer your input to generate code based on data they …” (ytc_UgxA6jgyB…)
- “Please stop using AI. Think about the polar bears. They are DYING because of app…” (ytc_Ugw1klNXz…)
- “Fascinating that Hank anthropomorphises AI referring to ‘them’. Can we separate …” (ytc_UgyrknVmt…)
- “Realistically ai now just learns off of text prompts you give it. But if we deve…” (ytc_UgxAOAm9z…)
- “Wild idea. Our AI will join the existing ones. Same ones that have been running …” (ytc_UgwarddtR…)
Comment

> Why are we blaming user error on the product? First, it was books, movies, and video games; then, it was the internet and social media. Now it’s AI. Some people need to accept personal responsibility. Yes, we must make the product safer, but we can't always prevent 5% of users from misusing it.

youtube · AI Harm Incident · 2025-11-08T15:5… · ♥ 25
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyS2tt-BKp_nb0VmoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgysmBY68nNWN1U0pVJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0_KneLw9TemsiZdh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwG6tsm4rKoFpw52VB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxhRn-GYRPjwACfZfB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFM0aqiexAoJFWV2h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEcKxZGW34vEOHDgF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwadR1cgGxGfg4gxH54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxX4vJuOhum0otOidB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzhct9OxwXoJgW3tAJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
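The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of a lookup by comment ID over such a response (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the sample output above; the helper name and snippet are illustrative, not the tool's actual code):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgwadR1cgGxGfg4gxH54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzhct9OxwXoJgW3tAJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw)
# Look up the dimensions coded for a single comment.
print(codes["ytc_UgwadR1cgGxGfg4gxH54AaABAg"]["responsibility"])  # user
```

Indexing by `id` is what lets the viewer match a displayed comment to its row in the model's batch output.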