Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What is not being talked about is why students feel the need the use AI. Here ar…
ytc_UgypH83aE…
I have a GitHub Copilot and a Claude Pro subscription. I hardly ever use Claude …
rdc_ohurhq7
7yrs of a mystery illness that nobody can figure out ... Maybe I'll download CHA…
ytc_Ugweako5Z…
Is AI seriously not that obvious (in the last case) as it is to me?…
ytc_UgyidDNZ1…
Dont use autocomplete AI. Use direct queries for programming. It'll send you in …
ytc_UgyGbfsVQ…
Currently have been writing fiction and other things that I post on YouTube, tha…
ytc_UgwzAoLP6…
To me it seems like the AI bot is like a speak and spell that nephilim can use w…
ytc_UgxDe6OkL…
All jobs that can be done online in danger , lawyer , doctors , teachers manager…
ytc_UgxLt73wo…
Comment
Well, if Ai would to know all of a history of a one person, the AI might think he is a criminal. Nobody gives you the full context on how they got to the positions
You could also say that AI just mimics human behaviour.
Also, I like how when 300 human kills 300 someone it's not that interesting, but when just 1 AI does it, it is lighted at the top of a tower with bright red arrows pointing at the headlines.
like, literally, AI learns only what it sees, if sees constant aggression towards each other, then it will do it too
youtube · AI Harm Incident · 2025-07-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzHN0aHaovq7eeuyCV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzXGDEgrTHvwu0uLmJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVgOy5YXG04NKcc954AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxkKOodE7wVmT03RJV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxAmjXa6TmFc0mYUnJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzDnmttbqB9oF5m0uh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxcukoamWoMCCWN0Eh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwLYHV3zssnKAr7Kyt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx1cjMRO8w7VjwC8T54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsjlRRJmn5aUpBubd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
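The lookup-by-comment-ID view above can be sketched in a few lines: the raw LLM response parses as a JSON array of records, each carrying an `id` plus the four coding dimensions, so indexing it into a dict keyed by `id` gives constant-time lookups. This is a minimal sketch, not the tool's actual implementation; `codes_by_id` is a hypothetical name, and the two records shown are copied from the response above.

```python
import json

# Raw LLM response: a JSON array of coded records (two records
# copied from the full response above, for illustration).
raw = '''[
  {"id":"ytc_UgzHN0aHaovq7eeuyCV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxcukoamWoMCCWN0Eh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]'''

# Index records by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

# Looking up the comment shown above reproduces its coding table.
rec = codes_by_id["ytc_UgxcukoamWoMCCWN0Eh4AaABAg"]
print(rec["responsibility"])  # ai_itself
print(rec["emotion"])         # indifference
```

Keying on the comment ID also makes it easy to join the coded dimensions back onto the original comment text when inspecting individual samples.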