Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @oscardalmatiner8724 If you get a commission do you own it? Obviously in this si… (ytr_UgyM7cYee…)
- Okay. Well, we know what to do about A. I pictures, but what can we do about ai … (ytc_UgzRIlTjn…)
- i have always hated nightshade in the way it messes with how the art looks, it m… (ytc_UgzWFQ0Kx…)
- LOL if you fed an AI every psychological study ever conducted it would probably … (ytr_UgxhljGMJ…)
- You can get the materials in description of my channel or you DM me at my instag… (ytr_UgwX3w8m-…)
- this dumbass: guys look AI art has actual soul / the internet: And thy punishment.… (ytc_Ugz6fvsZC…)
- S.W.I.M. Has a friend who works in A.I. / C@vid was, apparently some sort of pre r… (ytc_Ugy7TgrkU…)
- The problem with this kind of AI that they are not thinking about is that they m… (ytc_UgzI6Ti0M…)
Comment (youtube · AI Harm Incident · 2025-07-27T21:1…)
> Lets face it most programmers are slightly or even highly sociopathic so we should not be surprised that AI has gone this way. I think Peter Thiel might already be part Robot: Am I right Peter ? Mark is clearly already part Robot, no contest.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxpiLA1zq4Ppu5p15x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwpF6LjhH6AaLtygLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxv-FyaK1TeJBc_Zyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyrWA4M48esA6RUaYx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxN-7k2nynL9jwlXTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyTEur-qpjJSebMDYZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUn6_JiOkMTNsshOZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgygmrgwFTrGVpNuC9l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwH482PKgkFoJItJbx4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzdJDFxnFWmy3Vokt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
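The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of the look-up-by-ID step, assuming the raw model output has been captured as a string (the two entries below are reused from the array above):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgxpiLA1zq4Ppu5p15x4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwpF6LjhH6AaLtygLd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so any coded comment can be
# retrieved in constant time.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgxpiLA1zq4Ppu5p15x4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

In practice the response string would come from the stored model output rather than a literal, but the parse-then-index pattern is the same.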