Raw LLM Responses
Inspect the exact model output for any coded comment; entries are looked up by comment ID.

Random samples
- "The day AI can do my job it will be, NEVER in a million years.…" (ytc_Ugy2z29Mw…)
- "That’s preprogrammed. In scripts we can create failure scripts. If the AI code c…" (ytc_Ugy3OL67-…)
- "Honestly, this wouldn't be a problem if the ai art people commissioned artists/p…" (ytc_UgzzpfmKt…)
- "AI art is not art, You're not making anything, You're just asking a computer to …" (ytr_Ugwt6AEko…)
- "I feel he’s trying to relieve himself of guilt by saying all these things now. H…" (ytc_UgxeRjmkj…)
- "The beginning of an android A.I like commander Data from star trek how cool is t…" (ytc_Ugwbj1Djx…)
- "if i were the president of a country that has deepfake issues first of all i wou…" (ytc_UgwFVbqlm…)
- ""AI is the biggest threat to humanity". So is it Humanity vs AGI (we would lose)…" (ytr_Ugyv5XTkm…)
Comment (youtube, 2026-02-24T17:1…)

> Pretty much the only thing going through big tech's head is "ai might be able to replace humans thus saving tons of expenses, and the first one to sell a consistently effective AI model to cover all human tasks for other companies will become the riched man on earth,"
> thus they're all speedrunning it with crazy large datacenters, blatant propaghanda-like hype pushes, and more all at the cost of the average joes of the world.
> They all want to basically rule the world by being the one in charge of the AI that becomes in charge of everything else in the world
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
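Each coded row assigns one value per dimension. A minimal validation sketch follows; the allowed value sets below are inferred only from the values visible in this section, not from a confirmed codebook, so treat them as assumptions:

```python
# Assumed per-dimension value sets, inferred from the codings visible in
# this section; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(row: dict) -> list:
    """Return a list of problems with one coded row (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding shown in the table above passes; an empty row fails on all
# four dimensions.
coding = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "outrage"}
print(validate(coding))  # → []
```

Checking rows this way catches both out-of-codebook values and missing dimensions before they reach downstream analysis.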
Raw LLM Response

```json
[
{"id":"ytc_UgyIW6-rUtJ-5sMQV094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwHHHR0CQ6D16epj5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz_jNRk3rIv0ko_VgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxC9ZBYM4XHd8luqZx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKwq3h7N1O3dKnOaV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwlZxWiCN70PllPsy14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyIlCZE4RVe3DEdh3R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxKE7akxzQdplO3Ol94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQl1QZQmX5lR8UasR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4VUAxtk1VDDuvk5h4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
```
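The lookup-by-comment-ID flow above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codings) and index it by `id`. The field names match the raw response shown above; the sample payload here is abbreviated from it.

```python
import json

# Two rows abbreviated from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgxC9ZBYM4XHd8luqZx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw4VUAxtk1VDDuvk5h4AaABAg", "responsibility": "government",
   "reasoning": "unclear", "policy": "regulate", "emotion": "approval"}
]'''

# Index the array by comment ID so any coded comment can be inspected
# in O(1) rather than scanned for on every lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if unknown."""
    return codings[comment_id]

print(lookup("ytc_UgxC9ZBYM4XHd8luqZx4AaABAg")["policy"])  # → regulate
```

In practice the same dict can be built once per coding batch and reused for every inspection request.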