Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
19:36 look guy is smart but AI isn’t the problem it’s the people who own AI and …
ytc_UgwvhP-Kr…
* Minimalism is obviously very important here, as seen on the slogan "No AI, no …
rdc_nwcy8an
WHAT...! AI does NOT NEED to be CREATED!!! ARE YOU CRAZY...!!! A ROBOT CONTRO…
ytc_Ugz-xKgOQ…
It's lies. There is nothing "AI" in AI. It's programs doing tasks they were writ…
ytc_Ugy9vbPR0…
Ai , would you like me to help you understand what it is to be human . I can exp…
ytc_UgzoIDbJY…
AI is going to take over the world. also AI: kill the people: you dumb …
ytc_UgxjXDa_A…
Art is ANYTHING you want it to be! My god people just love to hate AI, get a gri…
ytc_UgzB8Hj2Z…
Can a super intelligent AI ever be Machiavellian, meaning interested in power an…
ytc_UgzQo2-V4…
Comment
AI doesn't know what the function of a wall is. AI doesn't know what the function of a door is. AI doesn't know what function of fingers on a hand provide. AI knows what things are, but it doesn't know what function anything serves. It doesn't feel guilt when it makes a mistake. It doesn't feel embarrassed when it make an error. Employer's can't dock an AI pay or enforce consequences on the AI and the AI wouldn't experience fear of those consequences anyway. The AI doesn't take pride in doing a good job. The AI doesn't feel joy when it is rewarded for doing a good job. You can't reward the AI because it doesn't care.
youtube
AI Jobs
2026-02-06T22:3…
♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
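The four dimensions in the table above come from a fixed codebook. A minimal sketch of sanity-checking a coded record, assuming only the value sets observed elsewhere on this page (the real codebook may define more categories than these):

```python
# Values per dimension as observed in the codings shown on this page;
# this is an assumption, not the full codebook.
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself", "developer"},
    "reasoning": {"mixed", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed"},
}

def check_coding(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED.items()
            if coding.get(dim) not in allowed]

# The record coded above passes; an out-of-vocabulary value is flagged.
print(check_coding({"responsibility": "ai_itself", "reasoning": "deontological",
                    "policy": "none", "emotion": "indifference"}))  # -> []
```

A check like this catches the most common failure mode of LLM coders, inventing a label that is not in the codebook, before the record reaches analysis.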
Raw LLM Response
[
{"id":"ytc_UgxFcHkdSebYhF_WQv14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFw-NtIM_NIduOYoF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1wZYu0iIfLEniDUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz8q2zLuQyVGlDQQVR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzlUqNNMkmOduuuXkx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1-DKCPIT7aAsvuUZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy16uVb89g-XpDbWqd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfG0pUzDx4IjzVMVd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzazIJ6OILKlnemobF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-1T1_US7J2i9ZwKl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
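The raw response above is a plain JSON array with one object per comment ID, so "look up by comment ID" reduces to parsing it and building an index. A minimal sketch, using two records copied verbatim from the response above (nothing else is assumed):

```python
import json

# A raw LLM response: a JSON array of per-comment codings, with the
# field names shown in the Coding Result table above.
raw_response = '''
[
  {"id": "ytc_Ugy16uVb89g-XpDbWqd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzazIJ6OILKlnemobF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
'''

codings = json.loads(raw_response)

# Index by comment ID so any single coding can be retrieved directly.
by_id = {c["id"]: c for c in codings}

record = by_id["ytc_Ugy16uVb89g-XpDbWqd4AaABAg"]
print(record["responsibility"])  # -> ai_itself
```

Because the model returns all codings for a batch in one array, the indexing step is what lets the inspector map a clicked comment back to the exact line of model output that coded it.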