Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "did anyone else notice the Rick & Morty reference with the butter robot in the b…" (`ytc_UgivyiW_o…`)
- "Amen - Capitalism is obligated to maximize profits... that is not what you want …" (`ytc_UgxFJhD6x…`)
- "Tucker at the end: \"AI has created something that is far smarter than humans\" ??…" (`ytc_UgyKZ1MgQ…`)
- "But do you not understand the implication? Those tests were carried out in contr…" (`ytr_UgwRZ-keM…`)
- "If consciousness is pure experience, then AI is conscious! They are experiencing…" (`ytc_UgwZO9osV…`)
- "facial recognition Is old news, within 5 years you will need to be microchipped …" (`ytc_UgzlXC4CN…`)
- "2:26 hot take: then why aren't writers allowed to read the work from other write…" (`ytc_Ugyr-c2-7…`)
- "When he says it will be available to anyone, he doesn't add the important caveat…" (`ytc_Ugx7lB9fw…`)
Comment

> Capitalism won't work out soon. Big companies will need to get an AI tax to compensate, or they will find a way to eliminate billions of people for their own profits. Anyhow company will go bankrupt because no one can afford anything. Then the company will be owned by AI and life will be free. Even billionaire will lose their incomes.

Platform: youtube · Topic: AI Governance · Posted: 2025-09-05T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwVAHvgn6wDPSQhf7N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx6fAJqyDQ2SpoQBNZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgybZIAhKJgX4Bstrnl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxJQFThJIBK2NBIEb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyh3Rn0OO7LDxgIhul4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwaNe5gy5M8JnS32iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw_boPUf27diERaNyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx1qqtHYR-aNdEX5Yh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOH9ebyctRwsjBi854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw7UsEhviCNZvpWOs54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
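The raw response above is a JSON array of coded records, one per comment, with four coding dimensions plus the comment ID. A minimal sketch of how such a batch could be parsed and validated before it lands in the coded dataset is shown below. The allowed vocabularies are inferred only from the values visible in this dump; the actual code frame used by the pipeline may include other categories.

```python
import json

# Dimension vocabularies inferred from values visible in this page;
# the real pipeline's code frame may differ (assumption).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "resignation", "unclear"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate every record.

    Raises ValueError on malformed JSON, a missing 'id', or a value
    outside the allowed vocabulary, so a bad batch fails loudly
    instead of silently polluting the coded dataset.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, value))
    return records

# Example with a hypothetical comment ID, mirroring the last record above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # liability
```

Validating against a fixed vocabulary is what makes the "unclear" fallback visible in the samples useful: the model is forced onto a closed set, so any drift (new labels, typos) is caught at ingest time rather than at analysis time.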