Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- ytc_Ugyhk4B2k… : I love the "stealing argument". "AI stole art of others" they say. Let's see : n…
- ytc_UgyjfjwTd… : We need to ensure AI is policed by White people. Rather than Israelis or Chines…
- ytc_UgyH9gTJX… : There is not enough research on this and how we need policies and regulations on…
- ytc_Ugyfaovwy… : At least AI police will not violate basic human rights. Unlike the police today,…
- ytc_UgzUDOkJL… : People use artworks of others to train themselves all the time. And it's not ill…
- ytc_UgyJI5ZNo… : AI is just a scapegoat. The real question is what if we outsource all our jobs o…
- ytc_Ugy7uXax1… : The only thing you can do now is with AI. Is to use it to your advantage. Adjust…
- ytr_UgxeOeRl8… : @malekith6522 Well, with mathematics; At some point it gets so complicated and …
Comment
It's so ironic AI wants to shoot itself in the foot considering every "Free Speech" argument is built on : Search engine searching cannot alter your personality beyond what you are, as a legal adult, able to cope with. How is stripping, scraping, and building a "talking machine" that spits Google Search and reddit back to you somehow altering people's innate personalities? AI psychosis is just regular psychosis. Apparently they don't even catch when people are manically googling how to murder people...
youtube | AI Moral Status | 2026-01-31T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxE5O_6IPYeLiKzglN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwpoSnEXR6W1SDyLvl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxlYGZ7EraHkhkGAAZ4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgygYISJbpBWVYyGRVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi9xQZ9jvC5uwC9qx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmQRKeKDjKB7_TVFV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxjEqsug_i7fLNH8Td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzki8yUHirHgts5jQt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwz3jKaa5FhkrhQg4R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwSbwzClGucP85hD5l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]