# Raw LLM Responses

Inspect the exact model output for any coded comment.
Random samples:

- "Machines don't make decisions. AI does, that's why all jobs will be gone in near…" (ytc_UgzSrRT0Y…)
- "Truth is who is training AI? Is it learning from humans and human history? What …" (ytc_Ugz8CI30S…)
- "This algorithm is probably just a pilot program being tested on a more local lev…" (ytc_Ugw-qKHY0…)
- "I think the only reason it’s a problem is because the “artists” didn’t draw them…" (ytc_UgyraPifJ…)
- "Robot rights: Robots are not allowed to harm humans in any way, directly or indi…" (ytc_UgybYziq2…)
- "I think a good giveaway to how AI stans think about art is taht they keep referi…" (ytc_Ugx7iOkVR…)
- "I still have not heard a SINGLE good argument why Fanart isn't treated like AI a…" (ytc_Ugyi6XRnA…)
- "Look there's regulations on toothpaste so if someone thinks that there shouldn't…" (ytc_UgzP6J3uf…)
## Comment

> What that means is your niece isn't needed. Once AI can do the job of healthcare workers people will be obsolete and companies can hike the cost so that only the elite have access. This is a very rose glasses view, but that is not reality. Humans will only become more evil. Humans program, until AI becomes self sustaining. Then humans are not needed. In fact humans are the threat. Not only to AI but to the earth at large. Extermination will have to begin if AI is more intelligent than humans.

youtube · Cross-Cultural · 2026-02-06T18:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
{"id":"ytc_UgzKfpzMmHQo9tLg0pF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzM8dDiIWsB7qL-29h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxpcDdAPyQzmfOmFf94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx8OYPBU1n7B1TRADZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcKJs0wMGUoVXmb2x4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgWWZIFfdhwNBArkl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNE41cBURNwJ4Vlt54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzDtO1am7tYKzm0HPt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyj8jEMNLCHWGFcAUt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOo4679r4C8loVMol4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
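The raw response is a JSON array of per-comment coding records, so a single comment's result can be recovered by matching on `id`. A minimal Python sketch, assuming exactly this array-of-objects shape; `coding_for` is a hypothetical helper, not part of the pipeline, and the two embedded records are copied from the response above:

```python
import json

# A small excerpt of a batched coding response (same shape as above).
RAW_RESPONSE = """
[
 {"id":"ytc_UgzNE41cBURNwJ4Vlt54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzKfpzMmHQo9tLg0pF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
"""

def coding_for(raw: str, comment_id: str):
    """Parse a batched coding response and return the record for one comment ID,
    or None if the batch does not contain that ID."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

result = coding_for(RAW_RESPONSE, "ytc_UgzNE41cBURNwJ4Vlt54AaABAg")
print(result["policy"])  # → liability
```

This is how the Coding Result table above relates to the raw output: the displayed dimensions are the fields of the single record whose `id` matches the inspected comment.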