Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
ytc_Ugyvli20W… — AI is great. If it wipes out humanity along the way, that's just a bonus.…
ytr_UgyzoAtFQ… — Corporations: *despite thr outcry of Twitter artists, still using AI as an art t…
ytc_Ugxr10jRP… — I have a love/hate relationship with ai I think it it were trained on data espec…
ytc_UgxZOCYY_… — What was as impressive as chatgpt was how articulate and fluent with words alex …
ytr_UgyTA8dE0… — Is art innately human when we "make it"? When someone takes a photo of nature, d…
rdc_o7oo3ld — Why do you think Hegseth keeps demanding that Anthropic remove its safety barrie…
ytc_UgyiA1Auk… — It seems that AI is really in your head in that place that has no light......ins…
ytc_UgxIcG10u… — Why don't they use all of this AI stuff to figure out cold-fusion? You know... T…
Comment
I’ve heard so many TED Talks about AI art from artists, and honestly the arguments feel weak and biased. They fail to identify the real issue. One major problem right now is AI companies training their datasets without permission. Is it true that training on someone’s artwork requires licensing? Then let’s resolve it properly. How? Through the right forums—courts, open debates, proper institutions.
Because in reality, artists also learn from other artists. The difference is just that we’re less advanced and learn in smaller volumes than AI does. I think humans shouldn’t obsess over any artwork or output that exists in the digital realm. Humans naturally belong in the physical world—performing, interacting, looking each other in the eye.
Source: youtube · Viral AI Reaction · 2025-11-19T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzTCqveFEB229smLs94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXclyAv53DhP2YUCR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzRuyXY5MtVGBd-brt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxtVDFXIipoqflYjQ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxjSnZOoUdmavv31BZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzusAWlNTnAPmvLV6x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzLm8NiQHBSONq22V94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxMa0UfW3RJ5B31MrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2Zt2Ljrn4l2BnzB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwJEShoEL-3U3DIKHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
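The raw response above is a JSON array with one object per comment ID, each carrying the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked before storing it; the allowed value sets below are assumptions inferred only from the values visible on this page, not the tool's actual coding scheme:

```python
import json

# Values observed in this batch; the real coding scheme may include more.
# These sets are assumptions drawn from the displayed output above.
OBSERVED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed", "deontological"},
    "policy": {"none", "liability", "unclear", "regulate", "ban"},
    "emotion": {"approval", "outrage", "mixed", "fear", "indifference"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag rows with unexpected values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Hypothetical one-row batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"mixed"}]')
rows = parse_response(raw)
print(rows[0]["policy"])  # regulate
```

A check like this catches the most common failure mode of batch coding with an LLM: a row where the model drifts off the label vocabulary, which would otherwise silently pollute the coded table.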