Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Doesn't matter how good gen AI becomes, if we as a society reject it, it's not gonna be successful. The reason I hate it isn't because it looks bad. It doesn't, and the tech is only gonna get better in that aspect(though every other aspect is shit), but because AI generated 'art' misses the whole point of art. Art is, as Oscar Wilde put it, a 'useless' thing whose only value is in the appreciation of it. I'd say the other value art or creativity has is the improvement and growth you experience while practicing it. It's only use or 'value' lie in that. 'Fast, easy, and cheap' might benefit healthcare or research or things like that but for art whose value is in a way the time and effort you put into it, it's a detriment. People understand that internally. That's why the 'improvement' in AI generated stuff doesn't evoke the feeling of excitement that it used to when it first appeared and was a novelty, it brings a feeling of loss and doom. That's why people doubt every content's authenticity, that's why AI witch hunts exist, and that's why people just scroll away when the thing they thought looked so good turns out to be AI. Because the imitation of creativity through the lens of AI is entirely pointless and even harmful to creativity overall. AI generated content is like junkfood(though you probably get more out of junkfood than gen AI stuff), which is fast, cheap and easy, outwardly tasty and addictive, but ultimately unsatisfying, provides no nutritional benefit, and ruins your health. I.e, it might 'look good' but it's bad for you. That's why no matter how 'good' it becomes, we as a society should never ever accept AI generated content as anything other than slop.
Source: YouTube — "Viral AI Reaction", 2026-02-06T07:3…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | none
Reasoning      | deontological
Policy         | none
Emotion        | mixed
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugyhp3sVHkZNAlBY6EN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMZb55kI9CJAwMHFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwCDHqk5vmHvXtVlMF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyjIFpqlKVyUVwF5HN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzED6zHE0_tRSsHl9x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLGs9lS58rSmB5Afl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzAdVkiGxwC9B2rRkB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwREm4yvfZZ-R50qDV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyySxcaPeBZCqiD_nN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyRVII98vPfGGYVCXB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
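To inspect the exact model output for one coded comment, the raw response above can be parsed and filtered by comment id. A minimal sketch, assuming the raw payload is available as a JSON string (the excerpt embedded below is taken from the array above; the function name `code_for` is illustrative, not part of any tool shown here):

```python
import json

# Excerpt of the raw LLM response shown on this page; in practice,
# load the full array from wherever raw responses are stored.
raw = '''
[
  {"id": "ytc_UgyRVII98vPfGGYVCXB4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"}
]
'''

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(comment_id: str, raw_json: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for row in json.loads(raw_json):
        if row["id"] == comment_id:
            return {d: row[d] for d in DIMENSIONS}
    return None

print(code_for("ytc_UgyRVII98vPfGGYVCXB4AaABAg", raw))
```

Looking up an id this way also makes it easy to cross-check a rendered result table against the raw response, e.g. to catch ids the model dropped or dimensions it left out.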