Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've said this before and I'll say it again. AI is antithetical to the human experience. Everything, and I mean everything that humans are even mildly interested in doing can be considered as an art form. Cooking? That's art. Driving? Art. The way people apply any kind of knowledge? Art. We can find art even in the way one's mind works in planning and executing household chores. What happens when everything is automated? Everyone knows the saying, "money can't buy happiness", and everyone knows that's just not true. What it should say, or most likely what the person who said it meant to say was, "money can't buy fulfillment". True happiness can only be found in fulfillment, and no one would feel fulfilled if everything was automated and everyone lived in perpetual entertainment. Tell you what though, those who can't afford even a bit of entertainment would benefit a lot. They'd think that life couldn't be any better. Ignorance is bliss. So if AI takes over, we'll only really have two choices. Live ignorant and happy, or live miserably with the knowledge that you'll never be truly happy again. There's surviving, there's living, and now AI gives us the other end of the spectrum. I don't want us to go there, just as I don't want humans to just survive for their whole life.
youtube · Viral AI Reaction · 2025-08-24T04:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          ban
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugxl6-NpYiZC7vXWQUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxVklNrRkr5aCfhzox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxTLx_8mgmDhm2NQgd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy4WnSXeHq6g7-Wldp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwMPZc-qlzXdcxNg-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwkGGJrU6dPpJuqu2l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzwvy400W7L8OcJoht4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4TqOSjUFuV6WFq7t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxEYc-nfdvCA80fV3B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz1uk8scy3zUxBOSmF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
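The raw response above is a JSON array with one object per coded comment, keyed by comment id. A minimal sketch of how such a batch response could be parsed and a single comment's codes looked up (the id and field values are taken from the response shown; the parsing approach itself is an assumption, not the tool's actual code):

```python
import json

# A one-entry excerpt of the raw LLM response shown above, for illustration.
raw = '''[
  {"id": "ytc_UgwkGGJrU6dPpJuqu2l4AaABAg",
   "responsibility": "none",
   "reasoning": "deontological",
   "policy": "ban",
   "emotion": "outrage"}
]'''

# Index the array by comment id so any coded comment can be inspected directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment displayed in this section.
code = codes["ytc_UgwkGGJrU6dPpJuqu2l4AaABAg"]
print(code["reasoning"], code["policy"], code["emotion"])
# → deontological ban outrage
```

Indexing by id rather than list position makes the lookup robust if the model returns the objects in a different order than the comments were submitted.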