Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Honestly, between the brain rust from AI and the lives it ruins in the present... I don't care how good the AI could be, we are all better off if the technology is destroyed and forgotten. In a time of mental health crisis, we are further pushing towards higher isolation levels when we need MORE COMMUNITY. Productive efficiency is worthless when you crush people. People are meant to work. Work is good. Not to the extreme levels required to survive now, but AI gives me less and less reasons to use it every single day despite others encouraging me to use it. UBI simply is unlikely to work due to the reality being that companies are busy trying to squeeze out the people and the government is and has been failing to due ANYTHING about it for years. There is no way the top people will agree to give their money away, as AI is a tool for greed and profit. I don't mean to appear as a conspiracy theorist, but those at the top are spending more rather than fixing the current debt. AI will only increase mental health problems whether through lack of job opportunities, inflation, AI usage, etc. People were already struggling enough as it is, now we throw in a "tool" that messes with your brain's ability to think for itself? It just seems counter productive to be spending so much on a product that INCREASES the severity of a significant health problem in a time of great mental struggle.
YouTube · AI Jobs · 2026-01-02T02:0… · ♥ 2
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugw5goZXzu--LaifmGx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzJAE_pkszJaNiDhCV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwrziOXjOIXCL037jR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxGr11Y_jn3OQATgVF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzvuv_XQ6FrpZx0dK94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxTP0m5p5lUv6DZXwt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzu58v7ke4d5kxUmvV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwtbAlNbZ0j_c-sIWh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyeY_0dQMqQ_EKiNAR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw1RQe0AmAG05VYKDV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
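The raw response is a JSON array with one record per comment in the batch, each carrying an id plus the four coding dimensions. A minimal sketch of how such a response could be parsed and validated is below; the allowed labels are inferred only from the values that appear in this particular response, not from any documented schema, and the `parse_codings` helper is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the labels visible in this
# response; the real coding scheme may define additional values.
SCHEMA = {
    "responsibility": {"ai_itself", "none", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear", "regulate"},
    "emotion": {"outrage", "approval", "mixed", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    records with missing dimensions or labels outside the schema."""
    codings = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        codings[cid] = {dim: rec[dim] for dim in SCHEMA}
    return codings

# Example: the first record from the response above.
raw = ('[{"id":"ytc_Ugw5goZXzu--LaifmGx4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"outrage"}]')
print(parse_codings(raw)["ytc_Ugw5goZXzu--LaifmGx4AaABAg"]["policy"])  # ban
```

Validating every record before storing it means a malformed or off-schema label fails loudly at coding time rather than silently corrupting downstream counts.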