Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What a crock of fearmongering shite. So-called "AI" is still a hypothetical concept. What we have now, is machine learning and large language models. Essentially, the tools that are supposedly going to displace us all, are a glorified version of the predictive text that already exists whenever you type something on your phone. It's that, but companies have speculatively invested billions of dollars into this technology being the next big thing that replaces us all. Companies which, might I add, are already in billions of dollars worth of debt over this speculative AI push--are going to somehow win at capitalism by pushing infuriating AI chatbots and phone assistants onto a populace which is increasingly dissatisfied and pessimistic with what's already been presented? These worthless LLMs, and their investors' FEELINGS about them, are going to supercede the interests of those that they claim to service? Yeah right. If you genuinely believe that human jobs can be successfully replaced by a digital word-guessing machine, and that a population of vindictive, starving, and enraged humans armed to the teeth are going to helplessly stand by as the food becomes unaffordable and our means of surviving are eliminated with ZERO pushback, that tells me that you understand nothing about the nature of either humans OR computer programs. This whole video is created by useless corporate fart-huffer software, and it shows.
Source: YouTube · Video: Viral AI Reaction · Posted: 2025-12-04T04:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzXaro_eWEitlE2e1Z4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwAKVMOk1ZFgYygQcJ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzff53kFGrHPyPYD9N4AaABAg", "responsibility": "company",   "reasoning": "unclear",          "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwDpkb1npRHdxjyFI94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyKcwUIjfDwZ2zHTP54AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxab3YhxvsPG9DIkw14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwDJ-9Q6TqKJ2fLAmF4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyUFzM7WVIHEaM58-94AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwFbI3udAQ0xsVcGml4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz976H2IybsCb4F7-V4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
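To inspect the record behind a single coded comment, the raw response can be parsed and indexed by comment id. The sketch below is a minimal illustration, assuming the dimension names and the value sets observed in this export (an actual codebook may allow more values); `parse_raw_response` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Value sets observed in this export; an assumption, not an exhaustive schema.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "fear", "mixed", "approval",
                "resignation", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: record},
    rejecting any record whose dimension values fall outside ALLOWED."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the response above, used as a usage example.
raw = ('[{"id":"ytc_UgwFbI3udAQ0xsVcGml4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"none","emotion":"outrage"}]')
record = parse_raw_response(raw)["ytc_UgwFbI3udAQ0xsVcGml4AaABAg"]
print(record["emotion"])  # prints "outrage"
```

Indexing by id makes it cheap to cross-check the per-comment table above against the raw model output, which is useful when the displayed coding and the raw JSON disagree.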