Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There’s something deeply contradictory about the modern world: the system needs humans to exist, yet it constantly works to get rid of them. It demands that we be productive, efficient, and stable — and once we are, it invents tools that make us unnecessary. Every technological breakthrough seems like a collective victory, but in truth, it’s a victory of the system over humanity itself.

Artificial intelligence, robotics, automation — they don’t emerge from nowhere. They are built from human labor, human thought, creativity, and experience. But once the machine learns, the human who taught it becomes disposable. It’s almost poetic, if it weren’t so cruel: the perfect student replacing the teacher.

And here lies the core of the problem: The system depends on human beings to function, yet it’s racing toward a point where it no longer needs them — where robots take over, never sleeping, never demanding, never getting paid. The result isn’t a technological paradise, but a dystopia of inequality, where a small elite lives surrounded by synthetic nature and automated comfort, while the majority are left outside the walls.

They say AI will free humanity from work. But free us for what, if the system refuses to share the benefits of that freedom? Freedom without access to life’s essentials isn’t freedom — it’s exclusion. And that’s already happening: mass layoffs, concentrated wealth, entire professions erased.

In the end, the system is digging its own grave. Because by eliminating the human, it erases its own purpose. An economy without consumers, a society without meaning, a machine running on inertia — an organism devouring itself until nothing remains. Perhaps the true challenge of our age isn’t to build a smarter AI, but to redefine what it means to be human in a world where everything can be replaced.
youtube AI Jobs 2025-10-09T15:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       deontological
Policy          unclear
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyMa3s7kBK9En6wAdJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzqD3W6ojh1Y4xjh9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyvNfnX1vmDh4_gsjl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxaG5Jpv_OPwpxKyQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw2qFr6FYccCXhjZqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzi7dQNS65aImL4Z_d4AaABAg","responsibility":"system","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzAyW0Xps6qpMjGp9R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7D2oN8Lu3PEjma2F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4gi6-QJMZMRWw3eZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwG3FLXI7fStOs4DSN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
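As a minimal sketch (not part of the coding tool itself), the raw response above can be parsed and sanity-checked with a few lines of Python. The set of required fields is taken from the records shown here; the allowed values per dimension may be broader in the tool's actual codebook, so treat this as an illustrative check only:

```python
import json
from collections import Counter

# Two records copied from the raw LLM response above (truncated sample).
raw = '''[
  {"id": "ytc_UgyMa3s7kBK9En6wAdJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzi7dQNS65aImL4Z_d4AaABAg", "responsibility": "system",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Every record must carry the comment id plus all four coding dimensions,
# as seen in the response format above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}
for record in codes:
    missing = REQUIRED - record.keys()
    assert not missing, f"{record.get('id')}: missing fields {missing}"

# Tally one dimension across the batch, e.g. the emotion codes.
emotions = Counter(r["emotion"] for r in codes)
print(dict(emotions))
```

Running this on the full ten-record array instead of the two-record sample would give the per-dimension distributions for the whole batch; the same loop also catches a malformed model response (missing fields) before it reaches the results table.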