Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Let’s be real. Certain jobs won’t be replaced for quite a while. Banking, insurance, bureaucracy—they’re structured, predictable, and AI can take them over step by step. But the “real workers”—the ones who build roofs, fix boilers, clean hotel rooms, keep sanitation running—these roles will hold strong for decades. Machines can install a pipe or vacuum a floor, but they can’t feel the nuance, the intuition built from years of hands-on experience. They can’t climb a wet roof, sense a problem in a heating system, or adjust a room to make a guest feel welcome. There’s a human element that no algorithm can replicate.

And let’s not forget the human reaction. There will come a time when people reject interacting with machines. They’ll want a voice, a look in the eye, a handshake. The future isn’t machines replacing humans everywhere. It’s machines working quietly in the background while humans stay front stage—where trust, empathy, and skill still rule. The ones who master this balance—the craft, the human touch, and the smart use of technology—will always have an edge.

And there’s another angle we can’t ignore: how society handles this shift. Companies that replace humans with machines should face real consequences. Let’s say a 50% tax on each robot—money that goes to paying regular people or funding social programs. It’s not about punishing innovation. It’s about making owners think twice before choosing efficiency over human livelihoods. Automation is inevitable, but who benefits shouldn’t be left to chance. The funds could support wages, retraining, or public services. Machines working in the background is fine—but society must ensure humans remain front and center where it counts.
Source: youtube · AI Governance · 2025-09-04T12:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw_t8IXKiRnVGgT5xp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxsLSW9Z2UdAE0Uaux4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw8_Uanke3rr5DwAyJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyj_jKcV5pE7YxP8tp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwOIDHDuExV6A9Zdxx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw7F-VkrM8Y3_Ajnt54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz74H3-lzEPDuV1evV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxbv_mLaaTWzc5AZpd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzcaL5raiDn6bLlzBJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwNGotJ_sJbOYxd0oV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
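A minimal sketch of how a raw response in this shape could be parsed and validated before the per-comment codings are stored. The field names ("id", "responsibility", "reasoning", "policy", "emotion") come from the response above; the allowed label sets are an assumption inferred only from the values it happens to contain, and may be incomplete.

```python
import json
from collections import Counter

# Allowed labels per coding dimension. NOTE: these sets are an assumption
# reconstructed from the sample response above, not a confirmed codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "resignation",
                "approval", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array of coded comments) and keep
    only rows whose labels fall inside the expected category sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

# Example with one well-formed row and one row using an unknown label.
raw = """[
  {"id": "ytc_a", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_b", "responsibility": "robots", "reasoning": "mixed",
   "policy": "none", "emotion": "fear"}
]"""
rows = parse_codings(raw)
print(len(rows))                                  # the invalid row is dropped
print(Counter(r["emotion"] for r in rows))
```

Validating each dimension against a closed label set catches the common failure mode of free-text LLM coders, i.e. inventing categories outside the codebook.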