Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I just watched this video on how AI is evolving beyond our control. And honestly? I wasn’t shocked. This is exactly where we’re heading, not just with AI, but as a society. People are scared of AI taking over. I’m not. I’ve lived my whole life feeling like I don’t belong anywhere; born stateless, treated like a ghost, told I have no place, no nation, no worth. So when people panic about machines becoming more powerful than us, I can’t help but think: maybe it’s time something else holds up a mirror to our so-called "civilization." Religious hatred, political greed, people fighting over land, resources, flags… all while others are left floating in the middle of the ocean with nothing. The privileged are investing billions into AI, hoping it will secure their legacy, while millions like me are just trying to exist. Stateless people don’t fear AI because we already live on the edge of systems that never saw us as human to begin with. When machines start to “decide,” maybe the world will finally see what it feels like to be stripped of your label, your rights, your humanity, just like we have been since birth. So many are obsessed with ownership: this is mine, that belongs to us, this is our land. But when AI flips the table and wipes everything clean, what will those words even mean? I didn’t choose to be stateless. I didn’t choose to be seen as a problem. But here I am, still watching, still waiting, still asking: where is the humanity in all of this? If this is what “human progress” looks like, I’d rather see it crash and reset. Because right now, survival feels more like a curse than a right.
Source: YouTube · Video: AI Governance · Posted: 2025-05-28T06:2… · ♥ 82
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyklfb41m2nof6-0_h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVnPJKXVkn-6IBDzd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx-wDZSBrL4NBv3Qal4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxY9SSZTqAsYZqcq_54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwZy4omdJyGK4wWoQp4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzNgE8hAuvek3vx7Qd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxA0jngfdfTsvlDgj14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz3GX4lsvO3vKbO2jV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz2-fPAlhrfZubzey54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzxYcUctVH_hFECX0x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
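The raw response is a plain JSON array, one object per coded comment, keyed by the comment id. A minimal Python sketch of how such a response can be parsed and indexed (the field names match the JSON keys shown above; the truncated single-element `raw` string here is illustrative, not the full response):

```python
import json

# Illustrative one-record payload with the same schema as the response above.
raw = (
    '[{"id": "ytc_Ugyklfb41m2nof6-0_h4AaABAg", '
    '"responsibility": "none", "reasoning": "unclear", '
    '"policy": "none", "emotion": "indifference"}]'
)

records = json.loads(raw)

# Index codings by comment id so one comment's labels can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugyklfb41m2nof6-0_h4AaABAg"]
print(coding["emotion"])  # indifference
```

Indexing by `id` is what lets the dashboard join each coding back to its comment; the five coding dimensions (responsibility, reasoning, policy, emotion, plus the id) are exactly the keys in each object.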