Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect a random sample:

- `ytc_Ugybj3F3E…` — About five decades ago there was a book knocking around called: "The Superintell…
- `ytc_UgyoxvhE2…` — Man, AI sucks. I work in 3D animation/vfx industry and the genertative AI techno…
- `ytc_Ugz_eiq4-…` — What happens when Ai decided religion isn't good for humanity, do people still g…
- `ytc_UgyvONssA…` — I regularly use ChatGPT for creative tasks like Marketing (as a marketing profes…
- `ytr_Ugxny95gQ…` — All of the tech folks know that humanity is in the way of them making this happe…
- `ytc_UgyhUj1i-…` — Its probably becuz the man ate vegetables which confuse the robot, thats why i d…
- `ytc_Ugxb6-Wmx…` — I cannot care for this admin, but Federal regulation of AI makes way more sense …
- `ytr_UgyC2hFP5…` — I get where you're coming from! Robots can sometimes feel a bit uncanny, especia…
Comment
I don’t believe the idea of UBI coming into fruition. I don’t think they’d wanna feed the poor. If they did want to, they would’ve done that already with the abundance of resources we have in our planet. But they didn’t because it doesn’t help them in any way, and they’re fooling the common people with the idea of UBI. And UBI isn’t going to benefit the ultra rich owning these AI companies in any possible way. So, they’ll mostly leave us to die since we’ll be of no importance. Ironically, capitalism was keeping us alive only to serve them.
| Metadata | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-09-05T10:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugw8hKLJPGGP_Myw1aN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgybiPdcYYhpXAa77ul4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwiNxXjY47U4wn8dAF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgysccWnz_elznaeQjN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgwK_1Xi-4tyU7MHfGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugxjj2XivKZQeLmBSSJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxcDh9FgV95-ZGEdJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxNvGgzW5JY0hBnlr14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyvSbPTB-YXM_Jo3ER4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxJUXLvsXyxD0C5f1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]
```
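A raw batch response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch, not the app's actual loader: the `SCHEMA` sets below are inferred only from the values visible in this sample, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension -- inferred from the sample above,
# NOT the authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "unclear", "user", "ai_itself", "government", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "mixed", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the coded records by comment ID."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Look up the comment shown in the detail view above.
raw = ('[{"id":"ytc_UgxJUXLvsXyxD0C5f1x4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgxJUXLvsXyxD0C5f1x4AaABAg"]["policy"])  # liability
```

Validating against the schema before indexing catches the common failure mode of batch coding: the model emitting a label outside the codebook, which would otherwise slip silently into the results.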