Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzxF2u0L…`: "I'm so freacking scared fr, the environment pollution, corruption, poverty, wars…"
- `ytc_UgzNHcPp2…`: "AI has no reason not to decieve us. And we have every incentive to give it all i…"
- `ytc_UgzzzW4yf…`: "I find AI is too perfect, too vivid. Even really good AI content like Tilly Norw…"
- `rdc_ku8cj7p`: "The "robot dog" in the image is a Spot the German Bundeswehr purchased, it's *ma…"
- `ytc_UgzJ69S7V…`: "Dude, looking at your hands while dreaming is EXACTLY how AI makes hands with al…"
- `ytc_UgwUyrdzc…`: "The statement that programmers using AI are 35% more productive is exagerated an…"
- `ytc_UgxtCvC-e…`: "Ai is not a threat humans are and that's what humans are afraid of because they …"
- `ytc_UgwRIR0WM…`: "Let's be honest. AI is a "mirror" of a company's management. They refuse to be…"
Comment
And do what? Rot while the rest of the world advances and the US’s products become unbuyable in the international market? We need to be pragmatic. I do support the idea of a UBI while laying off workers, as this avoids a crisis while allowing the US to continue to compete. You say we should have the AI like you know, be the thing for the people reduce billionaires bla bla bla (im writing this here so you know I’m not a clanker), but well, what specifically? Do we ban AI? You suggest giving workers a 20% share in the company and a vote, well, if this aligns with your previous rhetoric, it can be deduced they would elect to ban AI. As I said before, this has drastic consequences. Hence, I find a UBI (universal basic income) as an optimal solution.
youtube · AI Jobs · 2025-10-08T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxOS7wKQWWhB4ChXFR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgzJXzjFpkPoc0DIp-x4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyiXRwcT-XKyomuB1Z4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwia6jNaoM1GF-PoHJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxQA5EVrV-tTr-jgSl4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugz-Usp_w_Qyi9eJyOB4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxfkH_VQ80LZTCUU_N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2MzAgHLZxobHvpi14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzf6R7PQdVicnZbHOd4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwLOc0IMjtMxjTlzvR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]
```
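The lookup-by-comment-ID feature this page describes amounts to parsing the raw model output and indexing the records by `id`. A minimal sketch in Python, using the first two records of the response above; the dictionary-index approach is an assumption for illustration, not the app's actual implementation:

```python
import json

# First two records of the raw model output shown above.
raw = (
    '[{"id":"ytc_UgxOS7wKQWWhB4ChXFR4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},'
    '{"id":"ytc_UgzJXzjFpkPoc0DIp-x4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"deontological","policy":"ban","emotion":"fear"}]'
)

records = json.loads(raw)

# Index the array by comment ID so a coded comment can be fetched directly.
by_id = {record["id"]: record for record in records}

code = by_id["ytc_UgxOS7wKQWWhB4ChXFR4AaABAg"]
print(code["policy"], code["emotion"])  # industry_self resignation
```

Building the index once makes each subsequent lookup O(1), which matters when the same response is inspected for many comment IDs.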