Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's more one single issue, which encapsulates both those things. To simplify, a…" (ytc_UgyY0N4XE…)
- "I bet when AI starts replacing Congressmen the rules just might start changing..…" (ytc_Ugz9mnxNa…)
- "When AI and robots do everything, governments around the world will require peop…" (ytc_UgxeK_COO…)
- "i was using so much of my brain power trying to figure out what the hell nanowri…" (ytc_UgwPA5-Gp…)
- "I have a feeling this is where people are headed when they depend too much on AI…" (ytc_Ugxo8sPSC…)
- "Has anyone here actually used \"AI\"? As far as I can tell it sounds very plausibl…" (ytc_UgwgJiMeH…)
- "Ai developers and creators are the true pure evil at the heart of this. They are…" (ytc_UgxQqCGHd…)
- "First thing AI will want to do is perfect space travel so it can leave earth and…" (ytc_Ugy-Iefv0…)
Comment
In 1952, Kurt Vonnegut published a book called "Player Piano." In that book, AI took over all the jobs, but the society understood that there would be a need for providing a generous government safety net that could provide health care, housing, and other things for all the people who have ended up unimployed. Today's billionaire bros do not care about softening the blow to ordinary people. Some people in the book got to be the small number of people who still had management jobs, others got to be robotics technicians tending to the equipment in the factories, but there was still a problem with bored, unemployed people seeking the dignity of doing skilled work. It is interesting that Vonnegut understood this situation so many years ago.
youtube
AI Governance
2025-06-24T19:1…
♥ 687
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgysReljx0YFfDJUoeV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwkJmmJDU4BeEggFF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzHt7X_0KILg4qZgJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLmN9pdddU6_CuVlZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxocueAugDnK793kOV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzy3B9eb2TUoTUTDHp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgygavX_03nF2-E1nhB4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTZyttwUgAZCDduF94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwhdlHLxVfFaP8rq5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySUwC4oqolJpKVeHV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}]
```
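The raw response above is a JSON array with one coding record per comment ID, which is what makes the "Look up by comment ID" view possible. A minimal sketch (Python standard library only; the two-record `raw` string below is a shortened excerpt of the array above, used purely for illustration) of parsing such a response and indexing it by comment ID:

```python
import json

# Shortened excerpt of a raw LLM response: a JSON array of coding records.
raw = """[
  {"id": "ytc_UgysReljx0YFfDJUoeV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxwkJmmJDU4BeEggFF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

# Parse the array and build an index keyed by comment ID for O(1) lookup.
records = json.loads(raw)
by_id = {record["id"]: record for record in records}

# Look up one coded comment by its ID and read a single dimension.
coding = by_id["ytc_UgxwkJmmJDU4BeEggFF4AaABAg"]
print(coding["policy"])  # prints "liability"
```

The same index pattern scales to a full batch: parse each raw response once, merge the per-comment records into one dictionary, and every coded dimension is available by comment ID.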