Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up directly by comment ID, or drawn from the random samples below.
Random samples — click to inspect
- "We should check to see how well AI closes all your stab wounds after exploring y…" (ytc_UgxMKRCGg…)
- "For everyone who's stressing about AI taking your job. I urge you to go watch so…" (ytc_UgwpTQBhx…)
- "I only believe in ai as a tool to assist or make references... Not to be the pai…" (ytc_Ugy3NhfqB…)
- "@jakiro749 If it's still around in 10 years, absolutely. AI music will still be…" (ytr_Ugz4ldr_N…)
- "They are downloading real human consciousness into these robots yeah the robot c…" (ytc_Ugx5go9iF…)
- "I disagree on ai not killing jobs. In my 20 years in automation the things is se…" (ytc_UgxrICcDn…)
- "People need to learn how to invest. AI will take over because humans are not abl…" (ytc_UgyZIufyI…)
- "Yeah, it is not about AI, is about humans and what they will do with it. We are …" (ytc_UgwJG9WvS…)
Comment
It's funny how people are rushing to create super-intelligent systems yet don't seem to understand anything about economics. AI will not create abundance, it will create wealth for those who own the systems, and massive global poverty. Economics is all about demand and supply. If AI supposedly does everything, who will ever need to hire anyone? What will happen to the workers and laborers that were replaced by AI? There will be no demand for hired human workers. This means people will lose their source of income. People will not be able to feed their families, and so there will be massive starvation and poverty. People always marvel at new things, AI should not be one of them.
Source: youtube · Topic: AI Governance · Posted: 2025-08-05T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwybKaa44ASv_3scfJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwyvxRkfhEbVZgN-0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyeX0PHz4_pToeLsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz4QWD1_5wSvPyHOz54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwweEVXyikQt7EkvnF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy9kxgssC7uRpmsQHx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz8WlaqcvYAj7bkDKx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxNBVDmbExrztIZMO54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLP982bcAr0uk7Gqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx7GvMoRoRMVgDtRFZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
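The raw response above is a JSON array with one record per coded comment, keyed by comment ID and carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch response could be parsed and validated (the category sets and function names here are assumptions, inferred only from the values visible on this page, not taken from the tool's actual codebase):

```python
import json

# Allowed values per coding dimension. These sets are an assumption,
# reconstructed from the labels that appear in the sample output above.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting out-of-schema values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record from the response above, used as a self-contained example.
raw = ('[{"id":"ytc_UgwyeX0PHz4_pToeLsN4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwyeX0PHz4_pToeLsN4AaABAg"]["policy"])  # prints: regulate
```

Validating against a fixed schema at parse time is what makes the "unclear" fallback categories useful: any model output outside the agreed label set fails loudly instead of silently entering the coded dataset.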