Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The train has left the station. The ship has sailed. The powers that be will ens…" (ytc_Ugx0DGcaY…)
- "I think you answered your own question but just don't want to see it. You are aw…" (ytc_Ugypenv1m…)
- "Who ever implements AI's morals and ethics , if any controls AI. That in and of…" (ytc_UgzvpT8yB…)
- "I envision AI creating teams of extremely well done Erica Marsh's (it was a Twit…" (ytc_UgyVhHAXs…)
- "Guys just a thing AI bubble will burst. Cause AI centers need crazy amount of…" (ytc_UgwqBJif8…)
- "So many things I could argue with in this interview, and as a primary, the notio…" (ytc_UgwtApoo-…)
- "Of course, kinda like the humans who make decisions about nuclear weapons. Only…" (ytr_UgzScrKT1…)
- "I have just discovered you taking on the AI artists in this crisis.....that is m…" (ytc_UgxGedozI…)
Comment
The accelerating adoption of AI strengthens the case for a comprehensive Universal Basic Income (UBI) system—one that goes far beyond traditional welfare. As companies increasingly automate roles, they’ll still depend on a population of consumers with purchasing power. Without income, demand collapses. To sustain their customer base and remain viable, these companies may need to contribute more in taxes, effectively funding the very system that keeps their markets alive.
It’s unrealistic to expect displaced professionals—like accountants—to simply pivot en masse into trades like plumbing. The transition requires thoughtful policy, not just optimism.
Source: youtube | Topic: AI Governance | Posted: 2025-08-21T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzECTmDC0uCQIvfSMN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxqYVlDZRIIRiKYu9F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzEhHKHL-W96bQ7MZZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgylviX44APXZRHOiSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzeqReFlSoX1UsIQXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwdJxGXLBp5ZqHvgwd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugywdnj3BMoqFr1PX2F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_Ugx2OXDFYxtw_WS8C_d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlbqcD0zxuRQFKlKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxH5HPSJuUqa1mZE414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
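The raw response is a JSON array of per-comment codes along the four dimensions shown in the result table. A minimal sketch of how such output might be parsed and sanity-checked before use is below; the allowed category values are inferred from the samples on this page, not from a formal codebook, and the function name is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "disapproval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
codes = parse_codes(raw)
print(len(codes))  # the single well-formed row passes validation
```

Dropping invalid rows (rather than raising) reflects the reality that LLM coders occasionally emit categories outside the scheme; those comments can then be re-queued for recoding.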