Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Let's assume that what some claim is true—that most of the jobs currently performed by humans can be done more efficiently by AI.
If this is what happens in the near future, it means there will be millions, or perhaps billions, of people without a job. If most people no longer have an income, who will buy the goods that companies so willingly sell us?
Will people without an income passively accept starvation? And if no one can buy anything anymore, what's the point of producing anything at all?
In short, either people will be paid to do nothing, or trade will simply cease to exist.
But isn't that what our capitalist society is based on?
Frankly, I find this all rather funny.
Why? Because it seems to me that human society is run by complete idiots who don't know what they're doing.
youtube · AI Governance · 2025-12-09T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztS0q8_1H6nvUNjLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGTEpfVBbzTPfVp_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQrQ96usVKwd9P00p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzi0cA4lSkk8w_dZox4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTsObAhGsVY2Dk_IJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpIrXnQ63fSjI-RtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgztULkQzdrjx6GpvZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyPTmGp6f8gH9Md3lZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu44HtihKP4y8xWRN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
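The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (the variable names are illustrative, not part of the pipeline; the data is an excerpt of the array above):

```python
import json

# Raw model output: a JSON array of per-comment codings (excerpt from above).
raw_response = """
[
  {"id": "ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwQrQ96usVKwd9P00p4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
"""

# Index codings by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwQrQ96usVKwd9P00p4AaABAg"]
print(coding["responsibility"])  # distributed
```

This mirrors the "inspect the exact model output for any coded comment" view: the table shown for a comment is just its row from this indexed array.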