Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or pick one of the random samples below.
- We just need to use Kalu Yala as a blueprint and apply the same thinking to soci… (rdc_eh4kd8f)
- AI doesn't stop people making art. AI stops people making money with their art. … (ytr_Ugz5zD7MJ…)
- AI is taking over my job and now I am using AI to try to get a living out of it.… (ytc_Ugzhq9UZE…)
- I think the most logical answer to that question is yes. Is not that the AI has … (rdc_ktupkpq)
- As an artist with a disability who is also a programmer, saying that AI is for d… (ytc_UgwajqN7g…)
- 8:14 time mark: "How can people ... identify the real from the fake in the worl… (ytc_UgxULXsp9…)
- While the population increases, more and more jobs are lost to automation. IMO a… (rdc_j6fc9n6)
- If your kid needed literal AI just to talk to someone AND not you as their paren… (ytc_UgwU-yMtG…)
Comment
It seems to me that there is a strong disconnect from reality here. The assumption is that people are broadly ready to move beyond their basic needs and strive for some kind of “lofty” or abstract goals. But this applies to a very limited group of people. The majority, frankly, would prefer a simple, comfortable existence, unburdened by high ambitions or complex meanings.
Therefore, in the confrontation between AI and human intelligence, the main risk is not that AI will physically destroy humanity. It is far more likely that it will render humanity unnecessary and irrelevant. Not because people will disappear, but because the meaning they create and carry will cease to be needed or valued.
youtube · AI Governance · 2025-12-14T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
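
Each row of this table corresponds to one field of the coded record the model returns for a comment. Below is a minimal sketch of that record as a Python dataclass; the class name and the value lists in the comments are assumptions drawn only from the labels visible in this sample, not from a full codebook.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, matching the dimensions in the table above.

    The example values in the comments are only the labels observed in
    this sample; the actual codebook may define more.
    """
    id: str              # platform comment ID, e.g. "ytc_..." or "rdc_..."
    responsibility: str  # who is held responsible: "none", "user", "ai_itself", "unclear", ...
    reasoning: str       # moral reasoning style: "consequentialist", "deontological", "unclear", ...
    policy: str          # policy stance: "ban", "unclear", ...
    emotion: str         # dominant emotion: "fear", "resignation", "outrage", "indifference", ...
```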
Raw LLM Response
[
{"id":"ytc_UgyhfLwoZdDegIiXZmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzac2vfqBChAz5Z22d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy6qIocFihJ7o4PreB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2-hAvZejL7ZaPP4V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYKeDir91jGBS_9pl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSfYU_z3Phxy_3EVN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxC_FpFCVoRln9uYdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyRiC3XiXGyrumATnF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzkRZK1XEk-UfueSG54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMZmErOeMygkP5tRZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
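
Because the model returns a plain JSON array, looking up one coded comment is a simple parse-and-filter. The sketch below is illustrative only: the `lookup_coding` helper is hypothetical, and the embedded sample array is just two entries copied from the response above (the resignation entry matches the Coding Result table, though mapping it to the displayed comment is an assumption).

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of coded comments) and
    return the entry whose "id" matches comment_id, or None."""
    entries = json.loads(raw_response)
    return next((e for e in entries if e.get("id") == comment_id), None)

# Two entries copied from the raw response above, standing in for the full array.
raw_response = '''[
  {"id":"ytc_UgyRiC3XiXGyrumATnF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxC_FpFCVoRln9uYdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

coding = lookup_coding(raw_response, "ytc_UgyRiC3XiXGyrumATnF4AaABAg")
if coding is not None:
    for dimension in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dimension}: {coding[dimension]}")
```

Running the snippet prints the same four dimension values shown in the Coding Result table.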