Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews, listed by comment ID)

- ytc_Ugw3jPbsq…: "I love my uneducated people." ... Neither Chatgpt nor Gemini nor George Orwell …
- ytr_UgyM0JWF_…: This is why there will be nothing designers can do. The people paying for the wo…
- ytc_Ugx7BzHAh…: Unfortunately not a realistic approach. - Yes there is still time for AI to whol…
- ytc_UgyPU2rCS…: Personally I adore the usage of ai as a personal tool. Many say this, but they d…
- ytc_Ugx8WSlR1…: Just the other day there was a 100x engineer's Insta page.... they were saying that some…
- ytc_Ugy3zFJny…: i cant help but noticed you didnt adress what an ai model actually does, which r…
- ytc_UgyHsVK9P…: One good thing about driverless trucks is that you can't be charged with assault…
- ytc_UgxR0oetz…: The data isn't biased the data is factual.. You can't be angry that it shows cer…
Comment
I think a lot of these AI doomsday talks reflect a very Western fear. People have gotten so used to sitting behind screens, eating food someone else grew, relying on systems for everything, that the idea of losing “jobs” feels like the end of life itself.
But humanity has always had a fallback: the land. Farming, growing food, working with our own hands — these are not new inventions, they’re the oldest survival skills we have. Civilizations have collapsed before and people went back to the basics. Harder? Yes. Extinction? No.
The real danger is not that “machines will wipe us out,” but that we’ve forgotten how to live without machines. A simpler life is still possible, if we choose to remember it.
youtube · AI Governance · 2025-09-24T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyJ-03TV5ouysbDied4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugx4FFSo3xZVhVeQdxh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyWYH2mhT3EWB1r29p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJoMaoTjrGAxgNLQB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMGGTmZqm6yGxpPxt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWnvraP-gwZLrM5ZN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4jKbb4XN8QZZ6ZXl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlC1GUZe6FMiaOOBV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxSUZgHLDy0ISO_t0h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz76BnOvpA87SkgHNh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
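The raw response above is a JSON array of per-comment codes, which a pipeline presumably parses and validates before producing rows like the Coding Result table. A minimal sketch of such a step, assuming the allowed values inferred from the examples shown here (the real codebook may include categories not visible in this sample):

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above — an assumption, not the project's actual codebook.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "mixed", "unclear"},
}

def parse_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response into {comment_id: codes}, dropping
    any record whose values fall outside the expected codebook."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        codes = {dim: rec.get(dim) for dim in SCHEMA}
        if cid and all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[cid] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"distributed","reasoning":"mixed",'
       '"policy":"none","emotion":"resignation"}]')
print(parse_response(raw)["ytc_abc"]["emotion"])  # resignation
```

Validating against a fixed value set also catches the common failure mode where the model invents a label outside the codebook, rather than silently storing it.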