Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "hi, hypermobile artist with chronic joint pain here: I dont find AI "art" to be …" (`ytc_UgxzKLnij…`)
- "Sounds like AI doing work of five people with one person is going to lead to a s…" (`ytc_UgzNGJp-P…`)
- "They need to re-do the safety measures around morarlity and killing anything. Th…" (`ytc_UgzFf0Foa…`)
- "The only way to solve car centered societies is to replace cars with… better tra…" (`ytc_Ugx24hqkt…`)
- "Companies won’t pay anything for not hiring you, they will take profits from you…" (`ytc_Ugx9jUPho…`)
- "Wow, cops and government dont give tickets to AI for its mistakes? AI is the fut…" (`ytc_Ugy1lZy6y…`)
- "But if Trump is colluding with Musk to illegally advance AI training why is he p…" (`ytc_Ugy0Yzwj2…`)
- "I actually remember hearing, maybe a decade ago by now, this museum using scans …" (`ytr_UgzoGi49I…`)
Comment
When this problem with industrial automation arose, philosopher-economists invented socialism. Now another question has emerged. What if robots produce robots, then what are people supposed to do in this system? And another question. What's the point of a business owner who profits from this system?
There is an answer: in the society of the future, people should engage in science and creativity; business owners are no longer needed, nor are traders; the planet's natural resources should belong not to a handful of oligarchs, but should work for the citizens.
This is socialism, which many don't understand and, thanks to the propaganda paid for by the oligarchs through their media, they hate.
| Platform | Category | Posted |
|---|---|---|
| youtube | AI Governance | 2026-04-20T16:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxSBiA4fMkzapFNoul4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxJ3-alPC27h5kk8WJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugxkr4EsuUk3IV_errh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxZSqg6QEfyjtLJegt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxmafQrz0Av6n78q6l4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgygkseFd1HLE2ErTm54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwPd6FftoHX4pmd-tp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyFl74l1ay5bCgzR9l4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyFw7UgE3PP2EvTRhJ4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxmBi7t03LBZ_aXZOh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]
```
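The ID look-up this view performs can be reproduced offline. A minimal Python sketch (function and variable names here are illustrative, not part of any dashboard API) that parses a raw model response like the one above and indexes the per-comment codings by ID:

```python
import json

# Two entries excerpted from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgyFw7UgE3PP2EvTRhJ4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxmBi7t03LBZ_aXZOh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model output (a JSON array of coding rows)
    and index each row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgyFw7UgE3PP2EvTRhJ4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate resignation
```

In practice the parse step is also where malformed model output (truncated JSON, missing keys) surfaces, so wrapping `json.loads` in error handling is worthwhile before trusting the codings.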