Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Three questions: (1) Will wealth from AI automation all go to just a few billionaires who pay very little tax? (2) If that happens, how will governments have enough tax income to function? And (3) if no one has jobs, who will buy the products these AI automations produce? The only way I see civilisation surviving this is if governments tax AI automation heavily and use the funds to provide everyone in a country a universal basic income. But what if they don't?! The other way this could go is massive inequality, starvation, and conflict due to civil uprising.

Source: youtube · AI Jobs · 2026-02-28T20:4… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
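A coded record like the one above can be sanity-checked against the codebook before it is stored. The category sets below are inferred from the values visible in the responses on this page; the actual codebook may define more values. A minimal sketch:

```python
# Codebook categories inferred from the coded responses shown on this page;
# the real codebook may include additional values.
CODEBOOK = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate(record):
    """Return (dimension, value) pairs whose value is not in the codebook."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# The coding result from the table above.
coded = {"responsibility": "government", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → [] (all four dimensions are valid codebook values)
```

An empty list means every dimension carries a recognised value; anything else flags a record the model coded outside the codebook.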
Raw LLM Response
```json
[
{"id":"ytc_UgydUG-YtP0lPYCKBIJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzsLC2M7jGD_1KMtzZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxFkgR-TTPQIdkJ9fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzudQ5WKaEtbvptGN14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx9W-SOSv4DymqjGr54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzhTwDGd_-WIB9DP214AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzXD3i-n1hFyc6rVd94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNs7Id4NbASNDNXv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgzYyFf_bXYHah9NFaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxgqlr8kO5O3r4YzGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
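Because the model returns one JSON array covering a whole batch, recovering the coding for a single comment is a parse-and-index step. A minimal sketch, using one record taken verbatim from the response above:

```python
import json

# One record from the raw LLM response shown above (the full response
# is an array of such objects, one per comment in the batch).
raw_response = '''[
{"id":"ytc_UgzYyFf_bXYHah9NFaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the batch by comment ID for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

coding = records["ytc_UgzYyFf_bXYHah9NFaV4AaABAg"]
print(coding["emotion"])  # → fear
```

This is the record that produced the Coding Result table above: government / consequentialist / regulate / fear.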