Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwRA8O73…`: "Takes time to learn a craft. AI bros are just looking for the easy way out. Eve…"
- `ytr_UgwXgNPVk…`: "Just be careful. The code generated with ai is pretty bad, and other output make…"
- `ytc_Ugx91653O…`: "It appears the demons and AI demonic technology is getting smarter and smarter e…"
- `ytc_Ugwk7ivAm…`: "the best way to introduce an ai artist to real art is to spawn a pencil in his t…"
- `ytc_UgwOCGwVg…`: "Dr Cellini, Perhaps A.I. should take over Reading Imagery, ESPECIALLY considerin…"
- `ytc_UgxIrKTpg…`: "You can make money with AI art by selling it on Etsy, but you can't copyright it…"
- `ytc_UgxpezO-v…`: "Here is the rate limiting step. If no one has a job, there is no one to purchase…"
- `ytc_Ugzfh0ose…`: "honest opinion but the only Ai that those people should be using is grammarly be…"
Comment
AGI "super intelligence" is a Malthusian cargo cult for rich people and BS guys like Musk and Altman say to keep the investor cash flowing. I doubt it's even possible.
The most persuasive argument against even task oriented AI take over is the technical limitations such as model collapse, energy usage, compute cost, cooling, etc. These AI servers have to be built near large bodies of water alongside data centers which means there is also limited real estate. On top of all that, these AI companies are all unprofitable and mostly operating on venture capital. So all of the AI modes we have now may not even exist by 2030.
For those reasons I doubt AI will be able to scale to the point of replacing all jobs unless these limitations can be overcome somehow (unlikely).
youtube · AI Governance · 2025-09-07T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMNW2VBUFDmViFMzN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMHNRwK-9U8CZkzIN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybBiUtM8tpF8dY7J54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx1AZSEp_kaUrIyInJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0g-oPyS5JhWvOYS54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwVCVEFCL7u9c_PRN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwspPR1sa-7wAgncBJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyIoYbfYpeyWhq8e8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5mQ_GVBEJjLvFRCZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyHevOSGB793ljhxtF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
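The raw response is a JSON array with one object per coded comment, carrying the same four dimensions as the Coding Result table (responsibility, reasoning, policy, emotion). The "look up by comment ID" step can be sketched as below; this is a minimal illustration, not the tool's actual implementation, and the `lookup_coding` helper name is hypothetical.

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response (JSON array of coded comments) and
    return the coded dimensions for one comment ID, or None if absent."""
    rows = json.loads(raw_response)
    for row in rows:
        if row.get("id") == comment_id:
            # Drop the ID so only the coded dimensions remain.
            return {k: v for k, v in row.items() if k != "id"}
    return None

# Single-row example taken from the raw response above.
raw = '''[
  {"id":"ytc_Ugw5mQ_GVBEJjLvFRCZ4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

print(lookup_coding(raw, "ytc_Ugw5mQ_GVBEJjLvFRCZ4AaABAg"))
# {'responsibility': 'company', 'reasoning': 'deontological',
#  'policy': 'regulate', 'emotion': 'outrage'}
```

An unknown ID returns `None`, which is how a "comment not found" case could surface in the lookup widget.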