Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyARMDRJ…: "Those problems created by AI (too much time on your hands) can be solved by AI..…"
- ytc_Ugy-0OOXD…: "2:24 Curious about where its words come from? Google. disconnect that service an…"
- ytc_Ugw16U_Xd…: "In my experience AI is currently at the Dunning-Kruger level of intelligence. I …"
- ytc_UgwWuwclA…: "Imagine 1000’s of vehicles that ‘turn’ on humans & FLOOR it 100mph + . No brakes…"
- ytc_UgzW0HP9j…: "The reason why they are like that what you see is because humans have created th…"
- ytc_Ugz8irGap…: "Virtually every conversation with experts about AI focuses on the risks and how …"
- ytc_UgwCOPu7F…: "AI has leapt over the "barrier to entry" this year. I think that is what scares…"
- ytc_UgxAaot-N…: "Why couldn't Gemini find cheap places where my friends and I can hang out. While…"
Comment

> The real issue is energy, it's highly likely there isn't enough energy on the planet to run AGI. ChatGPT alone took enough power to run an entire suburb for a whole year just to train it's 4th model. Human-Level AGI would require the entire United States energy infastructure to maintain itself. A human brain only requires around 20 watts sustained, it's actually cheaper to use humans to do our thinking. And that's assuming constant power, if the grid goes down the AGI dies. Humans can hit starvation, fall into a coma, clinically cease to exist and still come back with full memory and cognition.

youtube · AI Governance · 2025-06-24T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzggaTBHzHZbffaHBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8gYeWsViy1EkLlYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3yJi8bMVXGtxoQLd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxioJ6OWIryOkvsEvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyip9cpna_ev3nrny14AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyVDTQ_Co8759GfV5J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI1SJdGhu8RAb7j0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBVjtBRhd2mO__Zf54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUUyw7VGem5NU3G_V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBBtMaEtkBGk9zHrZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
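The raw response above is a JSON array with one record per coded comment, keyed by comment `id` and carrying the four coding dimensions shown in the result table. A minimal sketch of how such a payload could be parsed, validated, and indexed for the "look up by comment ID" view follows. The `VOCAB` value sets are inferred from the values observed in this batch only; the real codebook may define additional categories.

```python
import json

# One record from the raw LLM response above, used as sample input.
raw = '''[
  {"id": "ytc_UgxBVjtBRhd2mO__Zf54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Dimension vocabularies inferred from the values observed in this batch;
# this is an assumption, not the tool's actual codebook.
VOCAB = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "mixed", "outrage",
                "resignation", "approval"},
}

def index_codes(payload: str) -> dict:
    """Parse a raw response and index records with valid codes by comment ID."""
    by_id = {}
    for rec in json.loads(payload):
        # Keep only records whose every dimension has a known value.
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgxBVjtBRhd2mO__Zf54AaABAg"]["emotion"])  # resignation
```

Indexing by `id` makes the per-comment lookup a constant-time dictionary access; records with out-of-vocabulary values are silently dropped here, though a production pipeline would more likely log them for review.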