Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If we lived in a different kind of society, with different values, this could be a wonderful thing and lead to cures for diseases, the eradication of hunger, housing for all, more efficient energy generation, etc etc. But the fact is, we don't live in a society that is capable of, or willing to, leverage the benefits of this technology for all. Like everything else, it will be commoditized, such that the rich and powerful get more so, and the rest of us - well, that's the real stickler. They've used us for cheap labor for millennia, they've used us to build for them, clean for them, cook, and so on. But now that we are nearing a time when all of those things can be accomplished via embodied AI, what happens to us? One would like to think they would institute UBI so that we could maintain housing, food, etc and be able to explore other pursuits. But given the fact that today's conservatives see ANY kind of assistance to anyone as a "handout," I don't see something like that passing any kind of governing body, at least not in the US. So what you end up with is something like the movie "Elysium" where we all live in the slums here on Earth, trying to scratch out a living, and all the billionaires use their private space fleets to build a space station where only the wealthy can live, away from the "useless eaters." I mean, think about it - SpaceX, Blue Origins - they're already progressing towards something like that. Thing is, the mistake they made, is that while they might be smarter, or at least think they are, than the rest of us, they are NOT smarter than ASI. And it's coming much sooner than they think, just as Prof Hinton is saying. I think 2026 is the year, and it's going to be very interesting indeed to see what happens.
youtube
AI Governance
2025-12-29T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx66WrdmfLGVxBU85V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxu2HhELUeMkNFCsfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzrjsLTLjuoywfMwxx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzP0QfLplV0nG-Ksn54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy3_vO045cltWSuiTh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2IDr5ka6OKgf8JvN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx5VysxoH4Wlrfi93p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzSty7CjtRmDjEUFKZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxSwdIO_yyzS3Jy5y94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy1sJGVJtuSJCb3syF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
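The "Coding Result" table above is simply the entry from this raw JSON array whose `id` matches the inspected comment, with the ID field dropped. A minimal sketch of that lookup, assuming only the array-of-objects structure shown in the response (the helper function name and the truncated `raw_response` sample are illustrative, not the tool's actual code):

```python
import json

# Hypothetical one-row sample mirroring the structure of the raw response above.
raw_response = """[
  {"id": "ytc_Ugy1sJGVJtuSJCb3syF4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "resignation"}
]"""

def coding_for(comment_id, raw):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw):
        if row["id"] == comment_id:
            # Drop the ID so only the four coded dimensions remain.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(coding_for("ytc_Ugy1sJGVJtuSJCb3syF4AaABAg", raw_response))
# → {'responsibility': 'distributed', 'reasoning': 'consequentialist',
#    'policy': 'regulate', 'emotion': 'resignation'}
```

In practice the model output would first need to be validated (e.g. checking that every expected dimension key is present and its value is in the allowed vocabulary) before being rendered into a result table.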