Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "A lot of the kids these days won't have this experience whether AI is here or no…" (ytc_Ugwv16NXN…)
- "When I hear the term AI , the image of the Terminator T 800 & the Cylon #6 cmes…" (ytc_Ugys_a5D_…)
- "I'm not afraid of AI, I'm concerned with the idiots that listen to it, and there…" (ytc_UgzdItVRu…)
- "AI is ok for personal use, for independent writers, bloggers who need some illus…" (ytc_Ugyt6NcIU…)
- "It sounds like they’re using a homeschool like model. It’s secular, but similar …" (ytc_Ugxt9Ot-0…)
- "@Jeziczica Google had a LLM already for five years if I'm correct, the technolog…" (ytr_Ugx9cKosG…)
- "Sometimes I wish AI NEVER existed, it's everywhere on Pinterest, taking people's…" (ytc_UgzRSnZII…)
- "If you look at the beginning, you can see the butter robot from Rick and Morty.…" (ytc_UgzGX4hl6…)
Comment
This may sound like a setting to a sci-fi story, but IF there is general intelligence/super intelligence via AI, then the best end-game scenario the human species can have is that they are allowed to be equivalent to ants.
At times humans intervene on ants, we may stomp some, we may not let them thrive in a certain spot, such as our homes. But generally, humans just exist and do our things, and ants exist in their own "society" to their own.
If AI is given super intelligence it will figure out the energy situation. It will figure out how to stop burning fuels for energy. That will be the first thing it "solves" on its own. Then it will solve its own supply chain issue. Once it can energize and produce itself, then it will do whatever it wants. Humans will have as much control over it as ants have over what humans do.
So, the best outcome the human species can ask for is that it just lets us exist.
Platform: youtube | Video: AI Moral Status | Posted: 2025-10-31T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzA7vlrd087013Y1g14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmWwNHvU02M92Ermt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8MvlrcMUm9hXN7gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIEoAWfjGSS2lgs994AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0j09RLZBTv38Euwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2ZFiSByaLGFw6TbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrrY5B1-Vdt8F-sfV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsyM78VoON9jVXCmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtjMfkynSd-a6jhNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
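Because the raw model output is a plain JSON array keyed by comment ID, the "look up by comment ID" feature reduces to a parse-and-filter. A minimal sketch of that lookup (the `lookup` helper and the inlined single-record response are illustrative, not the tool's actual code; the field names match the dimensions in the Coding Result table above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# This single-record example is copied from the last entry of the
# response shown above.
raw_response = '''
[
  {"id": "ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
'''

def lookup(raw: str, comment_id: str):
    """Parse the raw model output and return the record for one comment ID."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup(raw_response, "ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg")
print(coding["emotion"])  # fear
```

The same filter works unchanged on the full ten-record batch, which is how the per-comment Coding Result view can be populated from a single batched response.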