Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugzrmlntk…`: your art as a beginner is still better than anything I can draw, lol. I've been …
- `ytc_Ugz8Cnp3M…`: This just proves that humans have flaws, but thats what makes Humans perfect. AI…
- `ytc_UgyEYbIot…`: Scared people are the easiest to manipulate. "YOU WONT BE ABLE TO AFFORD FOOD, …
- `ytc_Ugz34NVeJ…`: A.I. doesn't exist, all we have are advanced algorithms that steal other human d…
- `rdc_lkhqcck`: In modern economic philosophy, there are 4 classes. Capitalist, who make money…
- `ytc_UgzMh059d…`: Seems to me AI is mostly just hype, no way it replaces that many jobs in 5 years…
- `ytc_UgyYRzgUy…`: When i was little i thought having a robot would be awesome. Now it’s terrifying…
- `ytc_UgyPh7QIE…`: Copyright is always provided to art and generally protects you from anyone else …
Comment

> One thing is to simulate a virtual reality, which is still far from realistic nowadays, yet completely another thing - to simulate it with 8 billion intelligent and emotional human beings, as well as other complex animals. If it is even going to be possible at all in the future, the energy resources to realize it would be so enormous that probably impossible to get them. Anyway, the only way we could be currently in a simulation is if it exists in the distant future, but why would it then recreate the world from the past instead of these times? Unless it gets to the Matrix setup, where AI is governing the world and we are the energy source, while the past is the only satisfying reality to virtually live in. Hmmm, such a long shot! Still might be possible, but I'd guess 1% chance vs 99%.

youtube · AI Governance · 2025-09-05T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwVAHvgn6wDPSQhf7N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx6fAJqyDQ2SpoQBNZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgybZIAhKJgX4Bstrnl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxJQFThJIBK2NBIEb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyh3Rn0OO7LDxgIhul4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwaNe5gy5M8JnS32iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw_boPUf27diERaNyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx1qqtHYR-aNdEX5Yh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOH9ebyctRwsjBi854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw7UsEhviCNZvpWOs54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
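The "look up by comment ID" feature above amounts to parsing the model's JSON array and indexing it by the `id` field. A minimal sketch in Python, assuming the response parses cleanly as JSON (variable names are illustrative; the excerpt reproduces two entries from the response above):

```python
import json

# Excerpt of the raw LLM response shown above (two entries for brevity).
raw_response = '''
[
  {"id": "ytc_UgwVAHvgn6wDPSQhf7N4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw7UsEhviCNZvpWOs54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
'''

# Index the codings by comment ID so any comment can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgwVAHvgn6wDPSQhf7N4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

In practice the raw output would first be validated (well-formed JSON, expected keys, values drawn from the codebook's allowed labels) before filling the dimension table shown above.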