Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- Ai studio ghibli copies do not looke perfect! Please!!! Rest assured, human arti… (ytc_UgxwiCzGM…)
- Once invented by humans, AI robots could actually evolve just like other animals… (ytc_UgiYKjtUo…)
- At 12:40 he stops Sophia, when It starts telling about the coding being worked o… (ytc_Ugwiy00wk…)
- Ah, these good old delusions of Penrose 🤣🤣🤣 - AI will never attain consciousnes… (ytc_Ugx9nSQil…)
- Just limit AI's available environment and we are safe. i.e. Dolphins are likely … (ytc_UgxvPFxcU…)
- Okay, AI is coming to replace all our jobs. What I don’t understand is if compan… (ytc_UgwZCKZ4v…)
- Oh good thing the US is in the Paris Climate Accord, that’ll make everything bet… (rdc_gx5qgzb)
- I think of it through 4 points. 1: This is why we have a surge in energy product… (ytr_UgwbKz_6E…)
Comment
But why should this be inevitable? If the critical mass of people worldwide would refuse to use this technology and would boycott the companies working on superintelligence, the industry would not invest a single Dollar into the further development of AI. No further investments from shareholders /the industry would make it impossible for states to create a superintelligence. Such a common approach would also avoid the upcoming financial crisis once the AI bubble collapses (its just a matter of time). Why are humans so stupid and rather willing to put the survival of our entire species at risk for personal wealth? If this isn't the core of intellectual stupidity, I don't know what is.
youtube · AI Governance · 2026-04-10T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
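Each coded comment carries four categorical dimensions plus a coding timestamp. As a minimal sketch, the result above could be held in a record like the following (field names mirror the table; the class itself is illustrative, not the tool's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class CodingResult:
    """One coded comment: four categorical dimensions plus a timestamp."""
    comment_id: str
    responsibility: str  # e.g. "company", "developer", "distributed", "none"
    reasoning: str       # e.g. "contractualist", "consequentialist", "virtue"
    policy: str          # e.g. "ban", "liability", "none"
    emotion: str         # e.g. "outrage", "fear", "mixed"
    coded_at: datetime

# The coding result shown in the table above, as a record.
result = CodingResult(
    comment_id="ytc_UgzJSGmP4YZXEEpw_zx4AaABAg",
    responsibility="company",
    reasoning="contractualist",
    policy="ban",
    emotion="outrage",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
print(result.policy)  # → ban
```

A frozen dataclass keeps coded records immutable once parsed, which is useful when the same record is shown both in the summary table and in the raw-response view.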
Raw LLM Response
```json
[
{"id":"ytc_Ugzr5AISxs1E6AYuwjZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJSGmP4YZXEEpw_zx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzXOeF7l4qD9JIFC0d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZheIhL5ZJ9I1-ljJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVmZy7c9ga4ZLVIn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSIFmqUmiuJKsrbhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyrx5fJ0inUihpRtx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwkh8sp0lU9jR6Y4zx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz7YUl3BDUWIwEsERt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_WJCgeaUoyZlOBJR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
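The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before indexing by ID (the allowed value sets below are inferred from the responses shown on this page, not from an official codebook, so treat them as an assumption):

```python
import json

# Allowed values per dimension, inferred from observed responses (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "contractualist", "virtue"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "fear", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index valid records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the batch above, as a worked example.
raw = ('[{"id":"ytc_UgzJSGmP4YZXEEpw_zx4AaABAg","responsibility":"company",'
       '"reasoning":"contractualist","policy":"ban","emotion":"outrage"}]')
batch = parse_batch(raw)
print(batch["ytc_UgzJSGmP4YZXEEpw_zx4AaABAg"]["policy"])  # → ban
```

Rejecting a whole record on any out-of-vocabulary value is the conservative choice here: a malformed LLM output is flagged for re-coding rather than silently stored with an unknown category.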