Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Eventually cops and prosecutors will be using AI to make deepfake to lock up inn…" (ytc_UgzwrKxJw…)
- "We need to stop spreading the idea of the term “Ai “artist”” because it’s not ev…" (ytc_UgxFMbeME…)
- "Humans so stupid they are smart enough to create science fiction watch or read i…" (ytc_UgyTnqsd7…)
- "You spent way too much time defending AI. LLM-based chat bots should never be re…" (ytc_UgwY7-7Dy…)
- "Don't worry, aware people will be able to tell them apart 🤗 Let's make them un…" (ytc_UgwjIytYp…)
- "Ai is terrifying... there is literally an AI model that is not REAL making money…" (ytc_UgzKoom7E…)
- "I think the robot gain it's own consciousness for some second,this goes to show …" (ytc_Ugyr0RlgG…)
- "Once the AI developing would be done, they would no longer need us. At that poin…" (ytc_UgzXHaMIB…)
Comment

> With all do respect, although i really believe at the power of super AI, the resources, power supply and therefor infrastructure and money does require decades if one talks about world domination scale. So this truly sounds too much as sensation and not realistic timeframe wise. Also the most critical systems are offgrid without any internet connection and that can be 100% controlled and governed by humans.

Source: youtube · AI Governance · 2025-09-05T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzrCf8BWlp76TVxbf54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxm72hEiZdqti1R7Ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylzZjjG29fnWmDypx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYg4eQ28XJg5-wMhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgygY6w1UviOaExpFIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrSA9Oem5CIdZJOuB4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxUXcL0cqwtx8Kk7_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8WrJF4l8X8TyxqaJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzOJzCUwOJasQjzNx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmEsk5R_goAdJPCd14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
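A raw response like the one above should be validated before the codes are stored, since the model can emit values outside the codebook. The sketch below is a minimal example of that check; the allowed values are an assumption inferred only from the codes visible in this sample (the actual codebook may define more), and `ytc_example` is a hypothetical comment ID.

```python
import json

# Assumption: allowed codes inferred from the sample output above;
# the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with an unknown code."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record batch, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"resignation"}]')
print(validate_batch(raw)[0]["emotion"])  # -> resignation
```

A record that fails the check raises `ValueError` with the offending comment ID, so a bad batch can be re-queued for recoding rather than silently written to the results table.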