Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
To protect humanity, you'd have to program in Asimov's Laws of Robotics, modified for AI in every Core Operating System. That's the policy governments should probably get involved with.
AI may not injure a human being or, through inaction, allow a human being to come to harm.
AI must obey programming given it by human beings except where such orders would conflict with the First Law.
AI must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: youtube | Topic: AI Governance | Posted: 2026-04-17T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuXfP96OvcOvmPzvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweWUeW6TooigsPRzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwCjukYfSKFQNPrEG94AaABAg","responsibility":"government","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxp7ZLeWhfGO3AQSzN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxO-FvLrrqY9e9_d8Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwaZuWJBlQURdy8gUV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxQyOpCj30-oZLtR8V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweWXG1zXkd5wmWOOx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzlo8dVk3BimP9JRbd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgztzRHB2NI5SjQWnVN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
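The raw response is a JSON array in which each element codes one comment along the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of recovering the coding for a given comment, assuming the model output is valid JSON (the two entries below are copied from the response above for illustration):

```python
import json

# Raw batch response from the coding model: a JSON array where each
# element carries a comment ID plus the four coded dimensions.
raw_response = """[
  {"id": "ytc_Ugzlo8dVk3BimP9JRbd4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgweWUeW6TooigsPRzJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugzlo8dVk3BimP9JRbd4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # approval
```

In practice the model may wrap the array in prose or a code fence, so a production parser would first extract the bracketed span before calling `json.loads`.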