Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Every robot with free thought wanted to kill humankind. All were destroyed. The problem, they're programmed by white ppl. Facebook had two cpu ai programs that created a new language n didn't know what they were talking about. They also were destroyed.

youtube · AI Governance · 2023-05-17T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz9V4OHAROGAKiom0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz60Gho3GHoN7idMB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxDBBNy2gR27gsQsH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyGAHe0tauj1OsbyR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzKGYlGuJ6okuIeZuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgytykIe-b8DsmwskP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxydU-TzdSCpt_nDah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy8Ew39V6gun_D87ep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugyq1KsteNBOpHQiEF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWQKOrY-3zd9n36KN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```