Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- This is boston where's all the Illegals.??? There there in mass quantities Not o… (ytc_Ugw0vSCAA…)
- I really don't understand how someone can be an AI optimist. The future looks ex… (ytc_UgyV3Q4Ro…)
- I actually sing a song using a crappy microphone and then use AI to make it a f… (ytc_UgyD9ZrqS…)
- "Musk has no moral compass" > Does Sam (Altman) has a moral compass? "I don't kn… (ytc_Ugz5OAH1s…)
- If AI decided to pull the plug on this ridiculous mess who could blame it?… (ytc_UgyC-ZaiZ…)
- It's pure faucking smart that you are posting a video about AI while talking on … (ytc_UgxfMW7NE…)
- if a super duper AI existed in 1896 and replaced the Supreme Court, would it hol… (ytc_UgxxrHWpf…)
- Not to worry, Christian! ClickUp AI is being developed alongside ClickUp 3.0, no… (ytr_UgwV9OgTL…)
Comment
We can already see how AI will be misused. There is currently sufficient money, technology, manpower, expertise and physical resources to turn the entire planet into a utopia for the entire planetary population without the advent of AI. Why aren't we doing this? Because, you know, humans. All the negative reasons that create this state of affairs will simply be supercharged by AI and there is no motivation to stop this trend because humans are socially aggressive consumers and dominators. I'm not talking about the Skynet scenario, just the perversity of humanity.
Source: youtube
Posted: 2025-06-07T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz-WNKxz5yxZO5ufzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugyn_r3e1GTRp6akSR54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxpxXhOw8Gbz1XfTlJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzMZjYqmrY3TtzZFth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwFSMbAHybT81g8TcF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugxq4inAcuj1NxsHCCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyOAJdWka7a56YHTFV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugzi0acRg3dn_r6AgQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyiefXddm_bKng38_d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwQaXN-Yr6ec3LbMbF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
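The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before use (the allowed label sets are inferred only from the values visible on this page, and `parse_raw_response` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Allowed labels per dimension, inferred from the codes visible above
# (assumption: the full codebook may define more labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw model response into {comment_id: codes}, rejecting unknown labels."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim} label {value!r}")
        coded[comment_id] = codes
    return coded

# Usage with a single (hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
result = parse_raw_response(raw)
print(result["ytc_example"]["emotion"])  # outrage
```

Keying the parsed codes by comment ID mirrors the "look up by comment ID" workflow of this page: once parsed, each coded comment's dimensions can be retrieved directly from the mapping.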