Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgyQvcEQg…`: "This is sick. Lazy people with the help of AI steal from the creative and workin…"
- `ytc_UgxHxmnNQ…`: "Well, hyper intelligent AI, colliding atoms and photons n stuff in a big circula…"
- `ytc_UgzVIQF_z…`: "hmmmmm, exponential growth buddy. Cockroach to human in way way way faster time …"
- `ytc_UgxkGftkn…`: "I think the inaction of good people just saying this is just a "technology thing…"
- `ytc_UgzPgfRsu…`: "The human brain was built by God. The AI brain is built by humans. Take a moment…"
- `ytr_UgwhiUscI…`: "@MASKEDB i see myself having ai slaves doing all my work. But I don't quit drawi…"
- `ytc_UgyVJvqag…`: "IT LEARNS CODE, a language somewhat obscure to us but one that IT MASTERED IN…"
- `rdc_kozybf6`: "Every one of these stories has the same explanation: these things do what you tr…"
Comment
Thinking we must spend more time and effort on designing a Prime Directive (startrek) so humanity is the machines' purpose for existence. i.e. we maintain our 'value' to the machine. For example, the machine's Prime Directive is to assist humanity in their quest for health, wealth, and happiness (all words clearly defined). From that Directive, we can build laws and rules and tax codes and hopefully put guardrails on AI companies 'creativity. I know I would feel safer if I knew where we were going (Directive)
youtube · AI Governance · 2026-04-20T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
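Each of the four dimensions in the table takes a value from a small closed vocabulary. As a sketch, a coded row can be checked against that vocabulary before display; the label sets below are assumptions collected from the responses shown on this page, and the real codebook may define additional values:

```python
# Assumed label sets, gathered from the coded rows visible on this page;
# the actual codebook may allow more values than these.
CODEBOOK = {
    "responsibility": {"none", "company", "distributed", "ai_itself",
                       "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "mixed"},
}

def is_valid(row: dict) -> bool:
    """Return True if every coded dimension uses a known label."""
    return all(row.get(dim) in labels for dim, labels in CODEBOOK.items())

# The featured comment's coding result passes the check.
print(is_valid({"responsibility": "distributed", "reasoning": "contractualist",
                "policy": "regulate", "emotion": "approval"}))  # True
```

A check like this catches an LLM that drifts off-vocabulary (e.g. inventing a new `policy` label) before the row is stored.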
Raw LLM Response
```json
[
  {"id": "ytc_Ugx71CTRCoNv1Xslqh14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzL1s_Pb2A8dG628gB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxexAzCruGDdc8OmdF4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwLUufIsVHnCcN64GJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxbTF1Xv6h7Ta_PChZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx0bhSo5HYE_h0WA4p4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwmyNtLNN-VPhxMLth4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyFuLtQNDCAYnuK8zN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxL-7U3T_BkXmixFUd4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugz05fIP8Kt0DJ-0ilt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
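The model returns one flat JSON array of coded rows keyed by comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup step, using three entries copied from the response above (the indexing logic is an assumption about how a tool like this could work, not its actual implementation):

```python
import json

# Three rows copied verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_UgxexAzCruGDdc8OmdF4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwLUufIsVHnCcN64GJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyFuLtQNDCAYnuK8zN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]'''

# Index rows by comment ID so any coded comment resolves in one dictionary lookup.
coded = {row["id"]: row for row in json.loads(raw)}

# The featured comment's ID resolves to the values shown in the Coding Result table.
row = coded["ytc_UgxexAzCruGDdc8OmdF4AaABAg"]
print(row["responsibility"], row["policy"])  # distributed regulate
```

Batching many comments into one array and re-keying it by ID on receipt keeps per-comment lookups cheap while letting the model code a whole sample in a single call.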