Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "A MORTAL CANNOT SLOW DOWN SOMETHING 10000x SMARTER THEN HIM- .. THATS WHY DE…" (ytc_UgwwIFQs-…)
- "You do not have to touch water to know that it is wet. An AI does not have to be…" (ytc_Ugw0ucRl5…)
- "I think the hate ain't cause of the technology but because of the lies, give us …" (ytc_UgwIDHNYI…)
- "To say its just a better google search when companies are firing en masse to the…" (ytc_UgwEfVtyt…)
- "AI, transfer me to a real human who is not a filipino college dropout and a poss…" (ytr_Ugxdl6XgC…)
- "The real dangerous ai maybe developed as a weapon in secret. And everyone will d…" (ytc_UgwW09xt4…)
- "this is a massive wake up call. I'm a second year comp sci/ cyber student and ha…" (ytc_UgzP00WiK…)
- "AI needs electricity. Unplug it, and you will quit your plumbing job, to go back…" (ytc_UgyYDxmhw…)
Comment
The "Gorilla Problem" analogy is actually terrifying when you think about it. If we are building something smarter than us, we have to be sure it's aligned. But honestly, the more immediate problem for me is aligning my budget with all these new models. I canceled my direct OpenAI and Anthropic subs because it was getting too expensive to "keep up with the race" Stuart talks about. Switched to omnely so I can access all the top models in one place without going broke before the singularity hits.
youtube · AI Governance · 2025-12-07T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylOcMtmfYPRLyA_uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz0s_5F0fL7Yc6h9pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmHU68lQswaDEmhOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9c5M8aiFACFvwDkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzslRkuK_KSVVjq6CV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxqiq20CC4lLEtT6Oh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAcj9D3tb7hktFuIJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUI-XmKN6ijviTF6N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW72yvXfzgauYvOVF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
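The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the coding table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a payload could be parsed and validated — note that the allowed category values below are inferred from the responses shown on this page, not from a published schema:

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (an assumption, not a published codebook).
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "unclear", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into a lookup table keyed by comment ID."""
    coded = {}
    for row in json.loads(raw):
        # Reject rows whose values fall outside the expected categories,
        # so a malformed model output fails loudly instead of silently.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

# Hypothetical single-row payload in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # fear
```

Validating against a fixed value set is what makes a "Raw LLM Responses" view useful: when a code in the table looks wrong, the raw JSON shows whether the model emitted a bad value or the parser mapped it incorrectly.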