Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- "People are worried about him, but I think the existence of Sofia is a good sign.…" (ytr_Ugx7HTVCY…)
- "The future of AI now terrifies me after hearing this. And there's no limit to wh…" (ytc_UgxH08pt_…)
- "11:50 surgeons? It will be illegal to go to the human. The robot will do it with…" (ytc_UgzVpgECL…)
- "Also on how artists getting replaced, like I dont think people really realize ho…" (ytr_UgyFsUaQg…)
- "Can you stop with the crappy AI thumbnails, they are cringe AF and no longer sta…" (ytc_UgwHp21Ja…)
- "The real issue bothering you here is your ability to be discerning about AI cont…" (ytc_Ugwc0ifcL…)
- "Letting kids born in post-LLMullshit era chat with llm is sort of inviting a chi…" (ytc_UgwrmsOIt…)
- "Quite frankly, I've never been scared of AI, nor the potential consequences ther…" (ytc_Ugw6xOWQv…)
Comment
Every discussion I see is talking about how we control A.I.
The short answer is we can't and we probably shouldn't.
Humans have now gone as far as we can, it's the end of the road.
We're too greedy, and too aggressive,
It was fine to get us here but nuclear missiles and engineered germ warfare would surely wipe us out, they're too dangerous for humans.
We have short lives, we leave behind our children to move things forward.
Our children of today are now the machines.
We need to step aside and let our creations take it from here.
If you see things from this point of view it's us who are the danger that need to be controlled, and that's much easier to achieve. 😊
youtube · AI Jobs · 2025-10-13T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzAAnuRNcK-3HhLq1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUNl8BEuWKENTPrxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgysZW8mKisDOBPW_t14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwgddZ7G-1e3l5YCux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyND2DmKcZOTOR6QKB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyKfitdZFEeP5XfX994AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxuk6Gkb7p-Gw1_71d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxXOBZYhB9jf6kZwL14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw8x5n4wteYLOzPOEB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzL-2mNviB_P4SZ1Ih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```