Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI art is still terrible. It's like it makes sense until you look a little clos…
ytc_UgyNH7j-Z…
AI did it so wrong like the women shouldn’t react happily when she’s being hit b…
ytc_UgzpYDsH5…
When AI can train AI then they will be top of the food chain. There is no need f…
ytc_UgyOE2m8s…
Looks like you’ve been shadowbanned Digital Engine. Either that or you used too …
ytc_Ugwb0BqII…
I am confused, how does it poison the the AI tools? I am genuinely curious, I h…
ytc_UgwY6-gEX…
Ai is not to blame lol. it literally told him he just didn't listen. lol…
ytc_Ugwn7vo3q…
AI should only be programmed to solve regular problems, answers we cant agree ab…
ytc_Ugy-_6wHR…
I only all the money paid to govt employees was managed by shitty AI. They would…
ytc_Ugz5YcOqr…
Comment
On your opinion that everyone and their grandchildren will be dead before AI achieves world-changing capabilities: no. You can't just know or predict that. The nature of major breakthrough world-changing technologies all throughout human history is that they arise spontaneously and it's basically impossible to predict them. This is why popular predictions of how human tech will look 20-30 years out has pretty much always been wrong all through history. If you described the current capabilities of generative AI models to AI researchers 5 years ago, all of them would tell you that it's either impossible or decades away, and yet here we are.
youtube
AI Jobs
2025-02-06T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugw5NgZOmfpDKZpz8YF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzs-DRPemEUA-yLPMx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvEkppa8c0BDpx4al4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw8i8grhCSZXfyWkqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugycu53_Njhls7ep3Sl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwPh8xJoDWxbtvY9AF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwezVV6MGZ22xaGAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYc0VGOhCE5_KLu294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6QZ6uEhM_3cj9JDx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzMLKItN6-aJE0WQyZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
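A minimal sketch, assuming Python and the four coding dimensions shown in the result table above (responsibility, reasoning, policy, emotion), of how a raw LLM response like this could be parsed into per-comment codes. The function name `parse_codes` and the fallback behavior are illustrative assumptions, not the tool's actual implementation: when the model output is not valid JSON, the parser returns nothing, which would leave every dimension for that batch as "unclear".

```python
import json

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Returns an empty mapping when the response is not valid JSON,
    so the caller can record every dimension as 'unclear'.
    """
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    codes = {}
    for item in items:
        cid = item.get("id")
        if cid:
            # Missing dimensions default to 'unclear' rather than failing.
            codes[cid] = {d: item.get(d, "unclear") for d in DIMENSIONS}
    return codes


# Hypothetical example with a made-up comment ID.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}]'
print(parse_codes(raw)["ytc_x"]["reasoning"])  # virtue
```

Parsing the whole batch defensively like this keeps one malformed response (e.g. a stray character where the closing `]` should be) from crashing the coding run; the affected comments simply stay unclear and can be re-coded later.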