Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.

Random samples
- "To be fair all these vibe coders are starting to realize that AI isn’t really al…" (ytc_UgynuJxod…)
- "They are betting that by increasing AI access, human input will get more valuab…" (ytc_UgyFCB1nD…)
- "The singularity, AI takeover is a hypothetical scenario in which artificial inte…" (ytc_UgyZ0tjC3…)
- "I tried using an AI thing… it looks better than the one the AI for this video ma…" (ytc_UgxC-IY5w…)
- "Remember kids, AIs aren't human! Even if AI training was exactly identical to a …" (ytc_UgxQ0V9lM…)
- "For that person, and that person personally, yes, you, let me set the record str…" (ytc_Ugyd9Ycoi…)
- "In case people are wondering, on 31.03.2026 OpenAI has has acquired 122 billion …" (ytc_UgwQfwqjx…)
- "well rehearsed word Salad. He is here just to profit from the development; I don…" (ytc_UgzRUe8li…)
Comment
Nobody paused and nobody is _ever_ going to pause. If one person pauses, another gets to use that opportunity to get ahead. This is game theory 101. It doesn't matter if that seems illogical it is literally never going to change. This is a pandora's box situation and treating it any other way is actively harmful, reallocating resources toward pursuing an impossible outcome is tantamount to throwing them in the garbage. We should allocate far more resources towards advancing AI safety in parallel, not spend time and effort trying to get developers to stop developing.
youtube · AI Responsibility · 2025-05-21T17:1… · ♥ 923
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugze51w3Ob9E4jZ76514AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz3bv5ABhU4oZwVkzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6ly5qkRd63SuRPdJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlGAr1woayHFmckBJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugzqgxx_GOyIrIhQv_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKckZe8u1grR1nO1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3ivKUgyUtbDqQlWh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyRUGKC9HtfhWQ421t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxxFhxvn75o0NBT56B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyuV1EknIGxbZqBp_V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"})
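Note that the raw response above closes its JSON array with `)` rather than `]`, so a strict JSON parser rejects the whole batch; that would be consistent with every dimension in the Coding Result table reading `unclear`. A minimal sketch of such a parse-with-fallback step (the function name, fallback behavior, and dimension constants here are illustrative assumptions, not the project's actual code):

```python
import json

# The four coding dimensions shown in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
UNCLEAR = {d: "unclear" for d in DIMENSIONS}

def parse_batch_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw batch response into {comment_id: codes}.

    If the JSON is malformed (e.g. the array is closed with ')'),
    return {} so the caller can fall back to UNCLEAR for every
    comment in the batch (hypothetical fallback, not confirmed code).
    """
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        item["id"]: {d: item.get(d, "unclear") for d in DIMENSIONS}
        for item in items
        if "id" in item
    }

# A well-formed single-item batch parses normally:
ok = parse_batch_response(
    '[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)

# The same batch closed with ')' instead of ']' fails to parse,
# so the coder would record UNCLEAR for the affected comment:
bad = parse_batch_response(
    '[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"})'
)
```

A real pipeline might instead retry the model call or attempt to repair the truncated JSON before giving up; the all-or-nothing fallback is the simplest behavior that matches the table shown.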