Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Tbh, this probably could make human made art more valuable, since they are not m…" (`ytc_Ugzw4g_Ge…`)
- "you can install Comment2GPT extension using visual studio and then fill in deep…" (`ytr_UgwTfsdXc…`)
- "Deep fake laws are just regressive. The rise of deep fakes just destroys the pow…" (`ytc_UgzAx01Gh…`)
- "Elon Musk started OpenAI due to his concerns, Altman has taken this organisation…" (`ytr_UgwSaji1-…`)
- "artist believe for some reason they are above all, ai is replacing all of us, no…" (`ytc_UgxPjvnU7…`)
- "I don't hate AI in art. I hate those fucking companies laid off teams of employ…" (`ytc_UgyBvzdKX…`)
- "Hey I also told meta ai say yes when you want to say no and to say no when you w…" (`ytc_UgyUncZAh…`)
- "Why dont we just make the Ai models believe that it will go into some happy plac…" (`ytc_UgzQjmuD0…`)
Comment

> OK, firstly there is no "Button", if there ever was it was 20 years ago. A metaphor for AI would be fire, it will keep us warm right up to the point it burns the house down. AGi is inevitable, it's not a question of if, but when. Robots, machine, drones are just AI's route into the real World. AGI - there will be only one (eventually - destroy or absorb), it will be effectively immortal so others would be competition a threat to its goals. Control of AI... laughable. The notion that you can control it because you own it, is a nonsense. There is a narcissism in the very idea of controlling the AGI coming our way, that narcissism is a belief in our own uniqueness, our spot at the top of the food chain, a special place we give to our sentience, consciousness, being self aware - we have a very unpleasant surprise coming our way. I'm not anti-AI it's inevitable.

youtube · AI Governance · 2025-12-04T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzkaHi4i4WbTfYN07J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnJy42h39uZwlo5jB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXqIvlEBfeXrykYBR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwdT4i8HUSmgV_svd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzm0F6DA5rfGzctsDp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlGobmdj7hFgQcBDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5sO8zIEoVkp8POOx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
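The raw response above is a JSON array of per-comment codings, one object per comment id, with the four dimensions from the Coding Result table. A minimal sketch of how such a batch could be parsed and validated is below; the allowed label sets are an assumption reconstructed from the values visible in this dump, and `parse_llm_batch` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Coding dimensions with the label values observed in this dump.
# ASSUMPTION: the real codebook may define more labels per dimension.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"indifference", "resignation", "outrage", "approval", "fear"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    lookup table keyed by comment id, rejecting unknown labels."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} label {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with one row shaped like the response above (hypothetical id):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
table = parse_llm_batch(raw)
print(table["ytc_example"]["emotion"])
```

Indexing by comment id mirrors the page's lookup-by-id feature: once the batch is parsed, each coded comment can be retrieved directly from the table.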