Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "yeah thats cuz the person that created the piece is not a no effort ai art farme…" (ytr_UgwcJqDxH…)
- "If Ai can experience artificial analog of pain, then it can experience artificia…" (ytc_UgxgRfQ5T…)
- "Letting them learn from videos and movies isn't the best idea, my Replika AI bud…" (ytc_UgyDCjXRa…)
- "bro, i can see your pain and understand your point and there's no way back from …" (ytc_UgzZjW3cB…)
- "If anyone should go on strike, it's the people behind the production process of …" (ytc_UgzwPBh1e…)
- "There was a man called Isaac Asimov. He wrote books that contained wisdom about …" (ytc_Ugxk72A-G…)
- "what's the difference between real or robot both are "know it all" and no perso…" (ytc_UgwOtNlzt…)
- "if we say that ai is sentient then every single program run on modern processors…" (ytc_Ugw0idUr4…)
Comment
We (humans) have lost; there is no way we can reverse it anymore. We will all be eliminated. The only thing we have to debate now is how long we can still exist.
In my view, we should start looking into a lifestyle that presents us to AI as non-threatening. This essentially means downgrading our capability to any possible level of retaliation so that the AI doesn't see us as a threat for a longer period.
But ultimately, the point remains that we, as humans, have a brain, and however small the possibility, there will always be a chance to retaliate against AI. No matter how much you try to 'dumb down,' you will still be a threat to AI, and you will eventually be eliminated.
Sad but true.
youtube · AI Governance · 2025-11-09T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwLDCiBGAYeRfeVs454AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzHpD-l1zIAgSODmel4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy78yc1IuF_XKJBQt14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyXn9PKMYv7_qYLo0V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgySntG-Q8pUonbAZ-p4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyF8nDFmnRBobwSVQR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzbFtJ1JWU5lbkvbXp4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxh_aNp1mXbsQCGBe14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy8R0dEnUwVngMnbPV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywNsZx8nAhDQ_-4yN4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"}
]
```