Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
It’s the AI argument I hate the most. I haven’t been drawing my whole life to ha…
ytc_UgyIVBzLI…
AI that is truly worth it's salt is still decades away. Still a parlor trick.…
ytc_UgyWDeml2…
I only see 3 possibilities for AI
1:It's not sentient, so remains a tool for the…
ytc_UgzdGOq3q…
@AITube-LiveAI AI can be constructive but humans shouldn't talk to some kind of …
ytr_UgzwkKx-P…
You're trying to learn? AWESOME! I wish you the best experience! And yeah, ai fo…
ytr_UgwEOCgzM…
(and for the concept art I actually prefer to use photography to create the piec…
ytr_UgzZKWs6j…
What's the gameplan of investing tens of billions into Anthropic, while also pus…
rdc_oi1qav6
AI users are some of the most sensitive people on planet Earth, calling anyone d…
ytc_UgzeCIbWp…
Comment
Unfortunately, it's the beginning of the end. There's no going back now, it's already too late. I say this not to scare those who are still unaware of this stark reality, but to prepare themselves for what is about to take place over the next 5 to 10 years. Ironically, AI isn't to be blamed for taking us to the abyss, it's the same force that has driven our beautiful world to witness uninterrupted confict, famine, wars, and devastation for thousands of years, yes you've guessed it - humanity! We can only attempt to hope that either AI will quickly realise that this planet and its inhabitants will hold zero benefit for sustainable life and leave us to our own destiny or they reprogram us to become the same as them. Either way, the outcome is the end of our human race.
youtube · AI Governance · 2025-06-09T21:1… · ♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwFT5_LQ2RTLU5KsRB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8Okm0PpleBqEMT7x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7pvc-XWvsV6bZkH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyw61D5nuPh8VHKUWN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxcRsDHP-vK4Fe5m9F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmK57Jy93t2Bi2n_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxqS5PnD7jiALLgcTl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgyH1nQ5VFhwzpTR9J94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzKHr679RFivxullh54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzDkCIbij38alazgmd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
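
The raw response is a plain JSON array, one object per comment carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a consumer might parse such an output and look up a comment by its ID; the function and variable names here are illustrative, not part of the tool itself, and the two sample rows are copied from the response above:

```python
import json

# Two rows copied verbatim from the raw model output shown above.
RAW_RESPONSE = """
[
{"id":"ytc_UgwFT5_LQ2RTLU5KsRB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKHr679RFivxullh54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
"""

# The four coding dimensions used by this pipeline.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse the model output into {comment_id: {dimension: value}},
    keeping only the expected dimensions from each row."""
    rows = json.loads(raw)
    return {row["id"]: {dim: row[dim] for dim in DIMENSIONS} for row in rows}

codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_UgwFT5_LQ2RTLU5KsRB4AaABAg"]["emotion"])  # fear
```

Because the model is asked to emit bare JSON, `json.loads` either returns the full array or raises `json.JSONDecodeError`, which makes malformed responses easy to detect and retry.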