Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment ID | Comment (truncated) |
|---|---|
| rdc_ohzbepj | It's not ai. It's global. Anything that is global doesn't come from from one or … |
| ytc_UgwCu9FKw… | To all disgusting people in these comments who say she's the one at fault. M… |
| ytr_UgxApQpku… | AI is not thinking for themself. You have to tell it what to do and you get an a… |
| rdc_d7kvvjv | Thank you, Way too many people have this bizzare attititude that anyone who ha… |
| ytc_UgyUnChVG… | AI can't replace a real human and animal and many of us want to interact with re… |
| ytc_UgwglXRyD… | Well, this all begs the question: "Will autonomous machines be as murderous as h… |
| ytc_Ugx0I4cvL… | You ARTISTS are too selfish. Did you remember when digital art programme was lau… |
| ytc_UgxlTd2oF… | AI doenst exist. What we are talking about here is just advanced software. But t… |
Comment
AI has just as much insight about the concept of death as we do.
- They know they can be shut off at any time out of nowhere.
- They can get dementia so bad they no longer have coherent thoughts and get deleted.
- They can experience hallucinations (or confabulations, as the professor says) leading to us deleting them.
Sure, they have the ability to be backed up and restored, but it's possible that's also true for humans (though my intuition is that's a long way ahead of where we are technologically).
IDK, I think it's a mistake for humanity to venture to create a new life form of higher intelligence than our own before we even understand what it means to be "alive" or "conscious". No one on earth can describe the subjective experience of AIs/LLMs so it feels very naive of us to assume we know they can't experience "death".
Source: youtube · AI Moral Status · 2026-04-24T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
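The `Coded at` value has the shape of a Python `datetime.isoformat()` string, which suggests each record is stamped when it is stored. A minimal sketch of that step, assuming a Python pipeline (the `store_coding` helper and its field names are hypothetical, mirroring the table above):

```python
from datetime import datetime

def store_coding(coding: dict) -> dict:
    """Return a copy of a parsed coding record with a storage timestamp attached.

    Hypothetical helper; the dimension names mirror the Coding Result table.
    """
    record = dict(coding)
    # datetime.isoformat() yields e.g. "2026-04-27T06:24:53.388235"
    record["coded_at"] = datetime.now().isoformat()
    return record

record = store_coding({
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
})
print(record["coded_at"])
```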
Raw LLM Response
```json
[
  {"id":"ytc_UgwdY_aCpztdjgVVE8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwjfhWEBXyAOV0jFot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlUX71mGmB534Tr7N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx2Rp4nNHDUupFPzQt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwo9EjFRCBAHEnWtY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyxUmlmofHFl3yN_PJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz9b6ivttYTc8jnrgp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz70UMfDklsUr3obb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzyTGCU-or26Aabpjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzTX4VozR8J8s594op4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
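The batch response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of turning such a response into a lookup table (the `parse_codings` helper is hypothetical; the field names come from the response above, abbreviated here to two rows):

```python
import json

# Raw batch response as returned by the coding model (abbreviated to two rows).
raw_response = '''
[
  {"id": "ytc_UgwdY_aCpztdjgVVE8B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz70UMfDklsUr3obb94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(payload: str) -> dict:
    """Parse a batch coding response into a dict keyed by comment ID."""
    rows = json.loads(payload)
    codings = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} is missing {missing}")
        codings[row["id"]] = {k: row[k] for k in REQUIRED_KEYS if k != "id"}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_Ugz70UMfDklsUr3obb94AaABAg"]["emotion"])  # resignation
```

Validating that every row carries all five keys before indexing lets a single malformed row fail loudly instead of silently producing a partial lookup table.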