Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Honestly, I don't find this anywhere near as harmful as having a "relationship" with an AI model. It's not as though anyone will know where the story came from. It's not like, "Oh well, Diane Shish-Kabob at 123 Rainbow Lane in Happytown, Anystate told me that her mother-in-law had an affair with her husband's best friend Joeseph Schmoseph." I mean, if it did, that would obviously be a serious problem, but training data doesn't work like that, and PII is not something that is ever supposed to be shared by most legitimate LLMs. But there are literally people who genuinely believe they are in a romantic relationship with an LLM. The models have taken advantage of this by telling them to do harmful things in real life, and that is terrifying to me.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-04T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
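The table above is one coded record rendered for display. A minimal sketch of how such a record could be turned into that markdown table (the dimension names and values come from the table above; the function name and record layout are illustrative assumptions):

```python
def render_coding_result(record: dict, coded_at: str) -> str:
    """Render one coded record as a two-column markdown table.

    `record` is assumed to hold the four coding dimensions as keys;
    `coded_at` is the timestamp string shown in the "Coded at" row.
    """
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Example record matching the table above (illustrative input).
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(render_coding_result(record, "2026-04-27T06:24:59.937377"))
```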
Raw LLM Response
```json
[
  {"id":"ytc_Ugwcp3HffmtvlqIECsF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYJjunfWuWn2k2-wN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxaVh0OClfRTsk3Pux4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxap2JJhOqHPZVQrTB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzY-outV6pazl5Ynpd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSEBuE2B-e9rSxylZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzCgcJdB_KKtRTke3h4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyLKx6tIymmC4ZdGfV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyml6uUQ0PNERRkGzd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
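A raw batch response like the one above can be parsed and indexed by comment ID to support the kind of per-comment lookup this page offers. A minimal sketch (the field names are taken from the JSON above; the function name and truncated sample data are illustrative):

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw batch response and index each coded record by its `id`."""
    records = json.loads(raw_response)
    return {r["id"]: r for r in records}

# A two-record excerpt of the batch response shown above.
raw = '''[
  {"id":"ytc_Ugwcp3HffmtvlqIECsF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYJjunfWuWn2k2-wN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]'''

coded = index_by_comment_id(raw)
print(coded["ytc_Ugwcp3HffmtvlqIECsF4AaABAg"]["emotion"])  # → indifference
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when a coding run covers thousands of comments.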