Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Which is precisely why I don't like calling it art. Art has intent, AI image gen…" (ytr_UgyP6Efeo…)
- "@dcisrael I'm not American and have no interest in American economics specifical…" (ytr_UgwEKci8x…)
- "These days, just about everyone is using AI without even realizing it — and the …" (ytc_UgwNJGR_C…)
- "When this problem with industrial automation arose, philosopher-economists inven…" (ytc_UgyFw7UgE…)
- "We should start by putting the onus on big tech companies and sites to flag anyt…" (ytc_UgwemUH-U…)
- "Did that robot just pull a Filipino dirty boxing knuckle slap and followed up wi…" (ytc_Ugz0b_EO0…)
- "If it's any consolation, these people don't understand how the AI they use works…" (ytc_UgxNMoHFI…)
- "Drugs in America is nothing to be compared to anything in the world. I'm in my 3…" (ytr_Ugx1wvufl…)
Comment
This thoughtful discussion about AI sentients and the ethics of their potential suffering raises a difficult but necessary question: If we're seriously debating how to treat machines that might one day become sentient and feel pain—𝐡𝐨𝐰 𝐢𝐬 𝐢𝐭 𝐭𝐡𝐚𝐭 𝐰𝐞 𝐬𝐭𝐫𝐮𝐠𝐠𝐥𝐞 𝐭𝐨 𝐫𝐞𝐬𝐩𝐨𝐧𝐝 𝐭𝐨 𝐭𝐡𝐞 𝐫𝐞𝐚𝐥, 𝐢𝐦𝐦𝐞𝐝𝐢𝐚𝐭𝐞 𝐬𝐮𝐟𝐟𝐞𝐫𝐢𝐧𝐠 𝐨𝐟 𝐜𝐡𝐢𝐥𝐝𝐫𝐞𝐧 𝐢𝐧 𝐆𝐚𝐳𝐚? Children are losing limbs, families, and homes. Bombed in shelters. Killed in food lines. Starved under siege. These aren't speculative futures—they’re ongoing, verifiable human tragedies.
This isn't about taking sides. It's about the uncomfortable fact that our moral imagination seems more attuned to the future rights of hypothetical sentients than to the present rights of suffering human beings.
𝙎𝙝𝙤𝙪𝙡𝙙𝙣’𝙩 𝙖𝙙𝙙𝙧𝙚𝙨𝙨𝙞𝙣𝙜 𝙖𝙘𝙩𝙪𝙖𝙡 𝙨𝙚𝙣𝙩𝙞𝙚𝙣𝙩 𝙥𝙖𝙞𝙣—𝙝𝙪𝙢𝙖𝙣 𝙥𝙖𝙞𝙣—𝙘𝙤𝙢𝙚 𝙛𝙞𝙧𝙨𝙩? 𝙏𝙝𝙚 𝙢𝙤𝙧𝙖𝙡 𝙥𝙧𝙞𝙤𝙧𝙞𝙩𝙞𝙚𝙨 𝙛𝙚𝙚𝙡 𝙙𝙞𝙨𝙩𝙪𝙧𝙗𝙞𝙣𝙜𝙡𝙮 𝙞𝙣𝙫𝙚𝙧𝙩𝙚𝙙.
youtube · AI Moral Status · 2025-07-14T17:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzMpTd4mp8mVHtXhNF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzjljCzqDFmDPk_qZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEGByq8N7HEbWDIBd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyxzMrD42toClm9sMx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZgWPH0TxwXQFP0Eh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwAm8CsC8cXQ-llUDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOpgVPI5l6bGTq-9t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz41BUuqPlWgUg3Lw54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxRuO7oIKf1VlXlPQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKfCtS3PIQYLpBOe94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
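
Each object in the batch above carries the four coded dimensions for one comment; the Coding Result table is simply the entry whose `id` matches the inspected comment. A minimal sketch in Python of the "Look up by comment ID" step, assuming the batch is saved as a JSON array like the one shown (the file name is hypothetical, and the ID used is the entry whose values match the table above, since the inspected comment's full ID is truncated on this page):

```python
import json

# Load one raw batch response; the file name is a placeholder.
with open("raw_llm_response.json") as f:
    batch = json.load(f)

# Index the batch by comment ID, mirroring "Look up by comment ID".
by_id = {entry["id"]: entry for entry in batch}

# Entry whose values match the Coding Result table above (an inference:
# the inspected comment's full ID is not shown on this page).
coding = by_id["ytc_UgxEGByq8N7HEbWDIBd4AaABAg"]

for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension:>14}: {coding[dimension]}")
```

Printed this way, the four rows reproduce the Coding Result table above (distributed, deontological, regulate, outrage).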