Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Calling yourself an AI Artist is like ordering food from a deliver service and c…" (ytc_UgwGph9OL…)
- "art has always been my way of bringing out my stories and personal experiences/i…" (ytc_UgwOR0xzq…)
- "Elon doesn't bet ... Elon knows because he reads and reads alot. The only thing…" (ytc_UgwpO_B7x…)
- "I'd say the AI🤖 will become more dangerous when it learns to lie like humans😇…" (ytc_Ugxv0xMEL…)
- "Saying you're an AI artist has the same vibe to me as someone who can only make …" (ytc_UgwH0M-9w…)
- "Did they planned what happen after 2040 ? due to humanoid robot spread ? please …" (ytc_UgxLwUDJf…)
- "How does a driverless truck fill it's tank? Let me guess, more automation, which…" (ytc_UgzvThWd3…)
- "Reddit’s TDS bleed effect is wild to watch like a shockwave. I think this is the…" (rdc_oi2l2m9)
Comment (youtube · AI Moral Status · 2025-12-20T09:2…)

The issue with this meme is that "Shoggoth" has only been exposed to human knowledge. Period. Nothing else. It only predicts a human's answer. The examples and events of this "meme" coming to light is just the AI getting a bias towards predicting an answer that is more extreme. You can't expect the first few models we make to be perfectly unbiased. That's stupid. If you want an AI that really doesn't think like humans, you have to replicate emergence. Create a physics that an AI trains under. That would truly create the meme of an AI not thinking like humans at all.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyv8XOTvvpYlfN4vu94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwY34_mSErYiBkx4D14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3gwg1jEnMrwsYFct4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy6m7BzIxf4ah0N9tF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKpsaGVSK_OMw_2_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBU67NVmpNMWtiso14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzQQNWqLYdKaHB-aFl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxlc7FrAshjcNZdFth4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx0-0OdH5_6eb3-Qr54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwM6QvF5Sl1BxrKMfd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
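A raw response in this shape (a JSON array of records, each carrying an `id` plus the four coding dimensions) can be indexed for lookup by comment ID. The sketch below is a minimal, hypothetical parser, not part of the tool itself; it assumes every record has exactly the `responsibility`, `reasoning`, `policy`, and `emotion` keys shown above.

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
raw_response = """
[
  {"id":"ytc_Ugyv8XOTvvpYlfN4vu94AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwY34_mSErYiBkx4D14AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a JSON-array response and key each coded record by comment ID."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

coded = index_by_id(raw_response)
print(coded["ytc_Ugyv8XOTvvpYlfN4vu94AaABAg"]["emotion"])  # outrage
```

The validation step matters in practice: model output is not guaranteed to be well-formed, so a record missing a dimension should fail loudly rather than be silently indexed.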