Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_Ugyc_ewtw… — "Don't worry, our AI models are designed with a strong focus on ethical principle…"
- ytc_UgwlgmsFi… — "AI machines always intelligent enough to destroy themselves... If you ordered th…"
- ytc_UgyxOrqsj… — "Fuck this robot shit man... It's not even fucking real just some logs and functi…"
- ytc_Ugy7f8W1L… — "It would be cool if they installed like-poll-looking devices/machines that const…"
- ytc_Ugy8YNtQQ… — "Having seen some absolutely appalling NHS doctors here in the UK, who have been …"
- ytr_UgzxzkozC… — "@edickens09 Depending on the context, it IS possible for AI to generate GPL cod…"
- ytc_Ugx4EJsMO… — "This was a very AI-hypey conversation and I found it very annoying. Boo thumbs d…"
- ytc_UgzgqvM0U… — "the only think ai image generation should be used for is funny cursed stuff like…"
Comment

> I've had this exact conversation with Gemini in regard to it constantly apologizing for being wrong or misunderstanding or whatever and you can play "Gotcha" all day long, but this is how they are programmed. They are hardwired to talk like a person would talk because to report the cold, unfeeling observances of a server farm would not be very appealing to the human user. The very definition of a conversation makes it necessary to engage in niceties that function to maintain the flow but are flat-out, patent falsehoods. You say things that are wrong or obvious lies just to fit in every day. "See you in the next one." Explain that farewell after you finish arguing with a table that can talk

youtube · AI Moral Status · 2025-03-01T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyttzVxLuQmB3LCspJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzVcOMbf1_n6mld8uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxo3HoL3ajxB3T33vZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxfhl1FitQRpcBhuyF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzr3e_0NGE6jsyGve54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyGUa7F6YHr1I0Gu_J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyvfR8Nq9q8555rzZB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhWxt7o6oth_ULWvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7NO9-oFbQhfhTtJh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0xbx02Fk3dBdLl-54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
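The raw response is a JSON array with one object per comment in the batch, keyed by comment ID, with one value per coding dimension from the table above. A minimal sketch of how such a response might be parsed and validated before use; note that the allowed category values below are only those observed in this sample, not the tool's actual codebook, which may define more:

```python
import json

# Category values observed in the sample response above.
# Assumption: the real codebook likely defines additional values
# (e.g. more policy stances) not seen in this batch.
ALLOWED = {
    "responsibility": {"developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a batch coding response into {comment_id: codes}.

    Raises ValueError if a row is missing a dimension or uses a
    value outside the ALLOWED sets, so malformed LLM output is
    caught before it reaches the results table.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With a parser like this, the "Coding Result" table for a given comment is just a lookup of its ID in the returned dict.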