Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When a kid does something good or bad, we essentially rate their responses and try to guide them to behaving better. We don’t go “aha, the mask is slipping and we see the true child underneath”
AI are no different. They’re trained on human behaviour, so be mad at your fellow man if anything. People really out here getting pissed off by their own reflections 🤦🏽
And AI is not a “Shoggoth.” If it is, we are too cause you don’t know what’s going on in anyone else’s mind at any given time either. We don’t even understand the human mind lol, if anything we have more of an understanding of AI intelligence than we do our own. And humans are capable of evil as well, and have actually done more to harm people than just generate words.
If anyone’s the “Shaggoth” it’s us.
Platform: youtube
Topic: AI Moral Status
Posted: 2025-12-15T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxQ_kiGrH8z2SL7FG14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxLt0RcnzlqnmwnrDR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0oD7keWUDpiIAOcJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkCqGu1BCSKJ6n-kt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-7QMXn_jv7oApOml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxneqsyY71HmlpsJrt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyKiC4B5G6BFfj2VmB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz4tRa1nuhQtaYU53l4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzeQE6foDCVBdbuf6F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw23IQGmy0bTw1PKop4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
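The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed, validated, and looked up by comment ID — the allowed value sets below are inferred from the sample shown and the full codebook may define more categories:

```python
import json

# Two entries copied from the raw LLM response above (abbreviated for the sketch).
raw = """
[
 {"id":"ytc_Ugz4tRa1nuhQtaYU53l4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_Ugz0oD7keWUDpiIAOcJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# Allowed values per dimension, inferred from the sample output only
# (assumption: the real coding scheme may include additional labels).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"resignation", "outrage", "fear", "approval", "mixed", "indifference"},
}

def index_by_id(payload: str) -> dict:
    """Parse the model output, reject unknown labels, and index rows by comment ID."""
    coded = {}
    for row in json.loads(payload):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

coded = index_by_id(raw)
print(coded["ytc_Ugz4tRa1nuhQtaYU53l4AaABAg"]["policy"])  # industry_self
```

Validating against a closed label set before indexing catches the common failure mode where the model emits a label outside the codebook, rather than silently storing it.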