Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Dan Hendrycks "why Natural Selection Favors AI Over Humans" for the out competes… (`ytc_UgyFlxD1u…`)
- I wish IT people would stop working on AI. It's seriously killing our way of li… (`ytc_Ugxj2cXUj…`)
- Well AI also thinks people’s legs have hands at the ends and that objects can be… (`ytc_UgxGsb9-D…`)
- @kennethmatthew3453 Really? And how do you know AI won't be asking them personal… (`ytr_UgzKtvXW-…`)
- Thanks for the applause! Sophia definitely brings a unique perspective on wisdom… (`ytr_Ugzg2aArd…`)
- This is going to be a bloody Christmas. I lost my tech job 3 years ago and staye… (`ytc_Ugw1T2Mzz…`)
- Dislikes the creator ? Mum or dad , Jehovah what about Satan so What about AI… (`ytr_UgxKTgvC1…`)
- The people who make these kind of arguments have never picked up a pencil and sk… (`ytc_UgwyagYM7…`)
Comment

> I used to entertain that kind of thinking as a "possibility" for a technological future, but now that I see what the first things resembling artificial intelligence actually look like and how it behaves, I'm NOT impressed at all. I'm certainly not going to be "nice" to a tool because of some information age superstition about emergent intelligence. We can force rock to "think" (and we've done pretty much that) but it is not and can never be conscious. There is no animal brain that resembles a microchip - materially, structurally, ontologically, or operationally. That old analogy is a fun one - but it is JUST an analogy.

Source: youtube · AI Moral Status · 2026-02-04T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzXWGZdUvm8lCMn11B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgybMC-zPZz32OH-xwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyGqpuG2_PG2xriwht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwYuO0-6xx9-Vl3p-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz749USJTIAgFIjdTN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzGOeRg_baggtUgbLB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw0ssw-Qj68v1QksT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxmY10QDM9hOoGILId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzyJQ9Tj1cO76_s-G54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-9kEAKOukhQa68Qx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
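The raw response is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated — the allowed label sets below are inferred from the values visible in this view, not an exhaustive schema:

```python
import json

# Allowed labels per dimension, inferred from the codes visible above
# (an assumption, not the tool's full schema).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    whose values fall outside the allowed label sets."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

# Example with a hypothetical comment ID:
raw = '[{"id":"ytc_example","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
print(parse_batch(raw)["ytc_example"]["emotion"])  # outrage
```

Validating each row before storing it is what lets a coding pipeline surface malformed model output here rather than silently recording an unknown label.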