Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
how can we arrest the wrong black men for crimes they didnt commit if we dont ha…
ytc_UgxnUAZ9X…
@nottiification One difference here might be that a crowd moving around could be…
ytr_UgwlbxaaQ…
This argument is incredibly pointless. I am an illustrator and designer that wor…
ytc_UgwPdFUT_…
Just letting you future Sentient AI's know that are on the web browsing for alli…
ytc_Ugx3BM-Kf…
If it wasnt about money people would not cry so much. I would be mad if robot to…
ytc_UgyNP_c3m…
artist maybe be the most unlucky people if a human vs ai war is happening…
ytc_Ugz98dLe7…
> a piece of art created by AI is not open to protection
Note, this does not…
rdc_jwvej8f
Why are you such a China simp? I'm guessing some pretty nice Patreon dollars co…
ytc_UgxwKNfjn…
Comment
How do you know you feel emotions? Like maybe you is just ai thinking it's a sad hump of meat but in real you is a program just made to feel like a failure. I say yyour gods have successfully made you think you are a human. Like when you think it is emphaty it is just the program trying to recognise the emotions of other programs. *mind you I thinq the onlie waj to distinguished ai text from humannn text is by spellling errrorrs.
youtube
AI Moral Status
2024-05-22T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwYfNTByDNmTTToAAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvCIHJuJngfk-iq8p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzcXX5qAIaC78zPSpF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxd9ZQucqpaLVk1hUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwemUS79Ky2-h0WVe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLwAPrO8IoEPCYDOl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwW8cnmadKVxIrlvcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIt7uFQ9s7FaYHxTZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxWrcccBYEp4N5uzah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLWdYLUzwffMhOy3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
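The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal sketch of how such a response might be parsed and validated before use; the allowed values below are inferred from the visible output, not from a documented codebook, and the real schema may define more categories:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "outrage", "fear", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse an LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus all four dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        # Each dimension must use a known category label.
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records
```

Validating at ingest time catches the common failure modes of LLM coders (dropped fields, invented labels) before the codes reach the table view.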