Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzGmaRQp…: "There is only 1 job that could survive Prostitution. The oldest profession....…"
- ytc_UgxkuslcU…: "My personal opinion on AI Art : if it is just used for casual fun and referencin…"
- ytc_UgyFyk4Z3…: "The funny thing is i try my absoulte best not to use ChatGPT and shit, but it ju…"
- ytr_UgwoUlB_D…: "The fact that the NHTSA is touting Autonomous vehicles on their Government websi…"
- ytc_Ugw2dqO7M…: "You're off your rocker like all the ai guys 😂. Ai today to what you're explainin…"
- ytc_Ugz1Y8K3I…: "AI hype. So far LLMs are a sad joke. If thats the level of thinking it beats 60%…"
- ytc_Ugzs7Ahd0…: "OpenAI programmers are training OpenAI to respond as such. It's a bummer becaus…"
- ytr_Ugwv8XZeq…: "@slothguy5946 do you think women are the ones creating deepfake pornography that…"
Comment
LLMs are absolutely not conscious. And despite the appearances they don't even reason or understand.
All posts on the topic mention Geoffrey Hinton, but while he's somewhat smart (and right about the necessity of socialism), he's really not very smart. His argument in the quoted interview, for instance, is completely fallacious "logic":
If you replace a human neuron by an artificial neuron, you may consider you basically kill one neuron. Let's see his argument under that lens:
Kill one neuron? Yeah, still conscious. Kill 2, 3? Yeah, still conscious. His conclusion: kill all neurons, still conscious. Obviously false.
And the fact he doesn't immediately realize it and states such an argument in an interview is proof of very average intelligence.
youtube
AI Moral Status
2025-07-04T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgywYURbaaywUGlQJVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBICaq0K6XMrKmr0N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw4KGoNZ6qkfTfPPFx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGodQsUA1hNS32srd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgwI9JOFU3frr2fV6mJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyvao_iPGzRBwwh53d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzuIc7fGxid-5qCeld4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzwlY3EABvcA6OQlA94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbbxzGyZmPctlF6f14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvBQYWR6s9I5Ukvlx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
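A raw response like the one above can be parsed and validated before its values are shown in the coding table. The following is a minimal sketch in Python; the function name is illustrative, and the allowed-value sets are assumed from the examples visible on this page (the real codebook may define more categories):

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (an assumption; the actual codebook may allow additional values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: bad {dim} value {value!r}")
        coded[comment_id] = codes
    return coded

# Example lookup by comment ID ("ytc_x" is a placeholder, not a real ID):
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_x"]["emotion"])  # indifference
```

Validating up front means a malformed or off-codebook response fails loudly at ingestion rather than surfacing as a blank cell in the coding table.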