Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I'm shit at drawing after two years AND I STILL DON'T USE AI ART SO WHY ARE YOU …" (ytc_Ugxxu4flZ…)
- "So many stupid things… You just said that AI has a role; it’s going to fulfill i…" (ytc_Ugwokn6fa…)
- "I have a feeling that you might be making one of these “I put my ai self in my o…" (ytc_Ugz9cjtlW…)
- "I've got nothing against AI art just existing, but calling himself an artist at …" (ytc_Ugxto49ub…)
- "AI does not mean we're remotely at the point we're capable of building a Dungam,…" (ytr_Ugzf-re0y…)
- "Yes because giving a robot a live firearm has never ended badly in any scenario …" (ytc_Ugw8iwZ43…)
- "Yes that's my thought , I'm with Him take my money 💰 all of it for that AI Angel…" (ytr_Ugzf_PeDo…)
- "TBH, I see two jewish guys discussing eschatology, but I'll play along. The very…" (ytc_UgxD2xZX8…)
Comment
Isn't it common consensus amongst scientists that the feelings that make us humans *alive* are caused by chemicals in the brain? How would the AI even *be* self aware without a complex proccesing device like our brain?
Surely we'd give an AI a brain to actually be able to be alive if we wanted one to be so; and then that solves the question.
Source: youtube · AI Moral Status · 2023-08-21T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
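A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table; the allowed values are inferred only from the sample output on this page, so the real codebook may include additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample LLM output above.
# ASSUMPTION: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "resignation"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Hypothetical single-record batch, in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_coded_batch(raw)[0]["policy"])  # regulate
```

Rejecting a whole batch on the first bad value keeps malformed model output from silently entering the coded dataset; a gentler variant could instead collect per-record errors and re-prompt only the failing IDs.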