Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- The Government needs to regulate AI companies like Cyberdyne Systems and Omni Co… (ytc_Ugxt8wWJH…)
- "AI CEO says his product can change the market so that investors pump their mone… (ytc_UgzVbRXiv…)
- If AI is creating so many problems, then why did researchers develop AI… (ytc_UgwR9XXfz…)
- This doesn’t sound bad at all. This is just hybrid learning. 2 hours with an AI … (ytc_Ugz2htWmw…)
- Ai art is the equivalent of Chef Mikey. Literally nothing but the worst most slo… (ytc_UgzPCtP-d…)
- Satirising ghetto behaviour has always been a thing because it's funny AF. Stop… (ytc_Ugzqwde2T…)
- I don’t get it. It’s a tool. Like Photoshop. It does nothing on its own. A perso… (ytc_Ugy2Y5EFt…)
- Flailing and scrambling to hold back the inevitable. AI will make our country w… (ytc_UgxpoZS0K…)
Comment
Very interesting conversation. I must confess while I'm largely ignorant, I don't understand the concern about whether LLMs are or might ever be conscious. As I understand it, we don't understand consciousness at all beyond "I think therefore I am". The reason I assume that other biological entities like Profs Green and Bostrom are conscious in roughly the same way as I am, is because assuming otherwise would seem to be extremely unhelpful. But anything that has consciousness in this way is biological.
Unless you're a panpsychist (I believe they're called?), why would anyone assume that an LLM is conscious, when nobody understands what consciousness is, except that it's a thing that I have and assume that other biological beings possess? I mean, I'm not saying it's wrong, of course, I'm saying it seems an equivalent to worrying that Sonic the Hedgehog is conscious, because I can interact with him, he can answer questions, he can 'suffer', and so on.
youtube · AI Moral Status · 2026-04-20T08:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgyZkkFRo-IaJNrWMz54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmCpOeke9qBaQBTZx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTP8bt4zAOYr8xEBl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjE83ElfpiWF196HN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwBAZ5jF6ci-ATnmUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwPLn5yg-c2i2OIn9h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxtkX1kuWaHUyKXrgl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPDFuyrW2HCOvYWvl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1MbyL1mWKI_WaRYp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyAfCeA1Cp_R_Tgnox4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}]