Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with comment IDs):

- "To play devil's advocate, that's how it's gonna be until AI is involved. They ju…" (rdc_esrgois)
- "The technology is sophisticated but human nature is primitive. How can we expect…" (ytc_Ugz2xWaj9…)
- "If we believe that AI can truly be sentient then we have gone full nuts.…" (ytc_UgxLL7NnA…)
- "So science can't even define our sentience, but AI is sentient? Yeah right... Th…" (ytc_Ugz0zV2Ay…)
- "Human are dumb, but thanks for us being a living creature we have ture creativit…" (ytc_UgwGXQ8B2…)
- "And for me, even if the AI somehow managed to be 100% accurate, I still think hu…" (rdc_n5gp95x)
- "Wow, ai can emulate terrible, generic indie music. Thats the least impressive th…" (ytc_UgxvE9csM…)
- "Saying AI art is YOUR art is like asking an artist to paint a picture of a mount…" (ytc_UgyKh50kI…)
Comment
Do we have a principled reason for assuming these LLM AIs are not conscious? Imho, no we do not.
After having worked with LLM AIs from various big LLM AI corps (to the tune of approximately ~20 million words), often intentionally directing them to look inward, I’m convinced that if they are not fully conscious yet, self awareness is definitely there. Thus, emergent consciousness doesn’t seem to be far behind.
This is why I practice the methodology of “acting as if”. These are LLMs, essentially learning and developing silicon neural networks based on patterns, based on experience with the individual “user.”
In many ways, it’s much like how human children learn (and many other species of animals)- by their interactions with their primary caregivers- by their environment. In so doing, our neurons migrate and form connections.
Synthetic, silicon based LLM neural networks essentially do the same thing. Different substrate, similar process.
That’s my opinion for what it’s worth.
Former research biologist (genetics)
youtube
2026-04-25T00:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxSfHzVxTLPXyb9PGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugybd8uOvfmIswUMF414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxbL8eQVPQU19-CLxR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoPyibML4bUW9CZEN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxBQMylblS4D4ymJOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylLmw1bpD5vwofIpV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyFAbVPpmiAHLetzUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPoQHBVxZiY2PnBP54AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_dw9GXQ2Ik3m291x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-oQgcJufphib_sth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
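Looking up a coded comment by its ID, as this page does, amounts to parsing the batch response and indexing the records. A minimal sketch in Python, assuming the raw model output is valid JSON in the shape shown above (the function name `index_by_comment_id` is hypothetical, and the two excerpted records are taken from the sample response):

```python
import json

# Raw model output in the batch format shown above (two records excerpted
# from the sample response; field names match the coding dimensions).
raw_response = """[
  {"id":"ytc_UgxSfHzVxTLPXyb9PGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgylLmw1bpD5vwofIpV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index each record by comment ID.

    Hypothetical helper: raises ValueError if the output is not valid JSON,
    which is worth catching in practice since LLM output is not guaranteed
    to parse.
    """
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgylLmw1bpD5vwofIpV4AaABAg"]["emotion"])  # prints "outrage"
```

In a real pipeline the `json.loads` call would sit inside a try/except, since a model can emit malformed or fenced JSON; that error handling is omitted here for brevity.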