Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
AI art shouldn't be called art, art is a creative work made by a human mind. AI …
ytc_Ugw5tvSyo…
Well well well you said she is a android well she is a robot and she is a girl a…
ytc_Ugwx68EX4…
The scary think is not the robot but what the idea they Will docktrin in to the …
ytc_UgwHg-_kN…
We already lost and I think we're already beyond saving. The algorithms already …
ytc_UgzYMX8AI…
To the producers of these reports: please stop saying that AI or robots or any o…
ytc_Ugwnf6EbO…
I am also a musician and I HATEEE ai generated music because its 90% pop slop or…
ytr_Ugy0dyTZL…
I wonder why the people dont question the very alphabet of 26 letters.
69 chatbo…
ytc_UgxRKduYP…
Pretty sure we've been warned about this for a long time. If you don't want deep…
ytc_Ugx5z3ftZ…
Comment
I am convinced that we will never get to the bottom of consciousness and be able to properly scientifically understand it. It seems there just is a sort of subjectivity-objectivity barrier. Whatever you try to do, you simply cannot explain subjectivity objectively. You can point to the areas of the brain which light up or whatever, and if it's the same area in two humans you can assume it's the same experience, but how the hell do you know it? You just don't, you can't get into another subjective experience. Is your red the same as my red, is the famous question, and we cannot really answer it. We only know our own consciousness, we are not even sure the next person is really conscious, it's all just assumption, it quacks like a duck - I look the same as that person so they must have the same experience. So no, we will never be able to tell if AI has consciousness. It will always be the case of 'quacks like a duck'.
youtube
AI Moral Status
2023-08-21T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxhScuUOtRFTabR0C14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzY8StKi1iYEHSuEgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyVUkr6ZObxsAJ2ihh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwu0SKI6PvNLxswvdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxP2zJ3Lp0FMzXQw14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz5M3Li_xQNfbuYT0B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0M9aUKL_PY_lQmtp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwZ7-7g4UwpKu3h1IF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy6M12eZ2hA9Aj4yB14AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz2a00CIqW6yOxiLPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
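The look-up-by-ID workflow above can be sketched in code: parse one raw batch response and index its rows by comment ID, checking that every row carries the four coding dimensions shown in the result table. This is a minimal sketch under assumptions; `index_batch` and `REQUIRED` are hypothetical names, and only the JSON batch shape and the dimension names (responsibility, reasoning, policy, emotion) are taken from the source.

```python
import json

# The four coding dimensions plus the comment ID, as seen in the batch output.
REQUIRED = ("id", "responsibility", "reasoning", "policy", "emotion")

def index_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = [key for key in REQUIRED if key not in row]
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        index[row["id"]] = row
    return index

# Example with one row in the same shape as the batch above (hypothetical ID).
batch = ('[{"id":"ytc_example","responsibility":"none",'
         '"reasoning":"unclear","policy":"unclear","emotion":"resignation"}]')
coded = index_batch(batch)
```

With the index in hand, a lookup like `coded["ytc_example"]["emotion"]` returns the coded value for that dimension, which is the operation the "Look up by comment ID" view performs.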