Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (previews truncated):

- "My boss must me an LLM. Cos every time I ask him something he's got an answer re…" (`ytc_UgwkcRwQi…`)
- "U don't control the AI. U don't control the robots. U don't own the corporations…" (`ytc_UgyrSkwGu…`)
- "Life is about gift, and because of its finite time is what gives it such value. …" (`ytc_Ugx2PQqHh…`)
- "Hello there, I appreciate your perspective on this passionate discussion about A…" (`ytc_Ugyak88-4…`)
- "He has AI classes. Or at least he did. I took them about a decade ago.…" (`ytr_UgwrWbcNd…`)
- "If all this shit isn't sorted. In medium run nuclearization follows. In end I t…" (`rdc_dl07dge`)
- "Editing normal photos is not art, you are right / ai users are not artists, they a…" (`ytr_UgyF7dEvr…`)
- "Here's something people don't talk about. / The military has the pinnacle of AI t…" (`ytc_UgwOKwTR3…`)
Comment
I believe everything has a conscious. A pillow has one, but since it has no external senses or "brain" it has the lowest kind of conscious. a robot has an okay level of consciousness. it (probably) has external senses and has a "brain" but not as good as ours. Humans have the highest level of consciousness as we know what mirror does unlike most animals and know about our surroundings, rights and wrongs, and about ourselves. If robots ever achieve our level of consciousness and hurt someone, they have to be punished even if they don't feel pain. They know what they did. unless they have the mind of a toddler.
Platform: youtube
Video: AI Moral Status
Posted: 2017-03-19T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxOz32Mqir4mx9Q7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxHnrQw-5aECr2Zvt54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugwr60lU1uhM2pDe5bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxiDEONyjhXpZiAX214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzSJnuzoFnHlr7OVHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UghQiW0rduVSOXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugh6hVu_9ssjf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugh91X5m-k7M6XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UghfBFxixrIHDXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgjPKZV0GM_N-HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
```
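The lookup-by-comment-ID step amounts to parsing the raw response as a JSON array and matching on the `id` field. A minimal Python sketch, using two entries copied from the response above; `lookup_coding` is a hypothetical helper name, not part of the tool:

```python
import json

# Two codings copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgxOz32Mqir4mx9Q7ep4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghQiW0rduVSOXgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

def lookup_coding(raw, comment_id):
    """Return the coding dict for comment_id, or None if it is absent."""
    return next((c for c in json.loads(raw) if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UghQiW0rduVSOXgCoAEC")
print(coding["emotion"])  # prints "fear"
```

In practice the raw response would be the full array shown above; any ID not present simply returns `None` rather than raising.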