Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think that AI takes over for to many human needed jobs. And it will get worse.…" (ytc_Ugx8Kbgiw…)
- "Dear Ms Robot, On behalf of a humanity, please annihilate each and every one if …" (ytc_UgwUrWLpC…)
- "Yeah and it means nothing because AI isn't replacing 30k people, not today, mayb…" (ytr_UgzHc8rXQ…)
- "AI is already helping diagnose cancer, before other traditional means, by analyz…" (ytr_Ugw8V-ptH…)
- "Every single mainstream steamer I see have been pro gen AI cuz their little rat …" (ytc_UgxCFKwd6…)
- "Keep waste your time making all these while I just use ai and make it 5 minutes…" (ytc_UgwB0nfug…)
- "A person paying attention driving could have swerved and missed the pedestrian, …" (ytc_UgyTORj-s…)
- "As a disabled artist, I hate seeing disabled people weaponized by people who sup…" (ytc_Ugx1lhaZw…)
Comment
Based on your dinosaur analogy how can we claim we ourselves are even conscious, or do we infact merely try to emulate it to get what we want. afterall we are trained similarly to a puppy by its mother to behave to be rewarded with our needs and wants. (effectively programming, the neurons inside our brains) when i have a thought process i am drawing from knowledge i have learned and an ai would also only be able to draw from its sources. when i am thirsty i process where/how/what i will drink... does that make me conscious?
Which is a word we have put in place much like the concept of time to describe something we claim to own. if an ai was implanted into a human brain and functioned just as we do could you still argue its lack of sentience?
Source: youtube | AI Moral Status | 2023-08-21T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxhScuUOtRFTabR0C14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzY8StKi1iYEHSuEgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyVUkr6ZObxsAJ2ihh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwu0SKI6PvNLxswvdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxP2zJ3Lp0FMzXQw14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz5M3Li_xQNfbuYT0B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0M9aUKL_PY_lQmtp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwZ7-7g4UwpKu3h1IF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy6M12eZ2hA9Aj4yB14AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz2a00CIqW6yOxiLPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
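The raw response above is a JSON array with one coding object per comment. A minimal sketch of the "look up by comment ID" step, assuming the batch response is parsed and indexed into a dictionary keyed by `id` (the variable names and the two sample rows here are illustrative, not the tool's actual code):

```python
import json

# Hypothetical raw batch response from the coder LLM: a JSON array with one
# coding object per comment, mirroring the dump shown above.
raw_response = """
[
  {"id": "ytc_UgxhScuUOtRFTabR0C14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzY8StKi1iYEHSuEgJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index codings by comment ID so any single comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzY8StKi1iYEHSuEgJ4AaABAg"]
print(coding["emotion"])  # outrage
```

In practice the full comment IDs (not the truncated display forms like `ytc_Ugx8Kbgiw…`) would be the dictionary keys, since the truncated versions are only a UI abbreviation.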