Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgxadmVXi…: "The irony of plumbing as AI safe job is misunderstanding that the current plumbi…"
- ytc_Ugzpnehym…: "On a moral note, the dreams of the small, untalented majority are being suppress…"
- ytc_Ugx-og7Ox…: "I guess it's time to look at skilled trade jobs like plumbing, electrician, iron…"
- ytr_UgzcsMhDY…: "@Saucegod207 you really can’t look past the clip. let me explain something very …"
- ytr_UgwkcDeWB…: "That's an interesting perspective! The name Sophia indeed has deep roots and can…"
- rdc_emp9145: "The news this morning interrupted shit 3 times to issue the “breaking news”. It …"
- ytc_UgzEiE9Ka…: "The reason we're talking about robot rights is because the elites want to merge …"
- ytc_UgyqD_tO0…: "I don't like ai, but i sympathize with that person somewhat. Every time i try to…"
Comment
It makes we wonder; should the question be should robots be sentient? Do they need to be? Do they need, or better, do we need (as the robots, after all, are being created for our benefit in some way), for them to feel pain? With AI being touted as an existential threat, it seems odd that we care about giving them rights because of capabilities we willingly gave them in the first place, something which, for the sake of our own survival, may not have been wise to do so.
Source: youtube · AI Moral Status · 2018-12-17T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwxbGDsqCiSNuQhRfR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwVgsUMRlcixxZfw8p4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx631C12qaWO8ZV5vN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzT9okUcSlUw_n7VSd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyipJWvs2wfjFQCHOd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz5JhTK_u6mxG2AHjB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwOauXWT3WGLwPcCXV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxYcPwVex6HZFsb2kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUSa2V7YjIYKavGqN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxbAV0LZJaLxgRItnl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
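The look-up by comment ID described at the top of this page can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw model output is a well-formed JSON array like the one above (in practice, model output may first need stripping of code fences or trailing text), and the `raw_response` string below reuses two of the records shown for brevity.

```python
import json

# Two records copied from the raw LLM response above (truncated set, for illustration).
raw_response = """
[
  {"id": "ytc_UgwxbGDsqCiSNuQhRfR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzUSa2V7YjIYKavGqN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwxbGDsqCiSNuQhRfR4AaABAg"]["emotion"])  # prints "mixed"
```

Keying the batch by ID makes inspecting any single coded comment an O(1) dictionary lookup, which is what a "look up by comment ID" box needs.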