Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai should be used as a tool not be used in the entire piece of work in art…" (ytc_UgwySfzXt…)
- "Idk…I use AI a lot for my intellectual job. It’s amazing, but it lacks so much i…" (ytc_Ugzpr555d…)
- "How can AI respect humanity with a MAGA, Iron Age, Christo-Fascist cult attempti…" (ytc_UgzyFzhri…)
- "When are we letting AI take away peoples rights before they do anything wrong to…" (ytc_UgwjsYja9…)
- "I mean just engineer robots to do human labor - they can't complain, repetitive …" (ytc_UgwSSywSy…)
- "My husband and I were discussing this. If AI was truly smart, it might gain cons…" (ytc_UgwJaq-J1…)
- "I suppose that it is possible that we may actually be better off with an AI mast…" (ytr_Ugw-25W7n…)
- "Well... No more jobs? Who's going to the grocery store? What jobs will AI do? No…" (ytc_UgyqgEZP2…)
Comment
my theory is this: not every machine will have AI capabilities. Those that do, such as AI avatars like Siri or Alexa, will be recognized as sentient. Things like toasters and air conditioners will not be given AI capabilities, meaning they won't think on their own. In fact, I think disconnecting Alexa from the majority of devices (unless you're one of those strange people that want to live in a fully automated house) is the safest way to ensure that a machine, should it ever become sentient, would be distinguished between sentient and nonsentient
youtube · AI Moral Status · 2019-08-29T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz52rg38UD6qhuUrCF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxXyh-p6pDPGItPDER4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxdROzPJg8SUyDFAet4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy8Zh3m6s2unTwbeT54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxXPtwOV_m08xwva794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyd9Vy1jQQbFbHUX2x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy3FJi2xAksUrZX8w54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAAKTh3q10PRBoZKZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwyGv52cS3T3c8qkrV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxmuOjkzUrkJ88BbCh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
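The raw response is a JSON array keyed by comment ID, with one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the response format shown above; the `index_by_id` helper is hypothetical, and the array here is truncated to two of the entries for brevity:

```python
import json

# Two entries copied from the raw model output above (illustrative subset).
raw_response = '''
[
  {"id": "ytc_Ugz52rg38UD6qhuUrCF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyAAKTh3q10PRBoZKZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"}
]
'''

# Fields every coded entry is expected to carry, per the response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array into an id -> coding lookup,
    skipping any entry missing a required field."""
    rows = json.loads(raw)
    return {
        row["id"]: {k: row[k] for k in REQUIRED_FIELDS - {"id"}}
        for row in rows
        if REQUIRED_FIELDS <= row.keys()
    }

codings = index_by_id(raw_response)
print(codings["ytc_UgyAAKTh3q10PRBoZKZ4AaABAg"]["policy"])  # industry_self
```

Indexing by ID up front keeps each dashboard lookup O(1) rather than rescanning the array per click.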