Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "All the answers to how we embrace AI and robots has already been explored in sci…" (ytc_UgzmFK8x_…)
- "@MarkCrawford-c5r Not just a camera, but also a keyboard as well. When photogra…" (ytr_Ugx-kWqwz…)
- "What IF AI is the real world Skynet (Terminator 2) someday in the future? AI co…" (ytc_Ugw79J9uM…)
- "I know exactly what's coming that's what's sad. One day the good people of socie…" (ytc_Ugx9BXCCZ…)
- "Zuckerberg is the shareholder with the highest stake in Meta. This means that he…" (ytr_UgyIZDdfZ…)
- "That was an exceptional interview. This book sd be a must read by govt officials…" (ytc_Ugwq9IZ7d…)
- "@JohnPh1023 We didn't need this crap 10 years or more ago and that didn't requir…" (ytr_UgxdtWPt2…)
- "We originally learned it from them and it was further engrained into our culture…" (rdc_dv07yvc)
Comment

> No, never, not ever, in fact should a robot of mine start showing consciousness ill shoot it in the processor and go get another one. They're artificial, not alive, and since we can't define consciousness for ourselves, then we can't for a machine, and we should never try to build something that will ultimately surpass us. Humans became the dominant species by killing every other animal and hominid that looked at us funny and there's no reason to deliberately build something that can do that to us.

youtube · AI Moral Status · 2022-04-29T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzX9ksonajxCVIgIMp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzkbfBM9OKYYe9LxDF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxrex7upSRH4rXegYJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugxjrqoc6lfhc-GHTz14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzQaxjMdw3DwHAmVCN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwvvX2H0fcrb82a7Sx4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyOnSyKdjQUZ04OZIx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_Ugz15PCUn7LidJCaI_F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx0fNPuglmAB4K2WiZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwfoo78Xt1MKetbW5t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
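The lookup-by-comment-ID feature shown above amounts to parsing the raw batch response and filtering on the `id` field. A minimal sketch in Python, assuming the JSON shape from the sample output (the `lookup_by_id` helper and the inline `raw_response` string are illustrative, not part of the tool):

```python
import json

# Two rows copied from the sample batch response above; the real tool would
# read the full raw LLM output for a batch instead of an inline string.
raw_response = '''[
  {"id": "ytc_UgzkbfBM9OKYYe9LxDF4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzX9ksonajxCVIgIMp4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]'''

def lookup_by_id(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding row for one comment.

    Returns the matching dict, or None if the ID is not in this batch.
    """
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

code = lookup_by_id(raw_response, "ytc_UgzkbfBM9OKYYe9LxDF4AaABAg")
print(code["policy"])  # → ban
```

This mirrors what the "Coding Result" table displays: one dict per comment, with the four coded dimensions plus the comment ID as the join key back to the source comment.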