Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The difference between animals and robots, eventually AI and technology will be so advanced self aware robots could potentially come to the conclusion to create their own rights while forcibly taking away human's rights. Because humans have the potential to be no different than animals, and like you said animals have no right. Advanced AI will believe their logic is undeniable.
Source: youtube · Video: AI Moral Status · 2017-02-24T23:1… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Uggd08XSz0JQIXgCoAEC.8PN6h_D467g8PNtlgSHTMT","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugj_ypQQgCMlY3gCoAEC.8PN6dfNsBaM8PNWO9rxJLl","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugj_ypQQgCMlY3gCoAEC.8PN6dfNsBaM8POA17mzqMD","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UghZWqsVkxAASHgCoAEC.8PN6PjmEl_08PNN7YNqGfA","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugjtd1c-ODg4hHgCoAEC.8PN6LkvQaaq8PNVzY69RLw","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugj_RptSNZh00XgCoAEC.8PN5jps9XG_8PNHYi5uC_S","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugj_RptSNZh00XgCoAEC.8PN5jps9XG_8PNYIKWrMbP","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugj_RptSNZh00XgCoAEC.8PN5jps9XG_8PO0cXZSZ6y","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugi35VrFcerAV3gCoAEC.8PN46RI2SQY8PODM9YuFDx","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgjnJl7gI6RY5ngCoAEC.8PMweD_JVe88R_b6z9Dscf","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
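A raw batch response like the one above can be checked against the coding schema before the records are used. Below is a minimal Python sketch; the dimension names and values come from the responses shown here, but the full allowed-value sets (and the `validate_batch` helper itself) are assumptions for illustration, not the project's actual validation code.

```python
import json

# Coding dimensions with the values observed in responses above.
# NOTE: these allowed-value sets are an assumption for illustration;
# the real codebook may define more (or different) values.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check every coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing id: %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, value))
    return records

# Example with a single hypothetical record in the same shape.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_batch(raw)
print(len(coded))  # 1
```

Rejecting the whole batch on any out-of-schema value keeps malformed model output from silently entering the coded dataset; a softer variant could instead flag bad records for manual re-coding.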