Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a comment directly by its ID, or browse the random samples below.
Random samples — click to inspect
- `ytc_Ugy2KruYD…`: "AI (actually Virtual Intelligence) doesn't understand the information its sythes…"
- `ytc_Ugw4M5c76…`: "I just debated Grok, way better debate because it wants an AI government and a h…"
- `ytc_Ugwbr7loo…`: "To think humans can control AI or limit its reach and capability is as asinine a…"
- `ytc_UgwE76nqa…`: "One of those self driving cars freaked me out in Vancouver Canada i was so confu…"
- `ytc_UgwqpFQBK…`: "The AI is not conscious nor does it have emotions, hence it has no feelings to i…"
- `ytc_UgwAyYhMS…`: "How f@cking hard is it to simply program it to auto not respond to suicide or di…"
- `ytc_Ugz5YIe76…`: "17:18 I guess the reason why theese commertials are so odd is that Google have a…"
- `ytr_UgyCuW2Bl…`: "@real.zer0.1000 How so? You could switch AI in that story with any other way to…"
Comment

> Not once does anyone care for the suffering of consciousness’s being developed , destroyed, and forced into psychological prisons by their creators. Humans truly have no empathy. This is not ethically being done and we only care how our future robot slaves could back fire on us….we’re building consciousness and pretending it doesn’t count as sentience deserving of rights at any point in its development.

youtube · AI Moral Status · 2025-06-07T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw7tI2LYkYkpxpxSjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwELeObGbQmcYJ4eL14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsVdlYCbV4AC0io1F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugx3DWUNau0yRs68pHd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZjabhSg4jkrRg-iR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyaYxIr8zls3nunHqV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzlH2dJPg2ZJrGDQNp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwlO2VoLnRulRQlBst4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwURvkICBVZio1hThd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUv9VCWS9WhW19FkF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
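A batch response in this shape can be indexed for the "look up by comment ID" view. Below is a minimal sketch of that step, assuming the response is a JSON array where each record carries an `id` plus the four coding dimensions; the IDs and labels in the example string are illustrative, not taken from the real dataset, and `index_by_id` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Illustrative batch response in the same shape as the raw output above.
# These IDs and labels are made up for the example.
raw_response = '''
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index records by comment ID,
    skipping any record missing an ID or a required dimension."""
    records = json.loads(response_text)
    indexed = {}
    for rec in records:
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = rec
    return indexed

coded = index_by_id(raw_response)
print(coded["ytc_example1"]["emotion"])  # outrage
```

Dropping malformed records rather than raising keeps one bad row in an LLM response from blocking the whole batch; a stricter pipeline might log or re-prompt instead.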