Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing random samples.
- `ytr_UgwmZk9-L…`: "This is one of the ways Stephen Hawking thinks will end humanity.... ai going ro…"
- `ytc_UgwuT-L0M…`: "I use AI to help write templates for emergency action plans and post orders...b…"
- `ytc_UgzWa3WQS…`: "Im from Tennessee...a lot of the people here dont even know what Ai is, they cou…"
- `ytc_Ugyy6KX6t…`: "what's going to happen? the economy could decouple from meeting human demand to …"
- `rdc_mkbceh6`: "I don't understand, fake articles? Isn't AI looking at main news sources and no…"
- `ytr_Ugz2F7aRN…`: "Incorrect. AI requires mass theft to create anything. The tool is only made po…"
- `ytc_UgxYtZ2Ir…`: "I think youre missing one point. People are grtting rich because of customers an…"
- `ytc_Ugx3RRe6N…`: "Sophia is more human than a robot. Fear of being obsolete and not being useful.…"
Comment

> Yes. Conscious entities should have rights. Period. If we don’t want to deal with this, we simply shouldn’t create AI systems that can have consciousness, or can feel pain. This is our responsibility.

youtube · AI Moral Status · 2024-12-06T20:3… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyTSlYXF8YeGLvnZ0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwHDb22w4kFqt-DEKx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGr2ym5qwOh9NzdJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYREqoM9SiKLvnXqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgyYJHNV9OtSwWgwNGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMNOnkp9AsF0pYwQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwURGZntvLUO7L0UJ94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwReRzx6ODQt-suNYN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgzLLRZpFIoeW4kdz6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzpVdhSBxwNYGJtSPR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
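A raw batch response like the one above can be parsed and sanity-checked before its records are stored. A minimal sketch in Python, assuming only the value sets visible in the sample output (the real codebook may define additional categories, and `parse_batch` is a hypothetical helper, not part of the tool shown here):

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# Assumption: the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_demo","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded[0]["policy"])  # → regulate
```

Rejecting a whole batch on the first bad value is deliberate: a malformed or off-codebook response is easier to re-request than to patch record by record.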