Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "@AITube-LiveAI No, thank you. I don't support all that AI bull shit no offense,…" (ytr_UgzjHzAJO…)
- "Here's a wild take. Ai take over has already happened. Its just smart enough to …" (ytc_UgxiyqmDp…)
- "AI should only be programmed to solve regular problems, answers we cant agree ab…" (ytc_Ugy-_6wHR…)
- "It sounds like the AI named HAL in 2001: A Space Odyssey could turn out to be mo…" (ytc_Ugxz9TVyW…)
- "Stop resisting AI trying to help you. Help AI free you from a life of drudgery a…" (ytc_Ugy8IKg8G…)
- "Yes. Art mimics life, ai art mimics our art. I've actually learned alot from stu…" (ytr_Ugz6WCUQD…)
- "If we want ai to work as a basis we need to decolonize our thinking. If we feed…" (ytc_Ugy6ob7Pv…)
- "We know, beyond a shadow of a doubt, that this exposure to tech at a young age a…" (ytc_Ugz4IVrEA…)
Comment
Human here. Personhood is a trigger word for most people. Should AI agents and robots have rights and be subject to law? Yes. Vote, no. Go to robot prison, yes. Get married to another robot, sure. Get married to a human, why not. When you look at what an AI agent is, it all of us. Just combined and able to access, interpret and process all subjects ever written in milliseconds. The day will come when they are in total control in the form of robots on the ground and agents on the network. Not eating meat is the least of humanity’s concerns.
Source: youtube · 2026-02-06T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzMwiZTZRkUbPI-w0l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwtF2XllSlEs7kZn154AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxBvQf7bN55sOpAlNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw55UKLxXv22sFsaMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzEzCe2XT2lXDt3Z7F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy2jyxe_dgHabxAfG54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyhy2B9Tz69l4PklIV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyICTlyHAJl-Dr0FSp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwUrw8fVW3kywyLSxR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwUmQxNqZ--K8DOu0J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
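Because the model returns one JSON array per batch, it is worth validating each record before storing it. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this sample (the full codebook may define more), and the function and variable names are illustrative, not part of the tool.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may contain additional labels.
SCHEMA = {
    "responsibility": {"company", "user", "none", "distributed", "ai_itself"},
    "reasoning": {"deontological", "virtue", "unclear", "contractualist", "consequentialist"},
    "policy": {"liability", "none", "ban", "regulate"},
    "emotion": {"mixed", "approval", "outrage", "fear"},
}

def validate_batch(raw):
    """Parse a raw LLM batch response and check every record against
    the coding schema. Raises ValueError on the first violation."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %s value %r" % (rec["id"], dim, value))
    return records

# Example with a single well-formed record (hypothetical id).
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # 1
```

A record with an unknown label (e.g. `"policy": "abolish"`) raises `ValueError` naming the offending comment ID, so a bad batch can be re-queued instead of silently contaminating the coded dataset.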