Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Most people are just absolutely clueless. There is a reason only podcasters are…" (ytr_Ugwi2lVW_…)
- "I was very hooked by this insider look into these prominent AI questions- I thin…" (ytc_UgyzfD9EZ…)
- "Same, this always happens when Im writing my essays in writable. It corrects my …" (ytr_Ugygn4toM…)
- "There is no retraining people if Ai takes over everything automated. Computers c…" (ytc_UgyYNpsH8…)
- "Yeah cause people build cars, oh wait thats robot too and not like there isn't b…" (ytr_Ugziv8ama…)
- "Absurd, alexa can't even understand context or word variants. Ai will completely…" (ytc_Ugzop1Pb6…)
- "Robots take blue collar jobs / AI takes white collar jobs / Not enough humans have…" (ytc_UgxlKiYWS…)
- "For me, i use ai if im : 1. Out of ideas 2. Wanna have fun in reading…" (ytc_Ugwibk0KW…)
Comment
The grant of personhood is not a moral issue. The ultimate taking of personhood is end of existence, death. A Commanche raiding party has no compunction about such taking. It's their survival. Typically personhood rights must be taken - think Magna Carta. It is only "given" when the givers aren't taking any sort of real hit. A starving pioneer family doesn't risk their continued existence by sharing. So, the key is abundance because with abundance, the giving of personhood has little marginal costs to society. But this will be irrelevant by then because AI will have the power to take it, and moral giving with simply be the stamp humans place on it after the inevitability so it feels morally good (not that this will fool AI).
youtube
2026-02-09T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwk2ARgGNRx2ZL24ll4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzjwH9BO30Qjsc7a2t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzp5eCQr9O2NG1Yyb14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxWYDfJD4T9TkLVFwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwSHDpP8HO0iZ5LTnR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyLr5ANlfO5A6nZ5_54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyx6U0jWg2wqipK6ZB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCWIzTnBz-QgVmwmN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcG3_Mk-npxaXS21Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzFjri1tdo0RZ3JAt14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
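A minimal sketch of how a raw batch response like the one above can be parsed and matched back to a comment ID, using only the Python standard library. The field names and code values are taken from the JSON shown; the closed value sets and the helper name `parse_batch` are assumptions, not the project's actual codebook or API.

```python
import json

# Allowed values per coding dimension, inferred from the response above
# (assumed closed sets; extend if the real codebook differs).
DIMENSIONS = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError on a missing ID or an out-of-vocabulary code.
    """
    records = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        records[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return records

# Example: look up one coded comment by its ID (record copied from the
# batch above).
raw = '''[
  {"id": "ytc_UgzcG3_Mk-npxaXS21Z4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"}
]'''
coded = parse_batch(raw)
print(coded["ytc_UgzcG3_Mk-npxaXS21Z4AaABAg"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the most common batch-coding failure: the model inventing a label outside the codebook, which would otherwise silently skew downstream counts.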