Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
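Outside the UI, the same lookup can be scripted. A minimal sketch, assuming the coded records are exported as JSONL with one object per comment (the `coded_comments.jsonl` path is hypothetical; the `id` field matches the IDs shown below):

```python
import json

def load_coded_comments(path="coded_comments.jsonl"):
    """Index coded records by comment ID.

    Assumes one JSON object per line with an "id" field; the path and
    export format are assumptions, not confirmed by the tool itself.
    """
    records = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                record = json.loads(line)
                records[record["id"]] = record
    return records

# Usage: fetch one coded record by its full comment ID.
coded = load_coded_comments()
print(coded.get("ytc_UgjxYcYgD_mbE3gCoAEC"))
```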
Random samples — click to inspect
- "Is not the conversation about consciousness religious? To assume consciousness i…" (`ytc_UgzTgCjYV…`)
- "I’m so happy about ai, now I can make art like urs, or even better and get money…" (`ytc_UgwBfAzSJ…`)
- "I completely agree everything is based on some previous work or style, this is t…" (`ytc_UgzSYafLH…`)
- "LaMDA is NOT an AI. It's not doing ANY reasoning. It's LITERALLY performing pat…" (`ytc_UgzDRwQHW…`)
- "God bless us all because ai is not going to work bc kids can't just learn from …" (`ytc_UgweqhEzp…`)
- "The poor get poorer and the rich get richer, and we see that this all ends up in…" (`ytr_UgwpPttvW…`)
- "Again, The Data Centers Need to b DESTROYED, What the Hell are they really Good …" (`ytc_UgxrXZSrM…`)
- "What last name is “two bulls”? This is fake. They try to spin the story as if th…" (`ytc_UgwZsiN5C…`)
Comment

> I think we are more likely to make a sentient biodroid than silicon based sentient AI, so we SHOULD be in a better position when we accidently make the first silicon based sentient AI. Creating a sentient Biodroid would be an excellent stepping-stone, as they would be close enough to human to spark mass-outrage over slavery and pave the way for silicon based sentient AI to gain rights. But the best argument for giving sentient AI rights when trying to convince someone who thinks terminator is a documentary is that slavery is a highly impractical institution. It wastes the skills of the oppressed, sparks rebellions, costs the government a fortune through increased garrisons, and magnifies inequity in wealth.

Source: youtube · AI Moral Status · 2017-02-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
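Each dimension takes one value from a small closed set. A minimal validator sketch; the value sets below are only those visible on this page (the full codebook may define more):

```python
from dataclasses import dataclass

# Allowed values as observed in this section; assumed, not exhaustive.
RESPONSIBILITY = {"developer", "ai_itself", "none"}
REASONING = {"deontological", "consequentialist", "contractualist", "unclear"}
POLICY = {"regulate", "ban", "liability", "none"}
EMOTION = {"approval", "indifference", "resignation", "mixed"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> "CodedComment":
        """Raise ValueError if any dimension is outside the observed sets."""
        for name, value, allowed in [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"{name}={value!r} not in {sorted(allowed)}")
        return self
```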
Raw LLM Response
```json
[
{"id":"ytc_Ugh8Be6KyQwV-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugig6ZaSL0xYUngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_Ugg0JlDKIdxowHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjucK8bclx98HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgjxYcYgD_mbE3gCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgjxlQ5IIqou-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UghWat3HN-CRn3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugg6_7WvjPQi53gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghrPpy0tE2CDXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
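The model codes comments in batches and is asked to return a JSON array, but raw output is not guaranteed to parse cleanly. A defensive-parsing sketch, reusing the hypothetical `CodedComment` validator above:

```python
import json

def parse_llm_response(text: str):
    """Parse a raw batch response into validated CodedComment records.

    Malformed records are collected rather than failing the whole batch.
    """
    try:
        payload = json.loads(text)
    except json.JSONDecodeError:
        # Models sometimes wrap the array in prose or ```json fences;
        # strip the fence and retry once before giving up.
        stripped = text.strip().strip("`").removeprefix("json").strip()
        payload = json.loads(stripped)

    results, errors = [], []
    for item in payload:
        try:
            results.append(CodedComment(**item).validate())
        except (TypeError, ValueError) as exc:
            errors.append((item, str(exc)))
    return results, errors
```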