Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxAveG1j…`: "Not to mention that writing even a few coherent new paragraphs would be complete…"
- `ytc_UgwvBe4v0…`: "A single cell is conscious, life begins with a spark. Ai meets all minimum requi…"
- `ytc_UgxsElVdJ…`: "It's so funny how AI is based on enhanced pattern recognition capabilities and p…"
- `ytc_Ugyq3IkHE…`: "I know someone with heavy dyslexia who wrote an entire book by hand. I know some…"
- `ytr_Ugy2FgYiw…`: "@1RiverCat Can save time for people with aphantasia to just get a rough visual i…"
- `ytc_UgxwclvXg…`: "Blaming Ai for suicide is like blaming the gun or pills,it's the psychotic perso…"
- `ytc_Ugwy8b6a1…`: "Your ability to code with ai is more about your ability to use ai…not coding. It…"
- `ytc_UgwteXG2t…`: "You shouldn't be scared there are many people also out there working on ai align…"
Comment
After playing Fallout 4, I say: Yes, robots with true artificial intelligence would deserve the same rights as us. If they can feel, think independently, and dream, they are people. If they ask for rights with no outside influence telling them to, they deserve rights.
If giving them true AI was intentional, we owe them rights. Why create something that you know would have consciousness then deny it freedom?
Source: youtube | Video: AI Moral Status | Posted: 2017-02-23T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugi2wc_TozUrvHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugi2Je4vGjZ3y3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjJbBshd7q1o3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugi0sOoWoF3p9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg_T3BjHH-VgngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugjrf2Y85YKyVngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ughkl5e1DwDz9HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghJ7ssbUAc1-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Uggz5tisH9Fb53gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjIvUEfE5r063gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]
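The raw response above is a JSON array of per-comment codes along the four dimensions in the results table. A minimal sketch of how such a batch might be parsed and validated before it reaches the table; the allowed values below are inferred only from the sample output shown here, not from a definitive codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed", "unclear"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and return codes keyed by comment ID.

    Raises ValueError on malformed JSON or out-of-schema values, so a
    bad batch is caught before it is written to the results table.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical record in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"approval"}]')
codes = validate_codes(raw)
print(codes["ytc_example"]["policy"])  # prints "liability"
```

A truncated array closer or an out-of-vocabulary label (both easy to get from an LLM) surfaces here as an exception rather than as silently "unclear" cells.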