Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgzVRHgne…` "You are the smart one to be wary. The problem is that this is happening without …"
- `ytc_UgwwNb-X0…` "How does money actually work? The way I see it, You find something that peopl…"
- `ytr_Ugxd6tb0m…` "@arkaghosh2810😂 hilarious. I absolutely disagree. I can pick out AI slop anywher…"
- `ytc_UgzFiDKqQ…` "That story AI wrote about world war 4 is the plot to Terminator. James Cameron, …"
- `ytc_UgyKwejlb…` "So Overall Self Driving Cars are bad, better mass transit is the actual solution…"
- `ytc_Ugw4ahUTt…` "I used pi when I was getting ready to write my first book. I love it. I have log…"
- `ytc_Ugjv5Bx70…` "If robots become conscious, will we need to teach them Christianity so that we c…"
- `ytc_UgzcXdw9x…` "There's an old Yellow Pages (telephone directory for anyone whose knees don't cr…"
Comment
i feel that for a robot to have consciousness they need to feel pain sadness and joy. they need to genuinely dislike pain and sadness and like joy. that's when they should be considered conscious and in that case they should have rights. like who said things need to be biological to feel. and by the way robots with consciousness should have a less chance of taking over the world.
Platform: youtube
Topic: AI Moral Status
Posted: 2017-04-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghN0A8SEeh4RXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UghzlaQnMcZyqXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugh6kh87bJEztngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiK41GCIEutVHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggpE9hB8ZGKUXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugjs7Uuups4vv3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UghF3cqakiS6zngCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UggTOrD8M8fPnXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjHaZq_lbQMNHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjHONlI3SmohHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```