Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The solution is simple: AI does not mean conscious robot, it means smart robot. …" (ytr_Uggwq5VL_…)
- "Bing ai wants to be human and the open ai wants to kill all humans, war will eru…" (ytc_UgymMAPqY…)
- "I feel like your photo example was amazing, but I want to add on something to it…" (ytc_UgwkhYA1j…)
- "One thing I think AI won’t be able to take over is art. The main purpose of art …" (ytc_UgwvLgrM3…)
- "I’m so glad Dragoncon kicked them out, it makes a great example for all the othe…" (ytc_UgzcBFYb_…)
- "This is wrong. The underlying mind is us. The horror is us. Everything that AI k…" (ytc_Ugz95VPrT…)
- "The trend can already be seen, recently a few weeks ago my neighborhood Walmart …" (rdc_glih2wc)
- "@thenewbanker1225 weird, these chatbots you mistake as “Ai” can’t do basic compo…" (ytr_UgwwvQPYI…)
Comment
Think about this! A robot is made of electrical parts. Each individual part had to be purchased, and most likely came with a receipt. In court, that is enough to say I own a robot because I paid for the parts. The software (in robots, that's the AI) shouldn't dictate if I own the robot or not. For example, I can build a computer and install Windows/Unix, but neither owns my computer.
Source: youtube · Video: AI Moral Status · Posted: 2017-03-02T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugjl892grkD1CHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjdXJQpASsKnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UggFqHDoWRfrsXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggRQk_shtKMS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ughae_Q7RxIYQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Uggczad5RakHtngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgilhY784SZqgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
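The raw response above is a JSON array of coding objects keyed by comment ID, one object per comment, with a fixed set of dimensions. A minimal sketch of how such output could be parsed and validated before display; the `ALLOWED` value sets here are inferred from the responses visible on this page, not a confirmed codebook:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page (an assumption, not an exhaustive schema).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    dict keyed by comment ID, skipping records with unexpected values."""
    records = {}
    for obj in json.loads(raw):
        cid = obj.get("id")
        if not cid:
            continue  # a record without an ID cannot be looked up later
        if all(obj.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[cid] = {dim: obj[dim] for dim in ALLOWED}
    return records

# Example using one record from the response above.
raw = ('[{"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"indifference"}]')
codings = parse_codings(raw)
```

Validating against a closed value set before indexing is what makes the "look up by comment ID" view safe to render: a malformed or hallucinated coding is dropped rather than displayed.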