Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "But it is true that AI programmers dont program outcomes. They set the initial c…" (`ytr_UgyI481x9…`)
- "Sam Altman has never built anything, like the other pathological lying narcissis…" (`ytr_UgwTn5iPi…`)
- ">An arbitrator chose a new, complicated algorithm based way of determining vo…" (`rdc_jemn77v`)
- "You know this AI stuff might one day be useful and helpful, but companies need t…" (`ytc_UgwPJkn4z…`)
- "for work I do seasonal jobs, between 2 to 6 months. afterwards I am laid off and…" (`ytc_Ugw6Kgs-1…`)
- "Go to the Bible and see that God has got it all taken care of. Christians don’t …" (`ytc_Ugy16gmGw…`)
- "Damn I kind feel bad for the A.I. lmao. Like it's just trying its best to answer…" (`ytc_UgyiIBZXv…`)
- "nah, you just need to be able to use AI to hack into systems ;) like security ro…" (`ytr_UgzSu1HcZ…`)
Comment

> I don't think robots would deserve rights until: They can think, understand the world, know that it is them when they look in a mirror. When they can feel pain, understand it. And also if the robot is weaker than the human/animal. Because if you were to give a titanium robot with deadly weapons built into him rights, then they will definitely take over the world. And what separates robots from humans? Humans are organic matter. Robots are not.

youtube · AI Moral Status · 2017-02-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiglmrSOaC-V3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggcVwGpN4yVdngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghMjRhW38shAXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgjoVoZqTEOe13gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiNylxDqi2bZ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiI_3TVpi3Nz3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugjfwf1Bv-_gf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjgEocxIvWv6XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjgDdTu4uD0RngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UggLE8qEuXr8zngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
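The "look up by comment ID" step shown above amounts to parsing a batch JSON response like this one and indexing each record by its `id` field. A minimal sketch, assuming the response shape shown above (the helper name `index_by_comment_id` is ours, not part of the tool; the two sample records are copied from the response):

```python
import json

# A batch coding response in the same shape as the one displayed above
# (two records copied verbatim from it, for illustration).
raw_response = """[
  {"id":"ytc_UgjgDdTu4uD0RngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UggLE8qEuXr8zngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
# Looking up one comment's coding yields its four dimensions:
print(codings["ytc_UggLE8qEuXr8zngCoAEC"]["policy"])  # regulate
```

An index like this makes rendering the per-comment "Coding Result" table a constant-time dictionary lookup rather than a scan of the whole batch.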