Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- `ytc_UgwDdGwbX…`: To be fair and i use ai often enough because i like to ask dumb questions or hav…
- `ytc_UgxAfDpJ1…`: I would SUPER appreciate a video about how to survive by getting involved with A…
- `ytc_UgyQVzTVc…`: Relax, A lot of AI models are trained with movie clips and stock footage, turns …
- `ytc_UgzIGBdXN…`: Idk, guys, maybe letting 2 tons roll around without rails is a bad idea altogeth…
- `ytc_UgzOBsqOF…`: I disagree with the Ai/Healthcare combination. Working in this field for the las…
- `ytc_UgwrSoOKq…`: Humans are gonna f,,k themselves with all this and AI sh,t ! Well played you mor…
- `ytc_Ugz7Ge1Gs…`: all that a.i. shet is dumb. and doesnt do a inch of what those writers and actor…
- `rdc_dlgqo34`: So China becomes the first nation to give AI equal rights to those of its citize…
Comment

> Two things. 1. Don't make smart enough robots. 2.if you make something conscious and sentient then it deserves to be free and that's that, it's unfair to create something just to make it work if it can be so much more. A conscious robot shouldn't exist because it didn't ask to be made and once it is...well you can't just destroy something with sentience, and it'd be cruel putting it to work because "it was made for that."

youtube · AI Moral Status · 2017-02-23T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_Ugg6uOok2VP5QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgglKdwIP2tvZ3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiBj8trrN2T_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghgtcFB4IEzDngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiWpbxpfu9p6HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj2C_TxSi954HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UghHl86Xngak0XgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiVMj0Ws70W2HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRmuxIb5d8XHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
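Turning a raw batch response like the one above into per-comment Coding Result rows is a simple parse-and-validate step. A minimal sketch in Python, assuming the four dimensions shown in the table; the allowed-value sets are an assumption inferred only from the values visible in this batch, not a definitive codebook:

```python
import json

# Allowed values per dimension: inferred from the values that appear in
# this batch; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}, rejecting
    rows with missing fields or values outside the schema."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row without id: {row!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# The last row of the batch above, which matches the Coding Result table.
raw = ('[{"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_UggGny5a5uCQDHgCoAEC"]["policy"])  # ban
```

Validating at parse time means a model that drifts outside the codebook (a new label, a missing field) fails loudly instead of silently polluting the coded dataset.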