Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The fact Alex used ChatGPT to introduce the sponsor of this video broke my brain… (`ytc_UgxGF0Hyf…`)
- i can no longer draw. i have limitations i didn't used to. i dont have years to … (`ytc_Ugwo4Gved…`)
- Goodbye ladies hello future wives humanity is over boys it's been real but ol AI… (`ytc_Ugxl5W2JI…`)
- the problem of purpose will be solves by virtual reality. by then a person can b… (`ytc_Ugzuw-2iu…`)
- to be able to create a new face that nobody else has, it would have to have take… (`ytc_Ugw5ZFmXi…`)
- Can’t believe anyone can be deluded into thinking this is normal. We do not need… (`ytc_UgxAofo3h…`)
- Again we're not all to blame. Its not fair. There are a few of us that live life… (`ytc_UgyvdN4gy…`)
- This isn't a problem we should fix about the goddamn ais. It's something we need… (`ytc_UgxxmYpXb…`)
Comment
> In my opinion, i'd give robot rights to robots that function beyond simple repetitive mechanichal tasks like common engines.
> The robots from Megaman,iRobot,Terminator,Robocop,etc. fit the criteria of having basic functions, but have the capability to make their own decisions without the assistance of their manufacturers or whoever's currently using them, therefore they're allowed to have robot rights.
> basically having an Ai, the will/desire to preserve one's own existance & the capability to act independently in the same range as humans do gives them full access to the rights.
> still, we *do* need to give them laws so they won't try to dominate us, we *did* create them, after all, if they'd need a place to live, send them to a compatible planet, monitor their activities & keep in touch to prevent secret uprisings.
youtube · AI Moral Status · 2017-02-24T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
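The four dimensions in the table above form the coding schema. A minimal validation sketch, assuming the allowed values per dimension are only those observed in this dump (the real codebook may define more):

```python
# Hypothetical sanity check for one coded record. The allowed value sets
# below are inferred from values visible on this page and are likely
# incomplete; they are an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"liability", "ban", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record shown in the Coding Result table above.
record = {"responsibility": "ai_itself", "reasoning": "deontological",
          "policy": "liability", "emotion": "approval"}
print(validate(record))  # -> []
```

A check like this is useful because the values come from free-form LLM output, which can drift outside the expected label set.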
Raw LLM Response
```json
[
  {"id":"ytc_Ugiho-tsco0HsHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgihP4M0zuJ2L3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugi6zFhOrnv24HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi5s15A-5P3A3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggpOpWwDB4wVHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgheShPcV_JPhXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghNrYEViz_YengCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghLJVnLMJarmXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjyEqoBSc9RuHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiUnh0YmHk5LXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
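The raw response is a JSON array of coded records, one per comment, keyed by comment ID. A sketch of the "look up by comment ID" step, assuming the response parses as shown (the snippet below is truncated to two of the records above for brevity):

```python
import json

# Index a batch of coded records by comment ID so a single record can be
# retrieved, as the "Look up by comment ID" view does. This is a sketch of
# the idea, not the tool's actual implementation.
raw_response = """
[
  {"id":"ytc_Ugiho-tsco0HsHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgihP4M0zuJ2L3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
"""

records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}  # comment ID -> coded dimensions

# Retrieve the record shown in the Coding Result table above.
rec = by_id["ytc_Ugiho-tsco0HsHgCoAEC"]
print(rec["policy"])  # -> liability
```

Because each record carries its own `id`, the batch can be coded in one model call and still joined back to individual comments afterwards.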