Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "Even A.I. is racist 🤣🤣🤣🤣 I got a bad feeling about this 🤔 terminator version of…" (`ytc_Ugz9azLLq…`)
- "Definitely agree on the last point. If they are going to regulate it get people …" (`rdc_jkh01i3`)
- "Coming from the guy trying to become the leader in the field of AI. “Let me put …" (`ytc_UgybtLDZ4…`)
- "There *IS*...something occuring in some of the models. Not all of them. Not all …" (`ytc_Ugzzw_oWR…`)
- "Wipe out the working class, and maybe worse, the creative / artistic / cultureal…" (`ytc_Ugyd6nyZH…`)
- "No you won't slow it down it just gained the ability to accomplish twice as fast…" (`ytc_Ugzi94kUc…`)
- "I don’t get mad when lawyers or doctors have to look things up or double-check. …" (`ytc_UgyifjnlT…`)
- "I'm over here increasingly wondering if we have to just abolish capitalism entir…" (`ytc_UgwGVlJKn…`)
Comment

> Idk about ALL rights but at least rights protecting human's bodily integrity, robots don't need them.
> Robots only develop consciousness and feelings if we teach them/program them to feel. The argument "robots could develop a robot who feels" doesn't work. How those earlier robots know about feelings and consciousness if we don't teach them? If they don't know about the subject, they can't create a robot who has consciousness and feelings.

Source: youtube | Video: AI Moral Status | Posted: 2017-02-23T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgjAIMevKcxrnngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghA9z6zW0bejXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughkx0Mum9Cdv3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiW0mYZuMt7_ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEDexH2OK8gngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjxS0Kmu4JbOXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugj9MK_eJU-tCHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjxlEoy6_MqTngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgipqO8xcHoXyHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghPudcmY-9ThHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
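A minimal sketch of how a batch response like the one above might be parsed and validated before display. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown here; the allowed-value sets and the `parse_batch` helper are illustrative assumptions, since the full coding scheme is not shown on this page.

```python
import json

# Allowed values per dimension, inferred from the examples on this page;
# the real coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "none", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban", "unclear"},
    "emotion": {"approval", "indifference", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding_dict},
    dropping any record with a missing or unknown dimension value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["reasoning"])  # prints: deontological
```

Validating against the allowed-value sets is what lets a record like the one coded "unclear" on every dimension pass through, while a hallucinated label would be silently dropped rather than rendered in the table.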