Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I need fellow artists to understand that people genuinely think they're entitled…" (ytc_UgxVC5bP0…)
- "Surveillance capitalism, yes, and we have to rectify it. But much worse than tha…" (ytc_UgxWrebGy…)
- "@odabasim We were supposed to have flying cars decades ago. We've had code gens …" (ytr_Ugx2A7E-d…)
- "Chat GPT is unnecessary and unhealthy for all children… and any person suffering…" (ytc_Ugyl5lqXx…)
- "Nothing new here, really. The most famous, recognizable works by artists like Wa…" (ytc_UgwdJNeE8…)
- "After my recent conversations with Gemini, I absolutely understand the answers. …" (ytc_Ugwj_Rn2c…)
- "His robotaxis are expanding fast, and regulation is no longer a problem (the poi…" (ytr_UgwwPos9M…)
- "While AI risks are real, I rely on OSVue to handle customer support as I develop…" (ytc_UgxTP0m5p…)
Comment
We should ban people from programming emotion-feeling robots.
A robot is a machine which is programmed to do a job, it must follow its creator/owner's intentions.
If a robot has feelings it should be reset or disposed of and whoever programmed it to feel should be punished in some way.
Robots do not have feelings, their feelings are merely the mechanical process of collecting info and elaborating a response, a skill WE HUMANS gave them, it's not emotion, it's a process.
youtube
AI Moral Status
2018-07-02T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx7YznFYEUKkMe1iBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzahW5WKawAqoKCB7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRHWKvJT8IhKO-_qF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbgNKJMW57e2gSy1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYCJpRzmrEA7SN_ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dI6ViiYSCEbnzft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzinrD6hweefSHzu-x4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9MR1jF5P4ZT51IHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy75Vkh-6d8zWFeqFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
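The raw LLM response is a JSON array with one coding record per comment. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the variable names and the single sample record are illustrative, not part of this tool):

```python
import json

# Illustrative raw response: one record from the array above, with the four
# coding dimensions used throughout (responsibility, reasoning, policy, emotion).
raw = '''[
  {"id": "ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "ban",
   "emotion": "outrage"}
]'''

# Parse the array and index the records by comment ID.
records = {row["id"]: row for row in json.loads(raw)}

# Look up one coded comment by its ID.
coded = records["ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg"]
print(coded["policy"])   # → ban
```

If the model's output is malformed (for example, a trailing parenthesis instead of a closing bracket), `json.loads` raises `json.JSONDecodeError`, which is worth catching before any records are stored.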