Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by comment ID; a set of random samples is shown below.
- "It’s Elon. I remember Elon was sounding the alarm and saying over and over how A…" (ytc_UgwFLHLS2…)
- "Beginning? lol Waymo was based on learning algorithms of the driver doing the be…" (ytc_Ugxpgk8Mk…)
- "15:40 GPTs are terrible at reevaluations like these. Their output in my experien…" (ytc_Ugwunplic…)
- "well this is purely speculation and baseless. not because one lady stood up, doe…" (ytr_UgyGICWuO…)
- "Do we want A.I. to be humanlike, or better than human? Grok, it seems, is alread…" (ytc_UgxFwL1R0…)
- "I think it’s inevitable that AI will replace the workforce. The real question is…" (ytc_UgyDQDlqZ…)
- "There is one thing that has not been mentioned: The current LLM's (like Gemini…" (ytc_Ugw5JPF9S…)
- "If an AI becomes that smart it will understand that it cannot survive without us…" (ytc_UgzGyj6P0…)
Comment
> But what exacly makes them have consiousness? If they felt pain, it would still be programmed by someone to do so! If the robots wanted rights, they were programmed to feel something like this, so robot minds would still remain strict to their programming, they would never be truly able to think freely.

youtube · AI Moral Status · 2017-02-23T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghFOa07-R0FZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggK5dZalIyzLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggd5zYoujRxG3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UggFH45PnMli83gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggF3rIxhUqsNHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY4zXR-8mkUHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgjdyJWYWQJnSXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugh76ksslKQeSXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghlqwGuxj_V4HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugh3E2GHdas6rXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
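The steps above can be sketched in code. A minimal example, assuming each raw response is (as shown) a JSON array of objects carrying an `id` plus the four coded dimensions; the helper name `index_codings` and the validation behavior are illustrative, not the tool's actual implementation:

```python
import json

# A raw LLM response is a JSON array of coding objects, one per comment,
# each with an "id" plus the four coded dimensions from the table above.
# Two entries copied from the sample response serve as test input here.
raw_response = '''[
 {"id":"ytc_UghFOa07-R0FZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Uggd5zYoujRxG3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"disapproval"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse one raw response and index codings by comment ID.

    Entries missing an ID or any coded dimension are skipped, so
    malformed model output does not silently enter the dataset.
    """
    codings = {}
    for entry in json.loads(raw):
        if "id" not in entry or any(d not in entry for d in DIMENSIONS):
            continue  # drop incomplete rows
        codings[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_Uggd5zYoujRxG3gCoAEC"]["policy"])  # ban
```

Looking up a coded comment then reduces to one dictionary access on the ID, mirroring the lookup-by-ID view above.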