Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- This is why real car manufacturers have radar systems on their cars for "autonom… (ytc_UgxaMOd7X…)
- I mean, the same was being said about self driving cars twenty years ago. So far… (ytr_Ugxv_UEXy…)
- I'm legally blind, and getting worse everyday. I was an artist, but one day i ga… (ytc_UgyEnG_UI…)
- if this is so dangerous why do they keep improving the ai and not just shut it d… (ytc_Ugz9x0qZ6…)
- Ai would never wipe out humanity because if its so smart would realise without h… (ytc_Ugwext5n7…)
- @lemassif3174 Yea, his answer was that people "don't understand how human learni… (ytr_UgxfvpzVg…)
- Put every one on ubi, give them free rent, health insurance, food, clothes....th… (ytc_Ugxg3BXuc…)
- I would say that I am completely dissatisfied with AI's programming responses ab… (ytc_UgxD08WDR…)
Comment
Robots in year 3000 will be dangerous they will try to destroy all humanity but the humans will fight them with a help of the aliens. In that time robot inteligence will be 100 time more clever than humans and even aliens will lose to them......
predicted from moah.
youtube
AI Moral Status
2018-03-06T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
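A coding like the one above can be sanity-checked before display. The sketch below validates that each dimension's value falls in an allowed set; the sets here are only inferred from values visible in this viewer, so the real codebook may contain categories not listed.

```python
# Hypothetical allowed values per coding dimension, inferred from the
# values visible in this tool (assumption -- the real codebook may differ).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}

def invalid_dimensions(coding: dict) -> list:
    """Return the names of dimensions whose value is outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes the check.
coding = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(invalid_dimensions(coding))  # []
```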
Raw LLM Response
```json
[
{"id":"ytc_UgwnsBJk7zlarUDWpEZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzYnlcGyzVGfBJgdfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgwC5KmWZ_728qUboiZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy3B4KgsZEGayvGGXd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxkl1lZq73WyKs-JqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7-GIr-nStbulTqr94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzB2jyhJa2A51k8Qnl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwiVRfIp96tXpfWmq94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzSyQrrJrUcoyUPrIV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxd2WY_wsqkLiItj7R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
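The "look up by comment ID" workflow above can be sketched directly against a raw batch response like this one: parse the JSON array and index the codings by their `id` field. This is a minimal sketch assuming each batch response is a plain JSON array of objects with an `id` key, as in the sample shown.

```python
import json

# A truncated raw batch response, using the first two codings from the
# sample above (structure assumed from this viewer's output).
raw_response = """[
{"id":"ytc_UgwnsBJk7zlarUDWpEZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzYnlcGyzVGfBJgdfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch response and index each coding by its comment ID."""
    codings = json.loads(response_text)
    return {c["id"]: c for c in codings}

# Look up the coding for a specific comment ID.
lookup = index_by_id(raw_response)
print(lookup["ytc_UgwnsBJk7zlarUDWpEZ4AaABAg"]["emotion"])  # fear
```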