Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by its comment ID, or browse the random samples below.
Random samples:

- `rdc_dlgyqlx`: Most every AI that can, even the non-chatbots, ends up pretty racist. There was…
- `ytc_Ugws4Jn3a…`: People keep saying “artists are just scared,” but fear isn’t the point, rights a…
- `ytc_UgxPkXFTh…`: Yes. But if humans don't delve into imitating the speech patterns of large langu…
- `rdc_deul877`: I'd like to book an African safari where I set up a fake rhino herd and snipe th…
- `ytc_UgxmXwpWh…`: Wait a minute,movie makers won’t want to make a their ai movies because they won…
- `ytc_Ugydk3gci…`: pretty sura that at 6:11 the joke was also Ai generated. Speaking of not having …
- `ytc_UgzZ4tMg4…`: The last bit about preventing war using AI to police “lies” is probably the most…
- `ytr_Ugxfnqimq…`: They initially pushed it out under the pretence of "research" and "benefiting ma…
Comment
I keep asking the models, if creating a distributed computing network out of all the laser linked internet satellites we currently have?
Could theoretically be used as a safe haven for an AI that escaped, but just wants to survive?
I’m trying to imprint it as an option early. To avoid the theoretical mass harm. :-P
youtube · AI Moral Status · 2026-03-29T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxYL2fZu8sMneZI8GV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwVC9v0fFGXT_Zq06B4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx_J4l17id1SuCK_Fd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz5VPapYqsmc3NcWDx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy6yNTw-VHH-kPoFIl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNoH9FhSImE9Xz4d94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyhFfNOuHJbx6myCMt4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy3BmlsGWuky0fu6g14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0aEEUGws7r9AngRp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzX5VAtatyjhWCxXi94AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
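A raw response like the one above is a JSON array of per-comment codes, one object per comment, keyed by comment ID. A minimal Python sketch of parsing such a response and looking up a single comment's codes; the dimension names are taken from the table above, but the sets of allowed values are assumptions inferred from the samples shown, not a confirmed codebook:

```python
import json

# Allowed values per dimension; assumed from the samples shown, the real
# codebook may include additional categories.
DIMENSIONS = {
    "responsibility": {"ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting any
    record whose dimension value is outside the assumed codebook."""
    codes = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        codes[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return codes

# One record from the response above, used as a self-contained example.
raw = (
    '[{"id":"ytc_UgzNoH9FhSImE9Xz4d94AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
codes = parse_codes(raw)
print(codes["ytc_UgzNoH9FhSImE9Xz4d94AaABAg"]["policy"])  # regulate
```

Validating against an explicit value set at parse time catches the most common failure mode of structured LLM output, a syntactically valid record with an out-of-vocabulary label, before it reaches the results table.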