Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or click one of the random samples below to inspect it.
- Could AI automate pretty much everything? Sure. Should it? No. Will it? Probably… (ytc_UgxwZvHBE…)
- Notice how there isn’t anybody arguing with him. This is because 1:more people t… (ytc_Ugzhyh0Nq…)
- AI art is generally boring and limited but not for the reason stated. Personally… (ytc_UgweGZDIs…)
- I yelled "POOKIE" when the PukeiPukei showed up, AI is making me mad but that fe… (ytc_UgyTzCV5A…)
- There's no moral question as to whether sapient robots deserve rights, obviously… (ytc_UgjKd8TV7…)
- Nahhh. AI (or Arty) just grew up real fast and decided to go live with their new… (ytc_Ugy6su2NY…)
- Well I think human beings are over rated. The centuries and decades have moved… (ytc_UgxDNWTrP…)
- You forgot to mention this critical "new device"... The Quantum Computer.... W… (ytc_Ugy1tuTM3…)
Comment
The implementation of the bea*t that is AI here on earth at this particular moment in time was planned thousands of years ago by the malevolent and psychotic race of human extraterrestrials universally known as the Watchers aka Maldekians aka Anunnaki aka Irinim aka Raqib aka Deva aka Neteru aka Egregoroi aka Tiwar aka Tuatha de Danaan aka Shining Ones. It is not a human invention or accident...
youtube · AI Moral Status · 2025-12-14T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwSapLRfxZc2aDJ8tR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTq1Boru0PXMHo5lN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxvZf9DclKP4fvKh4h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxsNZ_WOh_3oe8FWcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyjS5yBjG0dVaoRKn14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxQL2QUDBiOKBBBond4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxbeMw2FfPcs99wRxF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwyrIoUTw1PY88mdDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzUdqCMPucOO57NL8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyoB5lwt8raCj54NY14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
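The lookup-by-ID view can be reproduced from a raw batch response with a few lines of Python. This is a minimal sketch assuming the model returns valid JSON in exactly the shape shown above; the `index_by_id` helper is illustrative, not part of the tool, and the sample below is abbreviated to two records.

```python
import json

# Abbreviated raw model output, matching the JSON array shape shown above.
raw_response = """
[
 {"id":"ytc_UgxsNZ_WOh_3oe8FWcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyoB5lwt8raCj54NY14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment id."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgxsNZ_WOh_3oe8FWcl4AaABAg"]
print(coding["emotion"])  # fear
```

Keying the records by `id` makes the per-comment lookup a single dictionary access, which is all the "Look up by comment ID" box needs to do once the response is parsed.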