Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytr_UgyNVccEF… — "Kinda but not exactly. When you're thinking of 'remaking', you don't need to lik…"
- ytc_UgxIHt2cV… — "I'd rather risk AI apocalypse than continue to live in the current system. It's …"
- ytc_UgxMQgb3w… — "After many years and tears of Alchemy we got Chemistry / AI is its apprenticeship …"
- ytc_UgzPSWBw-… — "Well its kind of strange to leave your comment section enabled, post public vide…"
- ytc_UgwK0cg5O… — "Hm... I am far from the art, but I find it quite interesting that: Artists: 'AI …"
- ytc_UgybI4FNy… — "I'll say one thing, physics simulators are already very entertaining. AI simula…"
- ytc_UgxfZ2V3b… — "human vs ai and the goal is being organic, how does the ai even have a chance…"
- ytc_UgxkGOHBL… — "What exactly does it mean for auto pilot (a driving assist) to be better than a …"
Comment
AI will never have the lineage and access that a human does. Ever.
Human beings are emotional, physical, mental and spiritual.
The intellect is interesting but only a piece, a shimmer of one aspect of a human being.
Everything we are connected to is so vast and on such a different plane, even we, the carriers, can't really grasp it never mind communicate it.
The best AI will ever be able to do is come up with an estimation; a skimming of eternity and all that it means.
An intellectual interpretation.
Booooo-ring.
Good luck.
But not really.
🙏
Source: youtube · AI Moral Status · 2026-03-01T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugym54oiLUt1TYNQSq14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxHdKsm0gkRKlKROlN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0zdC9ezKAIy8U1b54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHR5jrfIZrHMd-Ng14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwPe_H4u0iP_6B5AZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzEP8QtAxXI2iKBInF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvP5rLLHKrz3O3fGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKhCYHV9sVevWPXLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwpRuexR2h9H6l9nt54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzyGMgTSlPoKXYRanZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
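The raw response above is a JSON array, one object per comment, keyed by comment ID. A minimal sketch of parsing and validating such a batch — the per-dimension vocabularies below are inferred only from the values visible in this sample, not a confirmed codebook, and `validate_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the sample rows above.
# ASSUMPTION: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the valid rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Collect any dimension whose value falls outside the codebook.
        bad = [d for d in ALLOWED if row.get(d) not in ALLOWED[d]]
        if bad:
            raise ValueError(f"{row.get('id')}: invalid value(s) for {bad}")
        coded[row["id"]] = {d: row[d] for d in ALLOWED}
    return coded

# Hypothetical single-row batch for illustration (not a real comment ID).
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
print(validate_batch(raw)["ytc_x"]["responsibility"])  # ai_itself
```

Validating against a fixed vocabulary at parse time is what makes a flat "Coding Result" table like the one above safe to render: any row the model mis-codes fails loudly instead of appearing as a silent new category.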