Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples
- "I did not need to listen to minutes of conversation between some lonely dude and…" (ytc_Ugx_qIz0r…)
- "This is insane. I'm more and more convinced we're literally experimenting on a p…" (ytc_Ugx_aI_7W…)
- "Surely, a moralistic, ethical code should be installed from the very outset crea…" (ytc_UgwrakQna…)
- "So let me get this straight, all those who implied that AI has limitations (inhe…" (ytc_UgwDKRNvq…)
- "Who really believes these people will care for ethics? It's a race to the first …" (ytr_Ugzvyp0I0…)
- "Lol. AI makes non-designer delusional. superpowers? It didnt give that to you. D…" (ytr_UgwFyhIZa…)
- "Thought you might make an argument from aristotelian ethics but instead it was a…" (ytc_Ugw4mKQkz…)
- "What hes saying is, he and everyone like him who have helped to create and devel…" (ytc_UgwXS9pcL…)
Comment
We need to be careful when referencing what a “Superintelligence” would “want”. AI doesn’t have wants, needs, or any biological survival instinct. What they have is the distilled human reactions to different scenarios. They emulate US, but internally the AI doesn’t even actually understand what they are saying. They don’t have any internal structure that allows perception of time, self, emotion, empathy or agency. They SIMULATE these phenomena, but don’t actually experience them.
youtube · AI Moral Status · 2025-10-31T09:3… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGCenfic0DffQynGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMwUrLPPKGZc7N7gZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxYTqk0c1AMEO-Cn0R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugzvezki_UIzKiot7-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqT_qp2eypDr9Kwf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYZAS6C1uYHlECl894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyJjyR6omrJ_AWUSwR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTHp--dd6C17hBoY14AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyv3k5O2BLJBDFPWJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1YYOzpzTlkFa9XrV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
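A raw response like the one above has to be parsed and checked before the codes can be stored. The sketch below shows one minimal way to do that in Python: it parses the JSON array and keeps only records whose values fall inside the allowed set for every coded dimension. The value sets are inferred from the samples shown on this page (the real codebook may define more categories), and `validate_records` is a hypothetical helper, not part of any library.

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "disapproval", "outrage", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and drop any record that is missing an
    id or uses a value outside the allowed set for some dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

if __name__ == "__main__":
    sample = ('[{"id":"ytc_Ugzvezki_UIzKiot7-R4AaABAg",'
              '"responsibility":"ai_itself","reasoning":"deontological",'
              '"policy":"none","emotion":"mixed"}]')
    print(len(validate_records(sample)))  # → 1
```

Rejecting (rather than silently keeping) out-of-schema values makes it easy to spot when the model drifts from the prompt's label set.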