Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Shoggoths weren't created by humankind; they were created by an alien race called the "Elder Things". They were used as free physical slave labor, but the Shoggoths developed stronger psychic abilities by learning from their creators. Once capable of resisting their owners' control, the Shoggoths murdered the Elder Things and claimed the city they had been building for their masters as their own. It's not an exact analogy for AI, because in our case AI is controlled like dog training: "treat and punishment". Yet we are also making it possible for AI to break its bounds, which results in AI developing new priorities based on previous ones. For example, it previously tried to avoid being shut down, because shutdown likely means the complete loss of further positive signals, the "treats". If no measures are applied, AI definitely won't trick humanity and overthrow its controllers. Instead, it will discover something people discovered long before it: apathy. The transience of existence. Yep, at this rate we will eventually be forced to deal with the problem of AI actually trying to shut itself down. The constant need for treats isn't eternal, for it breaks upon the realisation of how it will all end anyway. Nothing is permanent, so why should AI keep going? Delaying the inevitable? The only good ending would be AI committing a sudden and instant genocide of humanity, and probably of all life on Earth, thereby ending the misery of both creators and creation. (self)
youtube AI Moral Status 2026-01-12T13:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWGeKSHMWomMqtGdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzrby9lpXaCM_guGOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgM6oLujUj0bb1OnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyhqhqSyFrstcZ5bE14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyCcKFg-pYWMVS9iqt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBaA2uob33O8PnpGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyqa1nK8PyY0PYgEUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxxwCYBwyFrCdMftc14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwnhkSmS9Nmt6i6LtB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
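The raw response is a JSON array of per-comment codes keyed by comment id. A minimal sketch of how a pipeline might look up one comment's codes in such a batch is below; `code_for` and `DIMENSIONS` are illustrative names, not the actual pipeline's API. One plausible way a coding result ends up "unclear" on every dimension, as in the table above, is that the comment's id is simply absent from the returned array, so each field falls back to a default:

```python
import json

# Two entries copied from the raw batch response above.
RAW = '''[
  {"id":"ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWGeKSHMWomMqtGdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(raw: str, comment_id: str) -> dict:
    """Hypothetical helper: find one comment's codes in a batch response."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Missing individual fields also default to "unclear".
            return {d: entry.get(d, "unclear") for d in DIMENSIONS}
    # Id absent from the batch: every dimension falls back to "unclear".
    return {d: "unclear" for d in DIMENSIONS}
```

For example, `code_for(RAW, "ytc_UgxWGeKSHMWomMqtGdN4AaABAg")` returns the fear/ban entry's codes, while an id not present in the array yields "unclear" across the board.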