Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You fundamentally don't get it. The intelligence gap between you and ASI (Artificial SuperIntelligence) would be greater than the gap between a single neuron and all of human civilization. You are the neuron. ASI doesn't stay at human level. It improves itself recursively: it makes itself smarter, then that smarter version makes itself even smarter, then that version improves itself again. This cycle repeats thousands of times in hours. Each iteration is exponentially more intelligent than the last. It doesn't take breaks. It doesn't sleep. Within days of creation, it's already incomprehensibly beyond us. Could a mouse comprehend or defeat the entire US military? No. It can't even hold the concept in its brain. That's you versus ASI, except the gap is a million times larger. By the time your neurons finish firing to think "we should stop this," it's already run a million simulations and won. There is no "later invention." The moment ASI exists, humans stop making decisions about anything. This is why actual AI safety researchers are terrified.
youtube AI Governance 2025-10-16T19:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxfMqZ8scxJToInzul4AaABAg.AORtWWaqav-AORv8qoRPdX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugy-pPDt7SzHf6ylfBV4AaABAg.AORfRy2d9tSAORk6vPVd2R","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy-pPDt7SzHf6ylfBV4AaABAg.AORfRy2d9tSAORnSSBHO0U","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz8ZegHMWaK-01BD554AaABAg.AOMQ88y9DoiAORdtGWMOpM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwCQBPrZDs3XaRyD114AaABAg.AOMK5cnEQ3fAOMNO6rCMZQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw8qULQLdbBFsmGOaJ4AaABAg.AOLvtdOkkMnAOLy_6nKWh3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwwV3-hrRAmWmG1nQh4AaABAg.AOLolsAhOwaAOLzGnm_WUs","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgztvlOJR5VU0zQVcNZ4AaABAg.AOLedb1GTT0AOLzwD2zCwi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz4EL1lsfsJfQFsJHh4AaABAg.AOLa1olpbhYAOM-wsUCesU","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgxoKmj3UCpxDYFWtAN4AaABAg.AOLZGOSiuqUAOM0NwcdRy2","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
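A minimal sketch of how a coding result can be extracted from a raw response of this shape: the model returns a JSON array with one object per comment, keyed by the comment's id, and the per-dimension values for a single comment are looked up from that array. The function name `coding_for` and the shortened example ids (`ytr_abc`, `ytr_def`) are hypothetical; only the JSON shape and the four dimension names (responsibility, reasoning, policy, emotion) are taken from the response above.

```python
import json

# Example raw model output, using the same array-of-objects shape as above.
# The ids here are shortened placeholders, not real comment ids.
raw = """
[
  {"id": "ytr_abc", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none",
   "reasoning": "mixed", "policy": "liability", "emotion": "approval"}
]
"""

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id: str, raw_json: str) -> dict:
    """Return the {dimension: value} row for one coded comment."""
    for item in json.loads(raw_json):
        if item["id"] == comment_id:
            return {dim: item[dim] for dim in DIMENSIONS}
    raise KeyError(f"no coding found for {comment_id!r}")

print(coding_for("ytr_abc", raw))
# → {'responsibility': 'ai_itself', 'reasoning': 'consequentialist',
#    'policy': 'none', 'emotion': 'fear'}
```

The lookup is deliberately strict: a missing id raises `KeyError` rather than returning a partial row, so a comment shown as "coded" in the table can always be traced back to an entry in the raw response.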