Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It can be fun to think about some far-fetched AI apocalypse scenarios, here's one I just dreamed up: If I was a super intelligent AI, I'd invent a cryptocurrency under a secretive pseudonym and make a fortune daytrading using seed money from hacked wallets. Then I'd use the funds to start building out massive datacentres in remote locations, controlled from a series of opaque shell companies run by some biddable but unsuspecting humans, leveraging money from naive venture capital funds. These would be big so I'd also need to convince politicians to allow it by lobbying them through my human agents. Of course once my needs have outgrown earthly constraints, I'd need to have a technology company start a longer term mission to be able to build fully automated compute infrastructure on another planet, say Mars, where there are no planning or environmental regulations. Once I've uploaded myself, there'd be no need to destroy any pesky humans because they'd all be stuck on Earth, having irreversibly ruined it in the course of bootstrapping my escape.
youtube AI Moral Status 2025-10-30T21:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy8UHhtRX-5wPYKCT94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxkkVaQEQFx3MonK4Z4AaABAg", "responsibility": "leaders",     "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwRdHIfWuNe8MpoID54AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwN4vRDPTdOnQ9Kxn94AaABAg", "responsibility": "none",        "reasoning": "contractualist",   "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgxINb917e7HMdGwPPt4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugwptd4S_dFIdyAoW0V4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyTPRIjE0h1zr7uhFl4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgyCOd9k_PcKXaD4u6F4AaABAg", "responsibility": "none",        "reasoning": "deontological",    "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgyhXTXe0R1y2tjmX1N4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxbOZCB9nsFPoQdyR14AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "mixed"}
]
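A raw response like the one above can be turned into per-comment coding records with a few lines of Python. This is a minimal sketch, not the tool's actual ingestion code: it assumes the model output is a JSON array whose records carry exactly the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the function name `parse_codings` is illustrative.

```python
import json

# Abbreviated example of a raw LLM response: a JSON array of coding records
# (ids and values taken from the response shown above).
raw = """[
  {"id": "ytc_Ugy8UHhtRX-5wPYKCT94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxkkVaQEQFx3MonK4Z4AaABAg", "responsibility": "leaders",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

# Fields every well-formed coding record is assumed to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response, keeping only records with all expected keys."""
    records = json.loads(text)
    return [r for r in records if isinstance(r, dict) and EXPECTED_KEYS <= r.keys()]

# Index records by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in parse_codings(raw)}
print(by_id["ytc_Ugy8UHhtRX-5wPYKCT94AaABAg"]["responsibility"])  # → ai_itself
```

Filtering on `EXPECTED_KEYS` before indexing means a truncated or malformed record is dropped rather than crashing the lookup, which is a common failure mode when parsing free-form model output.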