Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This one is actually by far the biggest risk of total human annihilation. Climate change could kill a lot of people through food shortages, certain regions becoming uninhabitable, etc., but it doesn't really have the potential to kill all humans. Similar story for nuclear war: a lot of death, suffering, certain areas uninhabitable, famines, but not all humans. Even with a very severe asteroid impact, humanity could persevere in special bunkers. Also, in the last couple of years we have developed the technology to detect and intercept the globally destructive ones before it's too late. Biological weapons might be very lethal, but again, small isolated groups of people could likely survive and carry on humanity. A sudden gamma ray burst from a "nearby" black hole, or a rogue black hole passing through the solar system, are the only things I can think of that could be potent and unavoidable enough to get rid of humanity altogether. Luckily, those two scenarios are extremely, cosmically unlikely to happen. AI, on the other hand: if it's superintelligent and for whatever reason considers us enough of an obstacle to decide to wipe us out, we are cooked. Unlike a natural disaster, which stops even when there are survivors, a thinking force wouldn't stop until the job is done, cutting off the potential future of trillions of people who could have lived among the stars.
Source: youtube · AI Governance · Posted: 2025-11-22T00:5… · ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMNf4Llv8jc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMOFqJqmbLo","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMOMo8xT9M3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytr_Ugx5Ai4xn2qCXGTnegN4AaABAg.AMLucwwjrWlAMMGyJ_v2Zm","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwHYRPfGwNlHXWXURR4AaABAg.AMLroQvDxmtAMwuUW4pg0S","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgzbgPnwynGf75XgKNp4AaABAg.AMLTSkFHfByAMNll8kfwOD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytr_UgwtiVwM7oQQfACTRJ94AaABAg.AMLPHd_BGZ5AMLQsFFwXtE","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytr_Ugzda0CqwR2pwyg5z9J4AaABAg.AMLK_oVa3m7APoEGQJnoe3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AMNqYBsO8LQ","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AMOcQgFtxbn","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]