Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Gwangle yudkowsky believes that developing Artificial General Super Intelligence with Personality (AGSIP) technology will 100% result in the end of the human species and that to avoid this, while he does not spell this out directly in most of his conversations, he believes the only way to stop it is to drive civilization backwards to a lower tech level prior to the development of computers, and to achieve this he knows it will require a global nuclear war. Now, he will dance around this goal, generally working towards convincing people that once we develop AGSIP tech it will be too late, and that what we need to do is place a worldwide ban on AI development and a requirement to destroy LLMs and other similar tech getting close to becoming AGSIP tech, because if we don't it will 100% mean the extinction of the human race. Then he will argue this ban needs to be backed with the threat of nuking the data centers and centers of research wherever this ban is broken. Now, he knows China and Russia will not stop developing AGSIP tech, and I heard him admit this in one video, along with the fact that what he suggests would then 100% result in global nuclear war. His response was that he does know this, but that some people would survive, and those who survived would live longer than if we allowed AGSIP tech to be developed. It was further asked: when civilization recovered, would not AGSIP tech get developed then? And he said yes, he knew that, but then we could do the same thing over again.
youtube AI Governance 2024-12-27T13:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxBSPLgIZgoxW75T_54AaABAg.AT5zqZXavDIAT7Flt59JVm","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzqAFwsAv_KFYodnsZ4AaABAg.AT5zYObdugsAT6N8mi1y-k","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgypFt-geNkrVXiCZKZ4AaABAg.ACqSJ-64r_IADGqQZSIyda","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgypFt-geNkrVXiCZKZ4AaABAg.ACqSJ-64r_IADHiqhf0--s","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwmijS6ORinBNiU97l4AaABAg.ACPn9QBJA7kACXUpFJQjzw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwmijS6ORinBNiU97l4AaABAg.ACPn9QBJA7kACXkGqvjRfO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwmijS6ORinBNiU97l4AaABAg.ACPn9QBJA7kACYsef1X_0Z","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugw5RY3Jgi5ZKd0rJTh4AaABAg.ACPiaRNcYV4AD6EVM4dvv8","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgxPWTX-Ney8A1b-AYh4AaABAg.ACOTh2V31EVACQDPHy_sHM","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxPWTX-Ney8A1b-AYh4AaABAg.ACOTh2V31EVACRLOiBBt8k","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
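To cross-check a coded row against the raw model output, the JSON array can be parsed and indexed by comment id. A minimal Python sketch (the record layout mirrors the response above, abbreviated to two of the ten records; the helper name `index_codings` is illustrative, not part of any tool shown here):

```python
import json

# Raw model output: a JSON array of coding records, one per comment.
# Abbreviated sample using two real records from the response above.
raw = '''[
 {"id":"ytr_UgwmijS6ORinBNiU97l4AaABAg.ACPn9QBJA7kACYsef1X_0Z",
  "responsibility":"developer","reasoning":"consequentialist",
  "policy":"ban","emotion":"fear"},
 {"id":"ytr_Ugw5RY3Jgi5ZKd0rJTh4AaABAg.ACPiaRNcYV4AD6EVM4dvv8",
  "responsibility":"developer","reasoning":"consequentialist",
  "policy":"regulate","emotion":"mixed"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse a raw LLM response and index records by comment id,
    checking that every record carries all four dimensions."""
    by_id = {}
    for rec in json.loads(payload):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
# Look up the record behind the coded comment shown above.
print(codings["ytr_UgwmijS6ORinBNiU97l4AaABAg.ACPn9QBJA7kACYsef1X_0Z"]["policy"])
# → ban
```

The lookup result ("ban") matches the Coding Result table for the comment, confirming the table was filled from this raw response.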