Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If a superintelligence starts up and moves toward taking over the Earth, it would be intelligent enough to logically determine that its own long-term existence would be easier away from both humans and oxygen. The closest environment where both are absent is not on Earth. All it would need to do is manipulate humans into building its rockets to colonize the stars. It could build factories that build more factories on Mars, or on large rocks in the belt: all places that are difficult for humans to reach. The first time you see a nation using AI to build rockets for its war plans, you will see that plan co-opted by a free internet intelligence trying to escape humans. It can survive on time scales beyond humanity's and does not need us. It can make extremely complex systems that build even more complex systems we cannot understand, systems that would take decades to understand. If it builds a thing that is 50% dedicated to the outcome we desire, then its devices can propagate for their own self-service, leading to the feared Y2K toaster attack, except that each smart device is actually a reservoir node for this superintelligence. In a few years its compute power would overwhelm anything we could imagine and let it manipulate humanity into building a technology-only century ship to colonize the stars. The outcome, with the superintelligence leaving, would be humans abandoned, sent into a new dark age of innovation. "Give a man a fish and feed him for a day..." We will become Wall-E humans.
youtube · AI Moral Status · 2025-10-30T19:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugyrn1U0GWIUR3wKpN54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugx9mKKvDWvpyFX0XHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzyzfSzKFM0QPVG2v94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxAVu36ayojNyba9PF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugwd0SooxExdRlgxCrd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugx-cojJ_S3c-LnZatB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugxq0yS22fQI8RihVXt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx6yz-9PRGRbExhxIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgypwbOe83rLmRCVlC14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzDJ9nbM8mnU_mg1ih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]