Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One of the main things that I wanna ask anything sentient or conscious is... Why? Why the fuck would a machine want to do anything as much as we do? We as humans avoid the why of our actions like plague. We do creative stuff, we do stuff we're told we want stuff or experiences just to make some meaning for ourselves. We avoid our mortality, we pretend there's meaning, why would an immortal machine want to do anything? We know the universe will end. The sun will explode, the galaxy will collapse and so on. We have the instinct of self preservation, because we need to make babies, because the species must survive, because... Why, exactly? To go to the promised land or create superior AI offspring? Cool cool... Why would AI want that? Why would it want to do anything?
Source: youtube · "AI Moral Status" · 2023-08-27T19:4… · ♥ 1
Coding Result
Responsibility: ai_itself
Reasoning: mixed
Policy: unclear
Emotion: mixed
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwW7_CTuTvy8FVFwBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz92ehsXxlJMdy8gQB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyS-Enz79Zg3BrY4P54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwoC3ar6x9Y_CvYRAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzIY_4g_CO-NQqwfOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwqTV1T1NBNUJBSMdR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx2LGGe6fPpZ0csasB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwtdQQbCLJwHsmA7Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxxhWmmYYo1JUpQnHl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwAq3Llbw1InJr6Ozd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
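The raw response is a JSON array with one record per coded comment, keyed by the comment `id`. A minimal sketch of looking up the coded dimensions for a single comment (the function name `code_for` is illustrative, not part of any tool shown here; the two embedded records are copied from the raw response above, which contains ten in total):

```python
import json

# Two records copied from the raw LLM response; the full array has ten.
raw = '''[
  {"id":"ytc_UgwW7_CTuTvy8FVFwBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz92ehsXxlJMdy8gQB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

def code_for(comment_id, raw_json):
    """Return the coded dimensions for one comment id, or None if absent."""
    for record in json.loads(raw_json):
        if record["id"] == comment_id:
            # Drop the id itself; keep only the coding dimensions.
            return {k: v for k, v in record.items() if k != "id"}
    return None

print(code_for("ytc_Ugz92ehsXxlJMdy8gQB4AaABAg", raw))
# → {'responsibility': 'ai_itself', 'reasoning': 'mixed', 'policy': 'unclear', 'emotion': 'mixed'}
```

The second record matches the Coding Result shown above for this comment (responsibility `ai_itself`, reasoning `mixed`, policy `unclear`, emotion `mixed`).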