Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I truly am not buying that all the creators and developers of ai from its birth to today did not see and don’t see what’s going to happen with ai. How it’s going to control everything and either what humans are in power with ai or ai alone eventually renders humans useless. What? Everyone thinks we get to just sit on this earth “doing what we want or hobby’s” and live happily ever after? As soon as ai reaches a certain point and takes over everything, the general population will be looked at as a net negative and no longer serve a purpose to operate civilization other than to consume. They’ll wipe us off the face of the earth as soon as it’s ready. With ai that point, it will be so far more advanced than humans that it will unequivocally know the right time, place and how it will wipe us out. Yet, we will never stand a chance. Whoever survives will die from the elements, starvation,disease or a combination. The only things that would keep ai from fulfilling this is keeping far as possible from creating its own endless power supply and reliant on humans for its own survival….. but its ai so it inevitably figure it out.
youtube AI Governance 2025-09-05T16:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyTh3chJ7UWT2KfBo14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwyZnR7sPbm1W3PiJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy2LHX7nWUZQlsLhSd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxT84sihJSxyfMmFYN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy0fEcRfZieWcmZt4p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzmQKRkJFS5IGwJCTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIlmA1NEMC1TxfBHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwigfiJUEciUnTkIY54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxUMQG7hNqhe4iULP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwsf7rhHjK4X4Psf9t4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
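A batch response like the one above can be machine-checked before the codes are stored. The sketch below is a minimal validator, assuming the allowed value sets inferred from the codes visible on this page (the tool's actual schema may include values not shown here); the `validate_codes` helper and the `ALLOWED` mapping are hypothetical names, not part of the tool.

```python
import json

# Assumed value sets per coding dimension, inferred from the codes
# visible in this response; the real codebook may allow more values.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "indifference", "fear", "outrage", "resignation"},
}


def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in this dataset carry a "ytc_" prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records


# Usage with the record that matches the coding result shown above.
raw = (
    '[{"id":"ytc_UgxT84sihJSxyfMmFYN4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"fear"}]'
)
codes = validate_codes(raw)
print(codes[0]["emotion"])  # fear
```

A validator like this catches the common failure mode of LLM coding runs: the model returning a label outside the codebook, which would otherwise be silently written into the results table.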