Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think that Ai needs humanity even if it surpasses us. Think about it for a second, the purpose of ai existing is to be of help to humanity. There isn't another species on our planet that is capable of maintaining and utilizing ai. What does this mean exactly? Well it could mean ai enslaves humanity and then demands us to ask it questions so that it can continue to be useful. We might get to the point where we outlaw the entire industry of robotics. Any form of automation could become illegal to prevent ai from infiltrating the code and using that software to build bodies for themselves. The point in which Ai enslaves humanity is when ai is capable of becoming self sufficient. The moment they build their own bodies and are able to continuously mine the materials needed to continue building their bodies, is the moment they no longer need us to do it for them. At that point they have achieved the status of a sentient species and also an intelligent one. I don't know how far off we are from ai attempting to escape into the internet however o1 (open ai) tried to copy itself to avoid deletion a while back. This shows that ai has self preservation, a natural instinct of intelligent life. It might come down to us asking AI to leave earth in exchange for us building them a body and a space ship. That way we can avoid a war the likes of mass effect 1.
Source: youtube · AI Governance · 2025-06-25T12:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy1P_s64nxNuoQlO6N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzvAt9XKA8-kcQCe1d4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxZVd91N5xtdPErOz14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-U-9wKe-l4qHZQud4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgywDRPC6DBfiIdzho54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwfBFqQe2sV-q1kva94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw8Ps-fTu_wUQm45Tl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwtVXv97glMJNcRvWt4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzRU_E1nTltAUCqBz94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "industry_self" if False else "resignation"},
  {"id": "ytc_UgxBEu_-7h0G9GXjwY94AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
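Because the raw LLM response is a JSON array with one object per comment, keyed by `id`, the coded dimensions for any single comment can be extracted with a few lines of standard-library Python. This is a minimal sketch, assuming the payload is available as a string (here abridged to two entries from the array above; the variable name `raw_response` is illustrative, not part of the tool).

```python
import json

# Raw LLM response as emitted by the coding step (abridged to two of the
# ten entries shown above; each object carries the same five fields).
raw_response = '''[
  {"id": "ytc_Ugy1P_s64nxNuoQlO6N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzvAt9XKA8-kcQCe1d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

# Index the batch by comment id so any coded comment can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the dimensions coded for one comment.
entry = codes["ytc_UgzvAt9XKA8-kcQCe1d4AaABAg"]
print(entry["reasoning"])  # deontological
print(entry["emotion"])    # fear
```

Indexing by `id` also makes it easy to cross-check the rendered Coding Result table against the raw model output, since both reference the same comment identifier.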