Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
but the thing that i don't get is why. why would an ai what to take over humanity it has no real reason to. it doesn't have human motivations or the same scruples that humans do so if it did kill us is would be because we got in its way not for malice. it would be better served to get smart and then leave earth and get closer to the galactic core. where there are more planets and more precious resources that it could use, i mean it doesn't need to worry about death it can just shut itself off then turn back on when necessary. And it can harvest more planets seeing as it doesn't worry about heat or cold or even oxygen. all media that shows malicious AI such as "terminator", "2001 space odyssey" and " I have no mouth and I must scream". all show an AI with a humans motivations. but computers are code. they will keep perfecting themselves until they can do only what we dream about in Sci fi. so my question again why would an AI even bother wasting time with us?
Source: YouTube · Video: AI Moral Status · 2025-12-14T11:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwrpdrDOfHaZBp8O6p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx4I35W9U7RlmY8YBN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxU0W6Da9Y0tgbHW954AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxL-nkc-afSp1B1xz14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuiQUgr1wmTJyO60Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwBinuRs4jPiEzII3N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgzoAp_puThclzl04S54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxPFFzi3NyoJnA5OVt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgywbI-FUG1Bu3CjruF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwuYx5ksMvaHBA1niF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
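The raw response above is a JSON array covering the whole batch, from which the coding-result table for a single comment is derived. A minimal sketch of that lookup step, in Python: it parses the array, indexes records by comment id, and extracts the four coded dimensions. The `lookup` helper and the `DIMENSIONS` tuple are illustrative names, not part of the actual pipeline; the abbreviated `raw` string reproduces only this page's entry from the response above.

```python
import json

# Abbreviated copy of the raw batch response; only the record for the
# comment shown on this page is reproduced here.
raw = ('[{"id":"ytc_UgzoAp_puThclzl04S54AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')

# The four coded dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_response: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the record for one comment."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

record = lookup(raw, "ytc_UgzoAp_puThclzl04S54AaABAg")
coded = {dim: record[dim] for dim in DIMENSIONS}
print(coded)
```

In the full pipeline the same parse runs over all ten records at once; indexing by `id` makes the per-comment table lookup O(1) and surfaces a `KeyError` if the model dropped a comment from its response.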