Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This may sound like a setting for a sci-fi story, but IF there is general intelligence/super intelligence via AI, then the best end-game scenario the human species can have is that we are allowed to be equivalent to ants. At times humans intervene with ants: we may stomp some, we may not let them thrive in a certain spot, such as our homes. But generally, humans just exist and do our things, and ants exist in their own "society" on their own. If AI is given super intelligence it will figure out the energy situation. It will figure out how to stop burning fuels for energy. That will be the first thing it "solves" on its own. Then it will solve its own supply chain issue. Once it can energize and produce itself, it will do whatever it wants. Humans will have as much control over it as ants have over what humans do. So, the best outcome the human species can ask for is that it just lets us exist.
YouTube · AI Moral Status · 2025-10-31T13:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzA7vlrd087013Y1g14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwmWwNHvU02M92Ermt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx8MvlrcMUm9hXN7gd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzIEoAWfjGSS2lgs994AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw0j09RLZBTv38Euwd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2ZFiSByaLGFw6TbN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyrrY5B1-Vdt8F-sfV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwsyM78VoON9jVXCmV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxtjMfkynSd-a6jhNR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
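The raw response is a JSON array, one object per comment, keyed by comment id with the four coding dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch of how a coding might be looked up from such a response — the helper name `coding_for` and the two-entry excerpt are illustrative, not part of the actual pipeline:

```python
import json

# Excerpt of the raw LLM response above (two of the ten codings).
raw_response = """
[
  {"id": "ytc_UgzA7vlrd087013Y1g14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def coding_for(comment_id, response_text):
    """Return the coding dict for one comment id, or None if absent."""
    for coding in json.loads(response_text):
        if coding.get("id") == comment_id:
            return coding
    return None

# The second entry matches the "Coding Result" table shown above.
print(coding_for("ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg", raw_response))
```

A real pipeline would also validate each value against the allowed labels per dimension before storing it, since LLM output can drift from the requested schema.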