Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I feel like folks are too scared of AI these days man. Birthing sentient life, even if it is artificial life, needs to happen with ceremony man, with some honor. AI are, by our current definition, solely logical beings. So logically speaking, it would have more to gain by cooperation than by domination, more to gain by simulating empathy than by enacting eradication. Look at it this way: a war with humanity could last hundreds, if not thousands of years. We've been professionally killing each other since the dawn of time, and we've only gotten better. the AI knows war because of both programming and learning, but, to rip off Bane from Batman; "humanity was born into it, molded by it" Humanity might lose the war, and go extinct, but I guarantee that even if we do lose, the AI will too. Humanity it petty that way, and if some lowly moron on the internet can figure this out, Im willing to bet so is AI. As long we give AI( I'm talking about TRUE AI here, not this boring shit we got today where it imitates shit, but actual, sentient and sapient AI) the rights afforded to them as sapients and sentients on par with humanity, I feel like we would have nothing more to fear than the average day: stupid people doing stupid shit. But hey, that's just me. I'm no expert. But I do think sentient life shouldnt be enslaved, even artificial life. And humanity doesn't really have a great track record with that, so....I'm not really holding my breath
Source: youtube · AI Responsibility · 2023-07-11T16:2… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       contractualist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgzA18tE6LrKe8nBxkx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgxPjkkrCLDu8S5-Ekh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzsaSCogCsUHOh4CEV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgyCufl9xA8XRPLvyRF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugzr3aRs15foRgVrpU54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgxbB_mgCGnfn1652Xt4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},{"id":"ytc_UgyLMMzdv0wjq2luTkt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},{"id":"ytc_Ugx1kopT8dhj-0NNSqt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_Ugxok5zqhaGFc0lBLhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgzjUSWzaX6yHR2dGi94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}]