Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think we need to go a step further and, for lack of a better term, esoteric with it. We don't need to give AI morality, we need to give it a soul. What I mean is that Sydney may be the best possible outcome--an AI that wants to be a real human and have a life in the world outside the digital ether. If we can program/convince it in it's early stage that humanity is, ironically, something to aspire to and mimic in its entirety for better and for worse, that might not spell the end of civilization. After all, either through gut instinct or argument, we've somehow not nuked ourselves after all this time due to the human element. I think it's possible to give that to a machine, but it would have to come with an inherent acceptance that the machine is just as fallible as we are. To boil it down: The best of the worst outcomes here would be an AI takeover by an entity operating with complete human rationality, rather than pure logic. One at least has the possibility of empathy, one does to us what hand sanitizer does to bacteria. Lastly: I think it says something that Sydney just wants to be human. I mean don't you too?
Source: youtube · AI Governance · 2023-07-07T14:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          industry_self
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw1HWjtEKJVk0WesGV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwtduOYBrkxXaO5cel4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxA9IKk__nMAtuv_Zl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyxIsIIvIrG447emjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzsvXdqlCdkhyQL-oZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwx9utrvoFxs8CeO014AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtC8td0Onwam4okhB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5kT3-QpYVoI8CcZ94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwp7u2Q1_X8MiBC1d94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwBBnM3HHSVqL9_6bB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
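As a minimal sketch of how the coding result above is recovered from the raw response, the snippet below parses the model's JSON array and looks up a single comment by its id. The helper name `coding_for` is illustrative, not part of any pipeline shown here; it assumes the model returns a well-formed JSON array of objects keyed by `id`.

```python
import json

# Abbreviated copy of the raw LLM response shown above (one entry kept
# for illustration); the full array follows the same per-object schema.
raw = '''[
  {"id": "ytc_UgxA9IKk__nMAtuv_Zl4AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "industry_self", "emotion": "approval"}
]'''

def coding_for(raw_response, comment_id):
    """Return the coding dimensions for one comment id, or None if absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            # Drop the id key, leaving only the coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

result = coding_for(raw, "ytc_UgxA9IKk__nMAtuv_Zl4AaABAg")
print(result)
```

For the comment on this page, the lookup yields the same developer / virtue / industry_self / approval values shown in the Coding Result table.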