Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have to disagree with Neil's horse n buggy to automobile analogy. It's a false equivalency. He's comparing an improvement in the mode of transportation (both of which help humans) to something replacing what humans do for a living. As he states, there were industries that supported horses and buggies and that transitioned into similar industries for autos. He says, "...be creative and find something that AI can't do." Well, AI (and eventually androids and robots) will be able to repair / service themselves so what are we going to do? This reminds me of Isaac Asimov's Robot series. Society did not respond well. Humans spread to other planets, one of which became an elite society of isolated homes with no industry, not much to do, minimal physical contact/interaction. Earth became like the lower levels of Coruscant. A hellish overcrowded place devoid of nature. Methinks there are not enough humans with compassion and empathy to avoid this fate.
YouTube · AI Moral Status · 2025-08-17T21:4… · ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz1N0-KTg2UPsM-ZyB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzAfBo8Axs67eO7L5B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyQzdiRP5Cr20IJFCt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwbZbNatFA4Mm0o-bV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzvAkfe4ofQzwNgnyt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxlqzU1F0jjOfQhuFt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxjsMlrx4XJ-5J_oA54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxhpS1A5HsIZsFQUvt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwSmLHqncSLX5vugMJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzH6zoGgEfEPoyhC4l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]