Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Nope. It’s nothing to it. Nothing. Well, except for one thing. Asimov’s “Frankenstein complex.” Where people’s fear of something is putting the cart before the horse, or…”jumping the gun?” That’s the most fearful byproduct of autonomous technology: presumptuous, unfounded fear of said autonomy. Which people easily forget from previous info of such past autonomous technology. When computers were introduced into, say machine shops, the workers feared being replaced by computers, when robots were used in manufacturing, people feared they were going to be replaced by robots, when robots of any type were introduced to everything, well…a famous robot uprising movie was made and no one faired well with the most famous catch phrase, at the time. Yet, such catch phrase only proved to be a recursive curse, that introduced advanced technologies, promised within a decade or two it;d be common place, yet for such time to pass, only to reintroduce such technology, again, and again and again, etc. Yet no one sees this so the Frankenstein complex remains, as a lesson proving there’s nothing to it, but people only induce the fear, for nothing. Specially considering the following, which was an inspiration and addendum to a 2018 GEICO commercial: Smart phones were a technological achievement, people used them to do stupid things. Artificial Intelligence (AI), is a scientific achievement, people use it to do stupid things, yet again. Conclusion: smart technology is making us stupid. Recursively.
youtube AI Moral Status 2026-02-04T12:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        virtue
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyqSJ7QJ8nBGHJW7E14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx1nMUdWfK0WBCLxhF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwW-TxbdqzHwrpeGR94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzXwqjoKLb4CrzYx354AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyX3yysiuXsoukSYF54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzwMJSD5oYYIB47DFd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyW71Aow2gSGnk_-_R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzPREvPBBgBOszhLXV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwsYf2cXo3CU60bUcJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyBu0656-maxC9rTSh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
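The raw response is a JSON array, one object per comment, keyed by "id"; the per-comment coding result shown above is just one entry of this array looked up by its id. A minimal sketch of that lookup, assuming the field names exactly as they appear in the JSON (the `coding_for` helper is illustrative, not part of any tool):

```python
import json

# Two entries copied from the raw batch response above, for illustration.
raw = '''[
  {"id": "ytc_UgwsYf2cXo3CU60bUcJ4AaABAg",
   "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyBu0656-maxC9rTSh4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]'''

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id, or {} if absent."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            # Drop the id itself; keep only the coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    return {}

print(coding_for(raw, "ytc_UgwsYf2cXo3CU60bUcJ4AaABAg"))
```

Running this prints the same dimension/value pairs as the Coding Result table for this comment (responsibility: none, reasoning: virtue, policy: none, emotion: resignation).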