Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Avid readers of science fiction have dealt and absorbed the AI 'problem' for decades. Cinema flashback; AI relationship, as in the movie "Her" (2013 with Joaquin Phoenix), HAL going mad (2001 Space Odyssey, Stanley Kubrick, 1968), early AI in movies: Fritz Lang's Metropolis (1927), in literature as early as 1872, Samuel Butler's novel Erewhon. Collective human awareness limps behind sf-readers with decades, all themes popularized through visual media. Thinking, learning and evolving the mind requires effort. Since machines, and now AI, habitually replace the effort, we foster generations of incompetence. There will always be exceptions. AI will perish along with us. It requires an enormous amount of energy. Instead of focusing on how we can use less energy, we try to find new sources to feed our voracious consumption. The end of an unsustainable world order is preordained. To understand the harsh nature of reality, we get more insight from studying metamorphic rocks than from AI. So say a reader of science fiction, who marvelled about AI (Artifical Insanity) long before the term was coined. Read Roger Waters lyrics Amused to Death, verse 5. And don't worry, don't be sad - things go as they must go.
youtube AI Moral Status 2025-06-08T05:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwMCVb0jmxiKKDBGFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyo4VuSFxK-G5m5emx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCy7CUEsc72g07kid4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWWo8Ofrq62S4JJK54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw1X9xTxowLrEd_2NR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwlVQSTgN8EiVuZdNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRSD6P-SwzeqOpYht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQzJu4eVuT9Iwo6LR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-hQjKsXsX3TpYzjp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwXTrgWo7_o8ZqE8F94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
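A raw response like the one above can be turned into per-comment records by parsing the JSON array and keying each row on its comment ID. The sketch below is an assumption about how such post-processing might look (the variable names and the validation set are illustrative, not part of the tool); only the first entry of the array is reproduced for brevity.

```python
import json

# Raw LLM response, truncated to the first record for illustration.
raw = ('[{"id":"ytc_UgwMCVb0jmxiKKDBGFB4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')

# Allowed values per dimension (assumed codebook, inferred from the output).
ALLOWED_REASONING = {"consequentialist", "deontological", "none"}

# Index each coded comment by its ID for easy lookup.
codes = {row["id"]: row for row in json.loads(raw)}

for row in codes.values():
    # Guard against the model emitting a label outside the codebook.
    assert row["reasoning"] in ALLOWED_REASONING, row

print(codes["ytc_UgwMCVb0jmxiKKDBGFB4AaABAg"]["emotion"])  # fear
```

Keying on the `id` field lets the coding result shown in the table above be joined back to the original comment text.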