Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What will happen to us with AI? We will become hedonistic psychopaths feeding on others' suffering and turning our species into monsters. Exactly what happened to the Eldar in the Warhammer 40k franchise. Because of overwhelming comfort and laziness, the Eldar became bored of everything after having solved every problem with technology. Working? No more need to, so it gets boring. Enjoying food? Boring, everything is possible without effort. Art? Boring, because everything will be done by AI and costs nothing. Sex? Love will become even more sexualized (on the basis of the sexual liberation promoted by leftists and progressives in our time) to the point that people will only be stimulated by brutal sexual intercourse. This kind of behaviour led the Eldar of Warhammer first into seeking masochistic suffering. Then, growing bored, they turned to the suffering of others while discovering the infinite ways to make it last. At this point the Eldar became the Drukhari: a personification of narcissistic perversion that regenerates itself on the suffering of others. The kind of behaviour sociopaths and psychopaths have in our time. Remove purpose and cohesion from our societies, remove problem solving and art with AI, give infinite wealth, and everybody will be bored. Boredom and lack of faith in anything will destroy us. And AI will be the main contributor to our downfall.
YouTube | AI Governance | 2025-09-10T07:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz63OfxBliCfCNCtX14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-Uj7gmJ_G_un2QcV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzT5BFX_ObMWcGoOwZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxjuBVIIZ2zzrDFAuN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwt6pUfvgUUpyJ4-CZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzEGJj-M4hxda4eDU54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzL7gMRgA9taffEUel4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVG-0gs6dgk4hvYfp4AaABAg", "responsibility": "developer", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxHYi7aBP5rRdbBQbx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy62p-AoL1jjucJe7p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
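The raw response is a JSON array of per-comment codes, so recovering the coding for any displayed comment is a matter of indexing by id. A minimal sketch (the variable names are illustrative; the string below is an excerpt of the batch shown above, trimmed to two entries):

```python
import json

# Excerpt of the raw LLM batch response: a JSON array of per-comment codes.
raw_response = '''[
  {"id": "ytc_Ugz63OfxBliCfCNCtX14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwt6pUfvgUUpyJ4-CZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# Index the batch by comment id for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
code = codes["ytc_Ugwt6pUfvgUUpyJ4-CZ4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → ai_itself consequentialist ban fear
```

The four printed values match the Coding Result table above, which is exactly the cross-check this page is meant to support.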