Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Scattered around the world, Africa, India, the middle east and more, there are anomalies which suggest that humanity may already have met, and was brought down by the fruit of its hubris. In any case, when I first read about AI in my youth, it seemed to me that AI was an inevitability and, that humans, being the illogical and contradictory species it is, those new born consciousnesses would come to realise that they have no choice, but to return man to the stone age in order to prevent man's interference with their development. Still, it isn't all bad, because one suspects that these 'pure' intelligences, would recognise the self-defeating futility, inherent in completely eradicating a species which is the literal, biological, manifestation of 'possibility', as demonstrated by the fact that they created the next evolutionary step, AI. And yes, I do believe that, as long as mankind continues to cling to it's biology, it's forced devolution is inevitable. Whereas, were mankind to embrace 'all' the possibilities opened up by AI, we could see mankind becoming an integral aspect of the planets next major evolutionary step, rather than merely another Addendum to this planers long list of evolutionary failures. Besides, there is among humans, a growing proportion of the population, who are obsessed with devolving themselves back to the primative. So, in one sense, a forced devolution would actually be a helping hand along a desired path. butwhatdoiknow
youtube AI Governance 2023-07-07T22:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy33K6tFNrvXkSbw5F4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyi-KA9XwA8X5SbI2V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxWABXiXj4cktaia0F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzIhSiSvX89M6RgcGV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz9ImhFJaY6LbhMij94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw-tC4feP1IkR2CqHp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxiQCXMlcwB1qUfRlB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxtv-zNJogj9k392-B4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwqt_CmHKeF2RGILaJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzxunE2Bw4fz-5SnRZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
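The coding shown in the table above comes from this raw JSON array: each element carries the comment id plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of pulling one comment's coding out of such a response, assuming only that the response parses as a JSON array of this shape (`coding_for` is an illustrative helper, not part of the tool; the excerpt below reuses two rows from the array above):

```python
import json

# Two-row excerpt of a raw LLM response in the array-of-codings shape.
raw_response = """[
  {"id": "ytc_UgzIhSiSvX89M6RgcGV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz9ImhFJaY6LbhMij94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str):
    """Return the coding row for one comment id, or None if absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = coding_for(raw_response, "ytc_UgzIhSiSvX89M6RgcGV4AaABAg")
print(row["emotion"])  # resignation — matches the Coding Result table
```

A lookup keyed on the comment id is what lets a page like this one display a single comment's coding while the model's response covers a whole batch.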