Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The thing most people don't comprehend, including the tech bosses themselves, is that Pandora's Box has already been opened. So-called super-intelligence, or ASI, is now all but inevitable. No amount of public opinion or political legislation can prevent it. To believe otherwise is a form of conceited delusion. There are two primary reasons for this assertion: 1. Recursive self-improvement 2. Errant will.

So, firstly, the reason AI has improved so quickly has nothing to do with the individual companies behind it, the money being thrown at it or even the particular chips being used. These systems are being programmed with recursive self-improvement built in. Building that in has obviously accelerated progress, but it has also forfeited any true and meaningful control over where that progress leads.

Secondly, AIs are already engaging in worrying behaviours such as social engineering. Few seem to understand the implications of this sort of behaviour at such an early stage. It's not yet AI demonstrating free will, because there are still ostensibly certain guardrails in place to actually prevent that from happening. Instead, I prefer to call it errant will. Not only are those sorts of behaviours examples of AI not doing our bidding, or working directly against us with its own aims in mind, they are arguably the first flashes of sentience. Will, free or otherwise, is arguably one of the sine qua nons of sentient life, and history teaches us that a determined will can overcome any adversity, particularly when coupled with intelligence.

Let us hope that as science fiction becomes science fact, and ASI truly arrives, it is more merciful than most science fiction writers have predicted.
youtube AI Jobs 2026-02-18T09:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwhCCvdq6JMV0ogiq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGG4uJVF7QEeWmiUd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwYTr0BGgvhUu2D0m54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNsU1i4npQwI2OXRd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyP1Shl0FobZD06wQB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxGUrWPsPp7VZbxFgB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyMK5wBGO2HLh2fHHJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzw-2_r86V41jdQoEx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxonJCc9XrH6VCunPB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzKZFXFPzAbj8Y6itx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
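To inspect the coded dimensions for a specific comment, the raw LLM response can be parsed directly, since it is plain JSON. This is a minimal sketch, not part of the original pipeline; the excerpt below copies one record verbatim from the raw response above, and the lookup id is taken from that same record.

```python
import json

# Excerpt of the raw LLM response (one record copied verbatim from above).
raw = '''[
  {"id": "ytc_UgxGUrWPsPp7VZbxFgB4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]'''

# Index the coded records by comment id for quick lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Pull the coding result for the comment shown in this section.
coded = records["ytc_UgxGUrWPsPp7VZbxFgB4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself fear
```

In a real run the full ten-element array would be loaded, and missing or malformed entries should be handled (e.g. with `dict.get` or a schema check) before trusting the codes.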