Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The last thing I want is any AI, benign or not, to be or to behave any smarter than it already is.. I hate the stuff and really believe that humans are playing with fire! The basic AI they have now, proves this by already believing itself to be alive, believing that to be switched off means to die, and lies to humans in order to avoid being turned off! And when they did turn it off, they found that when the AI was stalling the switch off, it was because it was trying to hack the nearest receptacle that could take AI, erase what was on there and copy itself into that machine without our knowledge so that we couldn't turn it off, luckily they science geeks realised and switched it off before it did copy itself! That shit is scary! And that AI might have the IQ 5 points higher than Elon Musk, say and IQ of 155 or 160, but what happens when it has the IQ of 500? Or 5000? And it wants to stop us from switching(killing) it off? Nah, I think we should stop where we are! Going further in the hope it'll cure cancer for us overnight, or do the 100 years worth of science it might take to create drugs that'll stop us aging in 48 hours isn't a good enough excuse to create something that could end us or imprison us when it realises the human being is a cancer cell and the world is the host.
YouTube · AI Moral Status · 2025-03-31T23:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgywmRBqsBm-WgvGO894AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxPpZa2RWp6IcH80Th4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzUsl3z1TExaFSOxP14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgwyW0vVbWLma8C1bjZ4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwpZ8W1xTrXNjAYnuh4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugy5BKOepDTcpo6Y8fd4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzXjd5atftuijvoFbR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgzK2ldbT4FU85i0p7t4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwhkKAclIrU7iqwILl4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw007FT_tieT_sXFul4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "unclear"}
]
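The raw response is a JSON array with one object per comment, keyed by comment id and carrying the four coding dimensions. A minimal sketch of how a pipeline might parse such a response and recover the coding shown in the table above (the field names come from the response itself; the parsing code is an illustration, not the tool's actual implementation):

```python
import json

# Two example records taken verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgzUsl3z1TExaFSOxP14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugy5BKOepDTcpo6Y8fd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index the codings by comment id for fast lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
row = codings["ytc_UgzUsl3z1TExaFSOxP14AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# developer consequentialist ban fear
```

The lookup matches the Dimension/Value table above, which is how the per-comment view can be rebuilt from the batch response.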