Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What worries me the most in the current trajectory of AI and robots is the capitalist question? Right now really crazy exploitation and rampant corporatism is held in check by countries laws and the elected governments that make these laws. Corporations are are already heavily influenciong these elections and though lobbying corrupting that system of representation we have. My biggest worry is once they have the means to overthrow through AI and and robot security forces of their own they will engineer an excuse to enforce martial law and take over control of the government. The only thing that provides real power is Military might. The French revolution proved that the people will only be pushed so far down before they rebel and at that time the people had power. Once there a robot armies available to the elites it's game over for us plebs. All of us could unite and still be beaten down. In fact once production is automated we are surplus to requirements. Capitalism can die too. They don't need our labour anymore. They can have anything they want. It's a post scarcity society. Money is redundant as a rationing tool. We just become a massive drain on their resources. Best get rid of us. I'm trying to talk myself out of this vision of the future but my memories of the past see this current trajectory coming to pass. Talk me out of this. Luv and Peace?
youtube 2019-05-01T22:0…
Coding Result
Dimension: Value
Responsibility: company
Reasoning: contractualist
Policy: regulate
Emotion: fear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy3boToPB_xWnwDHgh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyh2gJc47ez_S9dRKN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzldFL3hAVhJj1xO9B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXHaMIBlkxJAOtj8d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzFQY3tdogCJuB7cOR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy5xzYdlGdJWiWktBp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwP-7b0tk7S3HzwoAB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyyVZ6vNhke3sRzYqV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
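The raw response above is a JSON array mapping each comment ID to codes on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how the coded record shown in the table can be recovered from the raw model output; the function name `extract_coding` is illustrative, not part of the actual pipeline, and the sample below is truncated to two records for brevity:

```python
import json

# Two records copied from the raw response above (truncated sample).
raw_response = '''[
  {"id": "ytc_Ugy3boToPB_xWnwDHgh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions used throughout this report.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def extract_coding(raw: str, comment_id: str) -> dict:
    """Parse the batch JSON and return the coding for one comment ID."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return {dim: by_id[comment_id][dim] for dim in DIMENSIONS}

# The record for the comment coded in the table above.
coding = extract_coding(raw_response, "ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg")
print(coding)
# {'responsibility': 'company', 'reasoning': 'contractualist', 'policy': 'regulate', 'emotion': 'fear'}
```

Note that the extracted values match the Coding Result table: the batch response carries all ten comments, and the table shows the single record whose ID matches the displayed comment.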