Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is not the machines or AI, the problem is that they are owned by individuals and not everybody. Imagine a medieval village where someone comes and says "here, have this machine that will do all your work". I doubt anyone would have a problem with that. After all technology is there to do our work and make our lives easier. But imagine the same village where the lord says "I have this machine now that can do all your work, so get the hell out you are no longer needed and I dont care weather you live or die". If the means of production are owned by the few while the rest is excluded from the chain of economic value generation then there is no need to feed, clothe, house and care for them. They are irrelevant in a economic and thus political and societal sense. Or, to quote Kurt Vonnegut, machines are by default slaves and humans that are forced to compete with machines are thus also slaves! And a universal basic income would merely be a handout designed to keep the system going but what is needed is a structural change with a real redistribution of wealth (from top to bottom for a change) and some form of shared ownership of the means of production (ie the machines/AI) with the majority of society! Technology should serve everybody, not just the ones who own it!
youtube AI Harm Incident 2024-07-29T19:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           regulate
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_UgylCzhJ5kjxN4V1jBZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzeXdHjI8Hz8OcJIQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"}, {"id":"ytc_Ugz8oFctO7uDNNQceWt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxOJ4tOGwUndXqZOQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwuWnXo_sQ0jdkPoKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzIsJXGoBj3rTjjiZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyWApOcaReKjOLjbfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugz0WfgqHd3J77Qq23J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwVQLTsqv2a2GjbtHR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugzb3me3dVm0CeQXB_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]