Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We can talk forever about the doomsday visions. There is a solution, and I never hear anyone talk about it. All the technological inventions are a blessing. Why did we invent machines, robots, and now AI? The very core of all these inventions is the same: "The machine should take the work off my shoulders." Now the machines - eventually AI - are doing exactly that, and humans react narrow-mindedly with fear instead of celebrating the upcoming blessing of freedom from work slavery. So, how to solve this? If a machine, a robot, or an AI does the work of x amount of people, the company has to pay tax, and this goes into the pool of an unconditional basic income. We have to talk about restructuring the entire monetary system and adjusting the laws to the new situation, which is not that new anymore. The industrial revolution, robots working in huge factories, and all that has been ongoing for many decades. Politics didn't react, and they failed to adjust the system accordingly. If one robot at a car factory does the work of 10 people, the machine must pay a basic income for 10 people. Put a maximum cap on how much one single person can own, and funnel that money again into the pot for the basic income. It is a blessing that we now have machines and AI to do our work. This means we are finally free and can do what we really love to do, and not waste our entire lives working ourselves to death for survival, wealth, or simply to feed the family. People are pretty simple-minded. They love to focus on problems and fear, discuss all the details, and find someone to blame. Hardly anyone is talking about a good way into a good future.
YouTube · AI Governance · 2025-10-30T20:3…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgworqxWBQVfwFn6Xuh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgyUuaK2xEg2JAFHoaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyYYMD8oaSsIgyqId94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzeTvSaPfAMLT7cXux4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugy_3BS3WOcdEQdHs_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugw0Lmqm-uB_UC4p_n94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugznd2Tsv3RQHgSVpwV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugzs3cZeD8fv20ptl6Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugweg6mYLzYr0pPl5-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwRosgy1bvg2Ejcs0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"})
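The all-"unclear" values in the coding result above are consistent with a failed parse: the raw response closes with `)` where a JSON array requires `]`, so a strict JSON parser rejects the whole batch. A minimal sketch of how such a pipeline might extract one comment's codes, with an all-"unclear" fallback on parse failure (the helper name and fallback behavior are assumptions for illustration, not the app's actual implementation):

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def extract_codes(raw_response: str, comment_id: str) -> dict:
    """Hypothetical helper: pull one comment's codes out of the model's
    JSON-array response. Any parse failure (e.g. a stray ')' instead of
    the closing ']') yields 'unclear' for every dimension."""
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        # Malformed output: no codes are recoverable for any comment.
        return {dim: "unclear" for dim in DIMENSIONS}
    for record in records:
        if record.get("id") == comment_id:
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    # Comment id missing from the batch: same fallback.
    return {dim: "unclear" for dim in DIMENSIONS}
```

Under this assumption, feeding the malformed response above through `extract_codes` would return "unclear" on every dimension regardless of the comment id, matching the table.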