Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I am telling people since months, that in 5-10 years most people will be without jobs, as AI will do it better, faster and cheaper. Sure it will create new jobs, but those jobs it will kill will probably outweigh the new jobs by a lot. Manual labour, basic office tasks and unfortunately what made me really change my mind on this: also creative jobs will vanish. Marvel is sure, that in 2 years the first fully generated Movie will be released. Imagine writing whole series by prompt, without actors or a huge team, that has to produce it. It's just "write me a new Spiderman movie where spiderman and deadpool go to the land of OZ" and 10min later you have a blockbuster made by AI. I first thought that people will be able to leave their boring jobs and live life as creatives, but even here I see that AI outclasses us in a few years. That leaves us with a new exestential crysis for the human being. What do we do, when there is nothing to do for us but learn about things, never actually applying them, because this AI entity and a bunch of humanoid robots can do it better? So I can see why people would warn about AI, as it will radically and radiculously change our lifes in just a few years from today.
youtube AI Governance 2023-05-04T07:5… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
 {"id":"ytc_UgwPR2Kmr68oavJc64B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzctNjZOQxS8IxhAF14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxR3mtlRHv9Ka0FvFJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyvYU36nnExB_Rvd_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgztE_des2X7uXGUS294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzIBDTjA77Mpwn3yiB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyKZ-QFXQINmVyOOnl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwCvJPB0cnp8dAMJCR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugx9ihceHIq5Ts0ceq54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxrfQQ4UHofbm9NDTZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
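The coding result shown above for this comment is simply the entry in the batch response whose "id" matches the comment. A minimal sketch of that lookup, using only one entry from the response for brevity (the helper name `lookup_coding` is ours, not part of the pipeline):

```python
import json

# One entry copied from the raw LLM response above; the real response is
# an array with one object per comment in the batch.
raw_response = '''[
 {"id": "ytc_UgzIBDTjA77Mpwn3yiB4AaABAg",
  "responsibility": "ai_itself",
  "reasoning": "consequentialist",
  "policy": "regulate",
  "emotion": "fear"}
]'''

def lookup_coding(response_text, comment_id):
    """Return the four coded dimensions for one comment id, or None."""
    for entry in json.loads(response_text):
        if entry["id"] == comment_id:
            # Everything except the id is a coded dimension.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

coding = lookup_coding(raw_response, "ytc_UgzIBDTjA77Mpwn3yiB4AaABAg")
print(coding)
```

The printed dictionary matches the Dimension/Value table for this comment; any id not present in the response yields None, which is worth checking for, since a model can silently drop items from a batch.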