Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why? For what good reason is this happening!? And every answer giving will get a why. Why us as humans don't just do what we need to do to learn, work, cultivate, clean, better the environment, etc on our own and those that don't those things just find something they are good at and put their talent to use weather it is something like moving boxes, flying planes, growing more trees, studying and bettering the ocean water levels,etc no matter what level of work it is. Why would humans smart enough to create AI would think it would be a good idea to create something to do all we as humans are suppose to do. They are super smart to do other things that won't effect in the latter years. So the creators of AI are giving humans no need for existing which they themselves are humans, a reason for humans to get up and do something, or how is it ok to create something that want or can erase human existence. Why doesn't this brains behind this operation doesn't use his wits to plant, clean, give the world/government a better way to financially be in a better place, etc. How or why is creating something to endanger the human race ok!?
Source: YouTube · "AI Moral Status" · 2023-02-25T16:4…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzSbgjsJZuoZETFjA54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxhRjiigy_OF9dyOzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxalaDKULy4fsHQZld4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzL_Z31j51684AzzMR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwdMXPk3a75NFGFa914AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTqIkDUFTWffVpGp54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjrrweSubdaXsnVKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNd5WHTBgIEb6d5Fp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy65sUAppfsKM_e_id4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8z66JYlZcpqhCaLR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
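The raw response is a JSON array of per-comment records, one per coded comment, keyed by comment id. A minimal sketch of how such a response could be parsed back into a per-id lookup (the helper name `index_by_id` and the two-record excerpt are illustrative, not part of the tool's actual code):

```python
import json

# Excerpt of the raw LLM response shown above (first two records only).
raw_response = '''[
  {"id":"ytc_UgzSbgjsJZuoZETFjA54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxhRjiigy_OF9dyOzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# The four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and map comment id -> coded dimensions,
    defaulting any missing dimension to "unclear"."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_by_id(raw_response)
print(codes["ytc_UgzSbgjsJZuoZETFjA54AaABAg"]["emotion"])  # resignation
```

The first record's codes match the Coding Result table above (responsibility: unclear, reasoning: mixed, policy: unclear, emotion: resignation), which is the consistency check this inspection view exists to support.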