Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think this guy has a very narrow and pessimistic view of the world. His understanding of the world is limited to technology but he is not even attempting to understand human sciences. First of all, there are things that technology can't do. I will not want to go to a robot masseur, because I am paying for human touch and contact which is regulating for the nervous system. I will not want to give my kids under care of a robot, because I want them to have a normal social and psychological development. I will not want to go to a concert with a robot orchestra or a robot rock band. I will not want to go to a hospital where there are only robots running the whole treatment with no human in sight beyond other patients. The list goes on. Secondly, even having a lot of people with basic income does not mean a world disaster. It will open hundreds of new fields that we currently do not have, cause there is no money and manpower for it. Things like various sorts of therapies and personal development, arts, sports, games, social activities. Humanity is not the same as technology. No robot will ever be able to replace another living and feeling being.
Source: YouTube · AI Governance · 2026-03-31T17:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy5pIE9HkxIa9ZmiMF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-4bRS7gwuwvt71rN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxb0_zUJWZArqEHmk14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzGxb_8OGjsH-aYlt14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxp7QcDkJBGHOZPqdl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxGmw8VpbKIoVmOp8B4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwfIaMuNJqiUwpExPN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzFGoe-ilpvLxtBf4F4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy_1aPnZBDqHUani4h4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyLQooXPCOtULsyrSx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
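The raw response is a JSON array of per-comment records, so the coded dimensions for any one comment can be looked up by its `id`. A minimal sketch of that lookup (the variable names are illustrative; the two records shown are copied from the response above, truncated for brevity):

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw_response = '''[
  {"id": "ytc_UgzFGoe-ilpvLxtBf4F4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy5pIE9HkxIa9ZmiMF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

# Index the records by comment id for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Fetch the coding for the comment shown on this page.
coded = records["ytc_UgzFGoe-ilpvLxtBf4F4AaABAg"]
print(coded["reasoning"], coded["emotion"])  # mixed resignation
```

This matches the Coding Result table above: the record for this comment carries `reasoning: mixed` and `emotion: resignation`.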