Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, here's what I disagree with. Elon is indeed thinking about the deep future, but when it comes to the economy you also have to think about the costs of such things. For example, is it really profitable to build a robot that will replace all the policemen? What would the parts and the upkeep of such a robot cost? Say someone trains the neural network so well that it could "replace" policemen: $50k to build it and $20k/year just to maintain it. Is that profitable? Of course not. And don't forget how furious people would be about such a decision; it would never be condoned by the general public. The same goes for engineers, electricians, surgeons, sports trainers, nurses; I could go on for hours. Same for teachers: kids have an emotional bond with them, so why would any parent send their kid to a robot school?

The only people who will get replaced are those who work in the digital world, since that is far easier to replace. If you're thinking deep, deep into the future, then yes, we might get the blue-collar workers replaced too, but if that happens, only in the US and some third-world countries. The "red collar" is tied directly to the digital world and can therefore be replaced easily. We will need a rebranding of society. Everyone now wants to be a graphic designer, web programmer, and so on, but we need people like construction workers, engineers, and electricians: jobs that the newer generation DOESN'T WANT TO DO! Not the "people will be taken care of by the AI" scenario, but a scenario where people will have to rebrand.
YouTube AI Governance 2024-07-11T09:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwUskLnBLAOknJkgQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxiI673b_R_gZjuaw54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxIPWHrwb6H50VMdRF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx5eIWMYHAc-6gPWbF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxsKE_awP7ODo2zeaR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRoTtpTFa1Sr5WFmV4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyttExFUjXJR_gi_v14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz3NfItEzZKpg-yRh54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzmjISoR37cmwMIpIN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxX34vJKfDvEMWPP7N4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
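A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before the per-comment codes are stored. The allowed value sets below are inferred from the values that appear in this run, not from a documented codebook, and the `parse_codes` helper is a hypothetical name introduced here for illustration:

```python
import json

# Allowed values per dimension, inferred from the codes observed in this
# batch (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "resignation", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse the raw model output (a JSON array of per-comment codes)
    and keep only records whose four dimensions are all in-schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one record taken from the batch above:
raw = ('[{"id":"ytc_UgyttExFUjXJR_gi_v14AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # mixed
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; a stricter pipeline might instead log out-of-schema records for manual review.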