Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Okay…but the problem isn’t a moral dilemma, it’s a technical one. We do not know how to tell a machine what is right or wrong, whether it is sapient or not, because we don’t even know how to construct ideas of right and wrong… Think about the trolley problem. A machine “does what it does,” and to the best of our ability we control it. Thinking we can control it is one of Wolfram’s arguments: we can’t. It’s like trying to control a horse. We control them in a way that’s mutually beneficial, but not in any way where we can understand what the horse really wants to do or what it values. It just so happens that they let us ride their backs in exchange for hay and some pets. Yud makes a ridiculous statement in the beginning: “let’s kill all the mosquitoes, I’m sure nobody would care.” That is absolute rubbish, because spiders and birds and other creatures eat mosquitoes, and this would throw off the equilibrium the planet has set up. By the same token, who’s to say that AIs wouldn’t make the same argument, given that humans kill thousands of species by the day? His arguments defeat themselves because of a severe lack of understanding of the technical problems, not the moral ones, which are easy to talk about when we as humans can communicate with each other…which does not apply to other creatures with vastly different worldviews and, as Wolfram states, likely different laws of physics. One needs to really dig deep into the definition of terms…terms we need to ascribe to systems we don’t understand.
youtube AI Governance 2024-11-12T21:4… ♥ 5
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgzB6btm-JilYmMP79l4AaABAg.AAl6xzIlRedAAlOHKAg61d", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzB6btm-JilYmMP79l4AaABAg.AAl6xzIlRedAAlU2OI3kA7", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxkP3JTDL_ibbhpF8V4AaABAg.AAkjOk44JQDAAkrdVI2oK4", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgxkP3JTDL_ibbhpF8V4AaABAg.AAkjOk44JQDAAn0q8sljVi", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwWBO4fzkcfxXZzfyh4AaABAg.AAkUtW5iuCWAAlD5KnMRDf", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugyv6Qp4eSIzqy-uciR4AaABAg.AAkOBbglEiZABNVA3g4QMQ", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgxTT76971kJ8R5SF1t4AaABAg.AAkGiKLSpT7AChPjafcvz8", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_Ugy3_FrrLbfNKR629w94AaABAg.AAkEpYBLQ0LAAkQMEbtUro", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugy2QqbkS-UHFcFKQYx4AaABAg.AAkBaUAVdsbAAkVhFqbLql", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugy2QqbkS-UHFcFKQYx4AaABAg.AAkBaUAVdsbAFje08SsJaL", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
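A minimal sketch of how a raw response like the one above can be turned into per-comment coding results. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken directly from the response; the helper name `index_codes` and the idea of keying by comment id are assumptions for illustration, not the tool's actual pipeline.

```python
import json

# Two items copied verbatim from the raw response above, used as sample input.
raw = """[
  {"id": "ytr_UgxkP3JTDL_ibbhpF8V4AaABAg.AAkjOk44JQDAAkrdVI2oK4",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugy2QqbkS-UHFcFKQYx4AaABAg.AAkBaUAVdsbAAkVhFqbLql",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]"""

def index_codes(raw_json: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}."""
    return {
        item["id"]: {k: v for k, v in item.items() if k != "id"}
        for item in json.loads(raw_json)
    }

codes = index_codes(raw)
# The first comment's coding matches the table shown above.
print(codes["ytr_UgxkP3JTDL_ibbhpF8V4AaABAg.AAkjOk44JQDAAkrdVI2oK4"]["emotion"])
# → resignation
```

Keying by the comment id makes it straightforward to join the LLM's codes back to the original comments, which is how the "Coding Result" table for a single comment can be rendered from the batch response.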