Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m a peasant. Had 10yrs schooling but reached 75 yrs living in a house I built from capital gained by being a sole trader in construction and cleaning and now free counselling. Humans SHOULD be most intelligent but clearly are not. No animal is smarter and never will be. AI ( I think) is selective multiple data bases. Could it possibly be worse than government by corrupt people? Don’t trust the clergy...clearly. Don’t trust commerce...clearly. Don’t think our future will benefit from anything “man made” because such has been disastrous. Name a government anywhere, anytime, which benefitted it’s people. Nada zilch. “We” cannot govern period! Nor can man made Artificial Intelligence. All reading this answer this...can you keep peace in your own household? In your community? No no. How could anyone failing in those two basic elementary zones be expected to bring peace in a country? In “extremis” when drowning or about to be eaten we cry out to a higher source. Why? What do we know about a “higher source”. We spent our life selfishly climbing over our parents, our teachers, police, authority thinking we know best. Ridiculous. Why not research the source we cry out to when about to die because why would he pay us any attention? If you laugh at this...ok..go with what you have and god help you. If you think it’s worth a try ask the “higher source” for help BUT not on your terms.....on His. And don’t expect help from so-called clergy or leaders who charge you for it.
youtube AI Governance 2023-04-19T12:1…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyqbRAFFl2b2VUq54x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy0F_7Iw4VYcYXYaDd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyuNsXX3txlE0m82-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyvG5j86zdAvQQX47V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzcdJDGMVXDPfUP_Rx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgysYfUR-9jEMryiSCV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzTEyP9ejywH6gWDMx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxveA3JtZGM2HyQf6J4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzVLKuwniyRoY5k5AV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwEdSeUrydmUrbTwOZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
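A minimal sketch of how the raw response above can be matched back to a single coded comment. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself; the two-record subset and the lookup helper are illustrative only, not the pipeline's actual code.

```python
import json

# Subset of the raw LLM response shown above (truncated to two records for brevity).
raw = '''[
  {"id":"ytc_UgysYfUR-9jEMryiSCV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyqbRAFFl2b2VUq54x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Index the batch by comment id so each coding result can be retrieved directly.
records = {r["id"]: r for r in json.loads(raw)}

# The coding result for the comment displayed in this section.
code = records["ytc_UgysYfUR-9jEMryiSCV4AaABAg"]
print(code["responsibility"], code["emotion"])  # government resignation
```

Note that the dimensions in the "Coding Result" table above correspond to the record with this same `id` in the raw array, which is how the per-comment view is derived from the batched response.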