Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I majored in history, so I'm curious to know whether there are any historical parallels. And there is. Slave-based societies. So we can infer some likely outcomes based on the past. The problem with slaves is that, while they produce, they do not consume. A large number of cheap, or effectively free, labourers also force down wages and job opportunities for freemen. Fewer people consuming leads to the breakdown of trade networks. The wealthy can only consume so much, and there aren't enough of them to keep the economy stable. Governments might choose to implement a UBI program to deal with the mass of unemployed and poor people. However, if you have more people taking out of the system than putting into it, you'll quickly spread yourself too thin. This is one of the factors that led to the fall of Rome in the West starting in the 3rd century. Similar issues cropped up in Spain and the Soviet Union. AI is just another form of slavery. At least from an economic perspective. I'm not concerned about a singularity and self-sustaining superintelligent AI taking over, because the economy will collapse before that happens. That's not much of a comfort, I know. However, civilization has a habit of hitting the reset button whenever our hubris gets the better of us. Every black swan moment has a precedent before it. Of course, hindsight is 20/20.
youtube AI Governance 2025-09-05T00:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzogvgdeZZT27L1nQR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwZOSBem_S4jp6zi4F4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyxVMYZ0vfPBPMjJT14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxXzcegaH3vE66k4Bl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyVT96GP3sbHjCzxfZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwBWekUysiyxMEFA8h4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz0uyft_NaZgv-Sn094AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwUSYRI56_zGojpAHp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugx1e8BeeR1NIP37FCZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwr_T_LSeTovQV9VYN4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
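The raw response above is a JSON array with one coding object per comment. A minimal sketch of looking up the coding for a single comment id, assuming the field names shown in the response (the helper name `coding_for` is illustrative, not part of any tool in this document):

```python
import json

# Hypothetical raw LLM response, abbreviated to one entry; the real response
# above contains ten such objects with the same fields.
raw = '''[
  {"id": "ytc_UgwBWekUysiyxMEFA8h4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

def coding_for(raw_response, comment_id):
    """Return the coding dict for comment_id, or None if it is absent."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

coding = coding_for(raw, "ytc_UgwBWekUysiyxMEFA8h4AaABAg")
print(coding["emotion"])  # fear
```

Returning None for missing ids (rather than raising) makes it easy to spot comments the model skipped when cross-checking the coded table against the raw output.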