Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I disagree with Bernie here, on how to approach the situation. We should strive to automate all essential labor We should strive to make housing, education, healthcare, food and water, electricity, internet, public transportation, and other basic necessities, human rights. Guaranteed, not commodities. We should nationalize AI, among all other infrastructures that we DEPEND on, to ensure all essential needs are shared, not capitalized. AI is another means of production. And lastly, introduce AI globally to move past human leaderships that often don't respect scientific progress, into an eco-socialist society, sustainability is the key goal of our society, as this Earth has suffered from pollution enough. Thinking in tax isn't enough anymore, or people will never have human rights guaranteed. Star Trek is a clear example of how a system can run. Karl Marx had expressed his expectation that automation should in theory reduce human labor for better overall quality of life. Geoffrey Hinton expresses his faith that socialism will be a necessary trajectory in a world where AI can take any labor. If the left doesn't pick this standard up, we will never see an improvement in a system that doesn't plan to change. People have shown how democracy will sabotage itself, now is not the time to act sanctimonious, and it's time to take the means of production. This technology will revolutionise the STEM field, the progress must benefit this Earth.
youtube AI Jobs 2025-10-08T03:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwD-5B06cxXVe3sCAJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3j2NREebrl8bHFK14AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyz9EgGEj8FNqEso5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugymnl6eTvq5Cp3iQtt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy8rU_ciGhaUVP1SBJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgySktgPP3-zCVSBoMl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxHK6GIjtKguDLG4194AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwfgrb4nkOCl4I_1E54AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwWlZkLbQL6W2VZ_2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgybxO8SLB46qNeU33V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
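A raw response like the one above is only usable if every record carries an in-codebook value for each dimension. The sketch below shows one way to parse and filter such a response; note that the allowed-value sets are inferred from the responses shown here (the real codebook may define more categories), and `validate_records` is a hypothetical helper, not part of any pipeline shown in this document.

```python
import json

# Allowed codes per dimension -- ASSUMED, inferred from the raw response above;
# the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose every
    dimension holds an in-codebook value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Two records taken verbatim from the raw response above.
raw = """[
  {"id":"ytc_UgwD-5B06cxXVe3sCAJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3j2NREebrl8bHFK14AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]"""

valid = validate_records(raw)
print(len(valid))  # both records pass, so this prints 2
```

A record with a typo in any field (e.g. `"policy":"regluate"`) would simply be dropped, which makes malformed LLM output visible as a count mismatch rather than a silent coding error.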