Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Just a thought here on the concern about AI taking jobs away from humans. This may in fact be a great opportunity to correct some wrongs. At the moment, and probably for some years to come, we'll be adding more humans to the earth's population, something the earth doesn't need, nor us, nor the other creatures that inhabit this planet with us. However, there's no special reason why there needs to be more of us. There's no magic number that, once reached, means we stop having children. Currently, most countries around the world are in demographic decline, not because of disease, warfare, or government policy, but rather because of the choices being made by women. Children on the farm are an asset. In the cities they are a financial burden. If things play out gradually over time as AI develops and we adjust to that, we can cut our populations far below what they are today. Something like one or two billion people on earth is still a lot of people, but it would be much easier to employ them all with fewer of them, and along with the improvements in life made possible by AI they could live very good, enriching lives. It certainly would be much better for the other creatures that share this planet with us as well. One more thing: I can't think of any reason why AI would feel it needs to remove us, whether it thinks it needs us or not. What point would there be? What would be in it for the machines?
Source: YouTube · AI Governance · 2025-06-17T00:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyVc-UZ2dkvHyEukOV4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgzIjkqw5I1rcjyVDd94AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxXzVi5uDjkMMVemaB4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugzw5t6dCieY_LWUgS94AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgzrcUCcepluvOLZtHF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugxur9JZ7QcTUAIGU3Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgwMjUYLcHo26T-VhLV4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxBykVkB0Sl533ntc14AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxD6wgAoXA7K5TYuqB4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugx299_ERZ9aYgXkBdF4AaABAg", "responsibility": "developer", "reasoning": "mixed",            "policy": "liability", "emotion": "mixed"}
]
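A raw batch response like the one above is only usable if every record carries a known label on every dimension. The sketch below shows one way to parse and validate such a response in Python; it is a minimal illustration, and the allowed label sets are inferred from the values observed in this response rather than taken from an official codebook.

```python
import json

# Two entries excerpted from the raw LLM response above (the full array has ten).
RAW = """[
  {"id": "ytc_UgyVc-UZ2dkvHyEukOV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzIjkqw5I1rcjyVDd94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Label sets inferred from the observed output; the real codebook may differ (assumption).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a batch coding response and reject any record with an unknown label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} label {rec.get(dim)!r}")
    return records

codings = parse_codings(RAW)
print(len(codings), codings[0]["reasoning"])  # 2 mixed
```

In practice a validator like this catches the most common failure mode of LLM-coded batches: an off-codebook label (a typo or an invented category) that would otherwise silently pollute downstream tallies.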