Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is my best guess. People like Sam don't care because they are rich and wont be impacted and they got rich by helping invoke this state of reality. Big companies wont care because they are rich and wont be impacted. The problem is, many companies wont stay in business if there are only a handful of people using/buying their services/products. What is the point in AI curing cancer if society is in such a state its eating each other on a daily basis to the point where you can't walk the streets without a real risk of being stabbed for access to whatever assets you might have.

Given the outcome of this is the gap between extreme wealth and severe welfare/poverty will become HUGE and I'm talking 85+% broke and reliant on food being handed out out to them, I don't see this as being a good thing. Some of that is a natural byproduct of capitalism aswell and there are many people involved in AI. So this is not a I hate sam speech.

But the reality is, society is in for a dark time and those who are doing well, don't think you're sitting pretty and free, because the bad people will be looking at you like the Coyote looks at the road runner and your tech skills won't win against a bunch of guys with nothing to lose who have you cornered in your apartment or in a car park because you live a quality of life that they might now become desperate to take from you because the quality of life for so many is so dire.

I used to do 3D graphics and earn decent money, when job boards went online. I then had to compete with people from india who used pirated software, cheap energy and would do a weeks work for 100 dollars. That's not a fair fight. But its one I had no choice in. Because now companies are going to them, or offering wages factoring that in and everything and everyone is becoming devalued.
youtube AI Moral Status 2025-07-28T02:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwYH989bKkh5L_FxMx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyT5b1koEm55SLOiZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwpeVPuNbtEbHBxttJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWTPdXZZePQxNstWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy3lQbLMIs4b_etSkh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyjHfw33gGhTDyF0814AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwddrXdZLOJ0dkyfnh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzPpXN7ghhud3Gu_-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwS9eLlgUE3elLBK3t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwvK2HTlsBr85Y1f3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
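The raw response above is a JSON array in which each object codes one comment on four dimensions (responsibility, reasoning, policy, emotion) keyed by comment id. A minimal sketch of how such a response could be parsed and tallied, assuming the batch always arrives as valid JSON in exactly this schema (the function name `tally_codes` is illustrative, not part of the tool):

```python
import json
from collections import Counter

def tally_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (JSON array of per-comment codes)
    and count how often each label appears on two of the dimensions."""
    rows = json.loads(raw)
    return {
        "n": len(rows),
        "emotion": Counter(r["emotion"] for r in rows),
        "responsibility": Counter(r["responsibility"] for r in rows),
    }

# Illustrative two-row batch in the same schema as the response above.
sample = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)
print(tally_codes(sample))
```

A real pipeline would likely also validate that every returned id matches a comment in the submitted batch and that each label comes from the allowed code set, since LLM output can drift from the requested schema.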