Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It might be a good thing! We shouldn't only see the negative side of technological progress. Of course, most of us will lose our jobs to robotics and AI in the next few decades, and that spans all industries, not just truck driving. Some experts say that in the next 30 years, about 80 to 90% of current jobs might not exist in their current form. And of course, it will be hard for many people until new regulations and a global basic income come into effect. But we should remember that it can also help many people escape extreme poverty. The first industrial revolution happened around 1800, when mechanized factories started operating. It changed most people's lifestyle. Before that, 80% of people worldwide lived in extreme poverty, and now it's just 9%! Just from 1990 to now, after the internet and digital revolution, it fell from 36% to 9%, thanks to the economic prosperity these revolutions bring. And of course, in each revolution many types of jobs changed a lot. It's hard for the first generations, but it will benefit the world in the long term. Still, about 800 million people are undernourished and don't have access to the most basic human needs, and many are dying daily. I believe robotics and AI would reduce these numbers much further (through fully automated agriculture and …), but it's sad that many would become unemployed at first. So overall, I'm interested in and optimistic about the future, and I'll invest in this Aurora company because I'm sure it's going to grow much more in the next 5 years; this fourth revolution is just getting started, and it's going to be much bigger than the previous three.
youtube AI Jobs 2025-06-02T22:0… ♥ 2
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw56pt6gXwIdeW9iw54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxWUXp45W_bYRjgniV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzy5RyfvjAQI-pJ75l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxZkmag-tmxLebugZB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxEwhmeI2pJb_-p9ol4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzqdtHxVZJII30QBQR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxTng2V7AfE5f9KC6t4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzp12FseXhETWT2a8R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzuj-nrom-j_RalJ4d4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRdb8ew2AxrAglMdF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
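The raw response is a JSON array of per-comment codings, keyed by comment id. A minimal sketch of how such a response could be turned into a lookup table of dimensions per comment (the variable names here are illustrative, not part of any coding pipeline; the `raw` string is an excerpt of two of the ten entries above):

```python
import json

# Excerpt of the raw LLM response above (two of the ten entries).
raw = '''[
  {"id":"ytc_Ugw56pt6gXwIdeW9iw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzy5RyfvjAQI-pJ75l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Index the codings by comment id so one comment's dimensions
# can be looked up directly.
coded = {row["id"]: row for row in json.loads(raw)}

# The third entry in the full array corresponds to the comment shown
# on this page, hence the "resignation" emotion in the coding table.
dims = coded["ytc_Ugzy5RyfvjAQI-pJ75l4AaABAg"]
print(dims["reasoning"], dims["emotion"])  # consequentialist resignation
```

Keying by id rather than array position guards against the model reordering entries in its response.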