Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Neil just wants a computer that will make his coffee, and may be satisfied with that. Unfortunately, most employers want a computer that can do what each of their employees do - exponentially faster, and (if they have little need to be at the bleeding front edge of their industry) notably cheaper. Does he believe that the employer will keep paying that employee so they can live and eat? That they will (as Bill Gates surmises) suddenly free to pursue more creative endeavors (most of which won't pay them enough to live and eat, especially now that AI can do those too)? Most of Engineering isn't being able to do the hard stuff off the top of your head, it's knowing how and where to find bulletproof solutions that can be quickly and accurately implemented, and having a reasonable level of education and experience which allows you recognize quickly if your math starts to go off track. The old argument was that if automation takes your job, learn how to automate, but those jobs are already being taken by younger people who are willing to work for less right now. Without regulation, AI is about to cause unemployment to skyrocket and disposable income to crater, especially with the current government deciding that higher education in the U.S. should be a luxury. Having worked in a couple of different collapsing trade industries (over 80% reduction of their workforces) while the rest of the economy scraped by normally, it's hard to not be a pessimist about this.
Source: youtube · AI Moral Status · 2025-07-24T16:1…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugxh38fDeZr3JxYD3a14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugy_eSNzFDMHO_E6QJx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx3rHdi0g0EcLlDg554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgxIG1GRTkchTBO5Pd94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzJJ4MPrdQbno5KI3p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgzbqyJZJh7v-OcV6R54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_Ugzq9Gz_UOu8_ZJVsSV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxejZPyCvue7NvOrat4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyuV9KJHCGqHHQjT6x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgztL_pW_521o7TagOl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"} ]