Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The irony is that the large IT corporation that I work for had a webinar for Mental Health Week, where people basically recommended using AI chatbots as a replacement for a therapist. Not a word of working less was spoken. I used to think that AI was mostly going to affect first-contact help desk support, but corporations are pushing AI on employees, despite AI not being a real selling point for the majority of situations. I lost out on a promotion because apparently a large amount of the company's spending went into AI. You're right, AI absolutely is making people more productive, but this is going to affect jobs. While I don't think we necessarily need to keep human jobs open simply to pay people, we absolutely need solutions. If AI can do someone's job more effectively, then maybe use AI. At the end of the day, we still have two core issues; who is even going to make money to be the consumers, and why do we base so much our own worth around the ability to make money? I quite literally can't even think of how the future is going to be if AI and robots are doing the majority of the work. Only those who can afford the outputs will be funding the venture. Yes, this could break the US if the consumers shift to international customers, or only the top 1%. This could create a society that's unsustainable.
youtube AI Jobs 2025-10-08T19:4… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxRfU3UQq0aDXxKC8Z4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgwokcFQimcntbWXdzl4AaABAg", "responsibility": "company",     "reasoning": "mixed",            "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugw3riuaSLuvvSeAOW54AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugx4DT2LCD49b-RiPI54AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgzQQv3-NVR5rzUZKt14AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgwdwfyH_McIGAaFD5F4AaABAg", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_UgzxeT1D9GFx9Vksa854AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyDY66qjHqTFtEWlKJ4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_Ugx1JTSP7bgf_b8Uj-x4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgyDN6-sTApnrwz9eq94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",   "emotion": "outrage"}
]
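The raw response is a JSON array of coding objects keyed by comment id, and the Coding Result table above is simply the entry whose id matches this comment. A minimal sketch of that lookup step (the variable names are illustrative; the payload is abbreviated to the one entry relevant here):

```python
import json

# Raw LLM response: a JSON array with one coding object per comment id
# (abbreviated to the entry for the comment shown above).
raw = """[
  {"id": "ytc_Ugw3riuaSLuvvSeAOW54AaABAg",
   "responsibility": "company",
   "reasoning": "deontological",
   "policy": "none",
   "emotion": "outrage"}
]"""

# Index the codings by comment id so any coded comment can be inspected.
codings = {row["id"]: row for row in json.loads(raw)}

# Pull the coding for this comment; its fields populate the result table.
coding = codings["ytc_Ugw3riuaSLuvvSeAOW54AaABAg"]
print(coding["reasoning"], coding["emotion"])  # deontological outrage
```

Any id from the array can be substituted in the lookup to inspect a different comment's coded dimensions.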