Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I agree with everything that you are saying. I find it interesting that currently the focus is on “replacing” software engineers, but not many are admitting that other industry professions such as law, accounting etc can already be mostly replaced with the current state of AI and yet I do not see a lot of concern in those industries, atleast for now… Another thought I had, if in let’s say 10 years enterprise level applications could be spun up by AI, this would destroy the value tied to not only software engineers but also software in general. This would then be a hilarious case of the largest tech companies in the world devaluing all their own software with these investments, which seems unlikely. However since the dawn of AI humanity has shown time and time again that they will do anything for a profit, even if it’s dismantling their own companies value in the process or destroying livelihoods across many industries. I still believe policies and regulations should be put in place by the world organization to control what AI can be used for and one of those should be to rule against the replacement of jobs by AI, as many large companies such as Meta have suggested and are practicing.
youtube 2025-05-07T11:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz0uD4zmwEETI9Dxv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgySnbn_WXQucUaHNpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxKQyiIxVNw12INWrx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugx_ZKJk3ltF_CmDilN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwBaV5MK-WRDAKuoT94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugx3JbGBe4SewpyRX2Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugw4NfSQ_Y5Q5PR-JWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugwc-OWG7NjbC9wqpUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwzUVkWr8oG18iVwwp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzMgGy0qut1EoFYKUR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]