Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m in favor of it. I’m very optimistic. I have this vision of a world of abundance created through artificial intelligence, robotics, and everything that comes with it. A good example of this kind of optimistic outlook is Ray Kurzweil’s book The Singularity Is Near. My own view is that the economic and labor transformation won’t be limited to one sector. AI and robotics will reshape every industry, which means the vast majority of people simply won’t be able to work. At first, as many say, new jobs might appear like in previous industrial revolutions, but what we’re not acknowledging is that AI and robotics will eventually surpass us in everything we can do. We’ll have to find new ways to give meaning to our lives beyond the economy. And honestly, a world where we try to find meaning beyond money can’t possibly be worse than the one we already have. I’m not even talking about AI curing cancer and all that, which by now is basically common sense if you’re even slightly informed. So for me, the faster AI takes our jobs, the better. The sooner we get rid of this economic system, which is completely and inevitably rotten, the better. I couldn’t be happier about it. I just hope I’m right, because if not, we’re incredibly screwed. Then again… we already are.
Source: youtube · AI Jobs · 2025-10-08T03:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzgE65yZglvncFmif14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygGm7KZcWszdjvxul4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-Dg1LyJ3UXceH15V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxvTW0izo3vVTcQ0QR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxba0pZXt1SYBhdBf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsxDqWH1QV4qzbxv14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4BQIZuUV_ocdiyBB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw_MlQzlVTeYGbrd-x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzO599XfY9kMimYSZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyuaFo3rgNYUEoVPGl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
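To check a displayed coding against the raw model output, the JSON array can be parsed and indexed by comment id. The sketch below is a minimal example, assuming the field names shown in the raw response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and that the comment shown above corresponds to the first entry; the array is truncated to two entries for brevity.

```python
import json

# Raw LLM response as shown above (truncated to two of the ten entries).
raw = """[
  {"id":"ytc_UgzgE65yZglvncFmif14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygGm7KZcWszdjvxul4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment displayed above.
c = codings["ytc_UgzgE65yZglvncFmif14AaABAg"]
print(c["emotion"])         # approval
print(c["responsibility"])  # none
```

The same lookup can be repeated for any of the ten ids in the batch to confirm that the table on this page reflects the raw response.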