Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Replace "AI" with "social media" in this post and the problem becomes crystal cl…" (rdc_mujb784)
- "I have to ask a legitimate question and not to be seeming as though I am making …" (ytc_Ugzm3dW1a…)
- "Wow. I will give a good argument for it. ai art shouldnt be made and say this is…" (ytc_UgzhZjFRI…)
- "I'm a bit ambivalent about AI in general. I work in an industry which is startin…" (ytc_UgzZTjkIg…)
- "If you spend a little time with AI and you have any expertise at all you will no…" (ytc_UgyfefvNm…)
- "I'd been wondering about this as well as how it drives in bad weather when you a…" (rdc_d1kjnrm)
- "I swear to god we are 1-2 years away from an AI Kurt Cobain doing an advert for …" (rdc_o62ohfl)
- "Strawman argument I see all the time. A better argument would be that art has no…" (ytc_UgwUBcuMW…)
Comment
How you should listen to the first part of this video: McKinsey advisors did research into topic they know nothing about, made a bullish guess on wiping out 20 % of labour market, tech sector has already been hit meaning that in general it is slowing down, AI still isn't replacing it is merely assisting and trust me it won't be replacing any time soon (eventhough the assisting has cost some jobs and slowed down hiring) CEO's selling the AI hype say the hype can get bigger, like duhh thats what those ceo's want us to believe true or not. And yes we should be worried about people losing jobs to automation but the solution is so simple, we lose jobs not because jobs are lost but because they don't require humans. that means we can keep living at the same standard but we need to work on distributing wealth more.
youtube · AI Jobs · 2025-06-05T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyX_l9NMncXb51W5qh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzywVx2WCJz-8drzHB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMpRFvM_t9ZV3Xu3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2DBRt6YrYGPnZY-V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQRGrgoTQJdUJkPlt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxkW2wN3aLN2eay-jZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-oPYJh0obQcbR5rt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJtLwrd1tEBpDajOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_MBA92yg5dB9euh14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyunEZ1Nj4M5sxroN94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
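A coded comment can be pulled out of a raw response like the one above by parsing the JSON array and matching on the `id` field. Below is a minimal sketch of that lookup; the `lookup_coding` helper is hypothetical (not part of this tool), and the sample array reuses two entries from the response shown above.

```python
import json

# Two entries copied from the raw LLM response above: a JSON array
# with one coding object per comment ID.
raw_response = """
[
  {"id": "ytc_UgxkW2wN3aLN2eay-jZ4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyunEZ1Nj4M5sxroN94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgxkW2wN3aLN2eay-jZ4AaABAg")
print(coding["responsibility"], coding["emotion"])  # company indifference
```

Matching on `id` rather than array position keeps the lookup robust if the model returns the coded comments in a different order than they were submitted.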