Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID. Random samples are shown below.
- "In fact, he will be the reason a lot of people will die. And bad guy? Says who…" (ytc_UgzgbtLgz…)
- "The explosion of LLMs is coinciding perfectly with the loneliness epidemic (or r…" (rdc_mli4bk9)
- "Stay safe and build healthy habits guys, your future doctor is passing their tes…" (ytc_UgwAiDJmJ…)
- "An AI was recently caught trying to escape its lab. It mined crypto currency for…" (ytr_UgxpMzx8e…)
- "I recently started following an ex-gen ai user and they're already doing a,amzin…" (ytc_Ugxjl9yrR…)
- "Except well, this video is technically scripted. and it provided some form of br…" (ytc_UgzvWJiW8…)
- "True enough, except that with AI, many companies and particularly the most heavi…" (rdc_oi2772u)
- "Here's the hard bottom line of it.. Art, is a skill, not a talent.. Being artis…" (ytc_UgxGRM6ip…)
Comment
Almost every job on earth can and will be automated. People just aren't waking up. We will all be completely and utterly economically useless within 20 years tops. The only reason people will have any value to other people is because they're human. The problem is, that's not enough of a reason for most people to be considered economically worthy of receiving income. This is why so many people from big names like Andrew Yang to Elon Musk to Yuval Noah Harari have tried to raise the alarm bells here saying guys, this is a MASSIVE paradigm shift, and the only good future possible is if we create a Universal Basic Income and automate survival itself. If we fail to automate survival, we will see civilizations either collapse and/or fall into dictatorships with evil sociopaths claiming they'll protect everyone's jobs as long as everyone votes them dictator for life.
Source: youtube · AI Jobs · 2023-07-31T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgznzuyBYPFVWLVxoHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxu9-jYmIBftDJNrvJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx1N5yNRiwCPGQKlJ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwSRcPCCVZkWaNEC2B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyRI9TbW1_hINeQ_Ol4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPyD7eF_30NyaTFVx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-jyBv95w75ckxTg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyg7Cae94xrcASVEgV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzyYoRz_4U9ymxmPa14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyR_ldnMhyV33YpC2t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
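The raw response above is a JSON array of per-comment coding objects, one per batched comment. A minimal sketch of how such a response could be parsed and indexed by comment ID — assuming exactly this shape, with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields (the function and variable names here are illustrative, not part of the actual pipeline):

```python
import json

# Truncated excerpt of a raw LLM response in the assumed shape:
# a JSON array of coding objects, one per comment.
raw_response = """
[
  {"id":"ytc_UgznzuyBYPFVWLVxoHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxu9-jYmIBftDJNrvJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    skipping any record that lacks one of the expected fields."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgznzuyBYPFVWLVxoHV4AaABAg"]["emotion"])  # fear
```

Indexing by `id` is what makes the "look up by comment ID" view above cheap: each dimension for a coded comment is a single dictionary access.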