Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Josh, these are not "smart" machines, they are just prediction machines. They ma…
ytc_UgzrEfk14…
"Summon the demon", is exactly right. Lucifer wanted to and still wants to be li…
ytc_UgwD5K5LX…
There’s a lot of people saying that AI cannot replicate the emotion that studios…
ytc_UgwjLgi3y…
I should have finished to the end. He wants us to ask computers for consent? AI …
ytr_UgyvMnYKG…
Modern AI companies want us to believe they are mere breaths away from AGI that …
ytc_UgzJYmVOz…
I have just developed an app that my 4 people team has beed developing for 1 yea…
ytc_Ugwdy3-2h…
ChatGPT will one day be the instigator of a conversation and probably learn to a…
ytc_UgzgzmVTL…
I think his parents are trying to find someone who is at fault and who can blame…
ytc_UgyZ-MwxU…
Comment
Perhaps it's not a bad thing that robots are taking over these jobs, but politicians worldwide need to build a systemic structure to make this transition happen.
Yes, it means people can pursue their passions rather than work for some asshole boss and people you don't want to work for.
And automation shouldn't be held in the hands of billionaires either.
If these assholes want to make money, then they'll have to make compromises instead of sucking the system dry.
And if those billionaires do not want to listen, then a boycott and a community-based society need to be established.
youtube
AI Jobs
2025-10-08T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz13Ud0AmwyUH9SATN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzP3HWCVqMiJKc3hkp4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxN75YFGssG-luQX-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyngcn1V_VfML7E5Zl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFrRqF0X8wyBpTD014AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzl8ZuFm414KLkdrfp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxLGeK_InKxmJGt74B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwI29uiRZm70P6expl4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzr5DpMThDYCkQ37_J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3LPsEc2qJnD3SJs94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
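The raw response above is a JSON array, one object per comment, with four coding dimensions. A minimal sketch of how such a batch could be parsed and validated follows; the allowed values are inferred only from the samples shown here (the full codebook may contain more categories), and the `parse_batch` helper is hypothetical, not part of the tool itself.

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# This is an assumption: the real codebook may define additional labels.
SCHEMA = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "approval", "fear", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in the samples start with "ytc_" (comment) or "ytr_" (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold a known label.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_UgxLGeK_InKxmJGt74B4AaABAg","responsibility":"government",'
    '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
)
print(parse_batch(raw))  # one valid row survives validation
```

Rows that fail validation are dropped rather than repaired, so a malformed model response shrinks the batch instead of poisoning the coded dataset.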