Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
At the very very least, these apps should be required to be programmed to recogn…
ytc_UgyHgiC1L…
Ironic? Hypocritical? Funny how the same people who think a.i. is ripping off …
ytc_Ugy5t7DZG…
@EEEEEEEEE-o6d were talking about the over reliance of AI art here & this weirdo…
ytr_Ugx5E7JZR…
In the future we will destroy ourselves, but a few will survive by uploading the…
ytc_Ugy4wG5Rr…
No, AI is a tool, nothing more than a utility for humans, the supreme life form.…
ytc_UgxquSiiB…
honestyl but how did they replace construction worker and builder? giant robot t…
ytc_UgwEyq-hS…
AI art is great for those of us without artistic talent or shitloads of money.
…
ytc_UgyhOM1-y…
Ai isn’t the issue it’s the symptom of the bigger problem in our society. Our pr…
ytc_UgxZQiDog…
Comment
All that is really missing is the middleware that connects jobs to AI. For instance, in IT, handling IT administration: setting up new users, help desk support, VM deployment, patch management. Same for other departments: sales, marketing, HR, etc.
The biggest risk is prompt or AI injection, which enables a hacker to trick an AI system into doing nefarious tasks: wire-transferring $5M to an offshore bank account, running ransomware on all of the company's servers, or leaking confidential information. This is really the only factor that will prevent or delay widespread AI deployment. Systems will need to be configured to limit the AI's capabilities in a way that prevents it from circumventing security features.
youtube
AI Jobs
2026-02-24T17:3…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgySkEnSxUA4hLz41hF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzCf6lulGBXfBgdAyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhoeGG9FWyxdDa1Td4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuMLSL0R-3iPTwlZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKCFukqDsEIhjDOMt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQrw-eGKZbVnLmwpJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFR3AVNls0KTgOtKl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfZ4vOIMxG9Yp2HUt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyu-0yUAqCP6H3Dwnd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3oiciQzRwGIceFmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
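The raw response above is a JSON array, one object per coded comment, keyed by comment ID with one field per coding dimension. Below is a minimal sketch of how such a response could be parsed and validated; the function name `parse_coding_response` is hypothetical, and the allowed value sets are inferred only from the samples shown here (the full codebook may define more categories).

```python
import json

# Allowed values per coding dimension, inferred from the samples above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    mapping from comment ID to its coded dimensions, rejecting values
    outside the known category sets."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in dims.items():
            if dim in ALLOWED and value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgySkEnSxUA4hLz41hF4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
result = parse_coding_response(raw)
print(result["ytc_UgySkEnSxUA4hLz41hF4AaABAg"]["emotion"])  # fear
```

Validating against a closed category set catches the common failure mode where the model invents a label outside the codebook, so bad codes fail loudly instead of silently entering the results table.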