Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Somehow this has been the most humane and truthful conversation that I've witnes…" (ytc_Ugx1D0nFt…)
- "We need a disaster caused by AI. And soon. It needs to be significant enough, wh…" (ytc_Ugz--w5v9…)
- "Why the fuck would this guy just bully random people for no reason? Just for usi…" (ytc_UgxShNtHe…)
- "Do you think we will be able to collectively organize, as a human species, in pr…" (ytc_Ugwg3JKRc…)
- "Jobs that are generally not legal, ethical, or feasible for AI to perform entire…" (ytr_UgyRnUiQu…)
- "I think this video really misunderstands AI and machine learning. No one is prog…" (ytc_Ugx3sLKS_…)
- "I judge Sam on how ChatGP5 is working. OpenAI is doing great as far as I am conc…" (ytc_UgxoPpB2X…)
- "as a artist ( not very good ) ai is actually taking gibli from real artists and …" (ytc_UgxL-jWiK…)
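Because the page truncates comment IDs for display, resolving a displayed ID back to a full one takes a prefix match against the store of coded results. A minimal sketch; `find_by_id_prefix` and the store shape are illustrative assumptions, not part of the actual tool:

```python
def find_by_id_prefix(coded: dict[str, dict], shown_id: str) -> list[str]:
    """Return full comment IDs matching an ID as displayed in the UI.

    `coded` maps full comment IDs to their coding rows (assumed shape);
    `shown_id` is the truncated form, e.g. "ytc_Ugx1D0nFt…".
    """
    prefix = shown_id.rstrip("…")  # displayed IDs end with an ellipsis
    return [cid for cid in coded if cid.startswith(prefix)]
```

A prefix can in principle match more than one full ID, which is why the helper returns a list rather than a single hit.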
Comment

> The worry is the population that is way too higher to sustain without jobs. The other thing is even if your hours of work is reduced by AI but what about companies? They won’t reduce hours of working but instead work output will increase. AI will only benefit the owners but not the employees, the employees will only get squeezed more.

youtube · AI Jobs · 2025-06-29T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_9S-Cj1thwsc3d0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyJRzZa9evBy-BnAEd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOofGLOz7MjFk64qd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzDeCj48gHzk8xJo454AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-cl9LNVumeyZ18gZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkUt3Y5ieyx1Jk9B14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugytu4ncboFzhXcnJyt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxeenAIOP3VTxCNJYN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzm6FSNRRGAbq9R_9d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy6CtSgRPePvzpXyl54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
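A downstream consumer has to parse this raw batch output and guard against rows the model coded outside the schema. A minimal sketch, assuming the allowed values per dimension are exactly those visible in the sample above (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"resignation", "fear", "mixed", "indifference", "outrage", "approval"},
}

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index schema-valid rows by comment ID."""
    rows = json.loads(raw)
    coded: dict[str, dict] = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # drop rows the model emitted without an ID
        # keep only rows whose every coded value is within the allowed set
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = row
    return coded
```

Indexing by ID makes it easy to join the model's codes back to the original comments, and silently dropping malformed rows can be swapped for logging or a retry depending on how strict the pipeline needs to be.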