Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by its comment ID or by browsing the random samples below.

Random samples
- I cannot be polite to a machine. It just have to do what I demand from it for wh… (ytc_UgwZlxBz3…)
- This Uber Self driving car would NEVER make it in Rome, Italy, in fact, it would… (ytc_Ugx1-RYHm…)
- BOOM *mic drop* I'm still learning to draw it's always motivating to watch you :… (ytc_UgxWC3eio…)
- They ready have a model for this using RPA in Power Automate to run apps. Yes yo… (rdc_oh2qanp)
- @emptyvienna As i very clearly explained, you can't just trade concepts betwee… (ytr_Ugyjwwy0m…)
- So if people working on artificial intelligence do not know how it's doing what … (ytc_Ugwrwf5dj…)
- 8:36: What an oxymoron to talk about safety and in the same sentence say "AI sys… (ytc_UgwM0Rl6-…)
- @Dawn-MercuryYou can work in Google Docs offline. It’s not wasting time nor is … (ytr_UgzSPsIsi…)
Comment
A lot of the optimistic side's arguments can be destroyed by one simple argument. The moment that AI gets a physical body, which it will, there will be nothing that humans can do that an AI can't. Even if new jobs appear, the AI will be able to take those as well. This will wipe out millions of jobs.
To create a counter point to this argument, humans tend to value the experiences of other humans. It is highly likely that jobs like streaming, entertainment, and competition will flourish. You already see humans paying to see human artists over AI. The issue with these jobs is that you have to be the best of the best to make a career out of them.
Source: youtube · Posted: 2026-03-18T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
  {"id":"ytc_Ugy3tGnf860LB52WF9F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQNf47TP_TJuLawap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyMmuUohqtM61e0OEJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxoutvQ53XEaaTc3h14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzetDtddYDl9oQqzwB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw2WyEfDJ7MuceFcbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz9C-wpLVh54OKxvnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmVA9Lwu-e0ZS39394AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxdaAYUTXXAqoVDRxN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyP5KWz50lTumnrPI14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
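The coding-result view above is just a by-ID lookup into a raw batch response like this one. A minimal sketch of that lookup, assuming the response is valid JSON with the field names shown (the `lookup` helper and the shortened sample IDs are hypothetical, not part of the tool):

```python
import json

# Hypothetical two-entry batch response mirroring the format above.
# Real comment IDs are much longer; these are illustrative stand-ins.
raw_response = """[
  {"id": "ytc_AAA", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_BBB", "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]"""

def lookup(raw, comment_id):
    """Parse a raw batch response and return the coded entry for one comment ID."""
    entries = json.loads(raw)
    # First entry whose "id" matches, or None if the comment was not in this batch.
    return next((e for e in entries if e["id"] == comment_id), None)

coded = lookup(raw_response, "ytc_BBB")
print(coded["emotion"])  # fear
```

Because model output is not guaranteed to be well-formed, a real pipeline would likely wrap `json.loads` in error handling and treat a `None` result as a coding failure for that comment.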