Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's been a good 10 years since a researcher at MIT was already talking about algorithm biases…" — ytc_UgxrvR4RJ…
- "Me: Hey ChatGPT can you make my 3D-model lean forward a bit when x happens? Chat…" — ytc_UgwVD5P4W…
- "Young people especially will "let" chatgpt DO their work. Immediate rewards alwa…" — ytc_Ugz2F47MH…
- "AI art looks good but will always feel soulless and uncreative, artists can't ac…" — ytc_UgzP897lY…
- "@Alverin I am all in on the AI doom risks. My point was there needs to be a bette…" — ytr_UgyE8PWRj…
- "ChatGPT wrote a superb song, and you elevated it with a production…" — ytc_UgyD1yKQA…
- "Bro those people need to shut up already, if you aren't an artist and don't unde…" — ytc_UgxXv4ADy…
- "My biggest pet peeve is when they talk about making AN ai that is good. There wi…" — ytc_Ugx_dY-D5…
Comment
This video hit hard. Tom is right, jobs are vanishing faster than anyone expected. It actually reminded me of a book I just finished called 12 Last Steps. It breaks down how AI and automation don’t just change industries, they rewrite the entire foundation of human survival, step by step. Watching this video felt like seeing one of those steps in real time. If half of what’s predicted comes true, we’re not just talking about new careers, we’re talking about how to avoid collapse altogether.
youtube | AI Jobs | 2025-09-24T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyGv-kG5guT1Ysl-R14AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiFFwNe9NX9UZcPpl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyXEgRxeaDVNm6Ej514AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxLJf7hDRwoVe51fF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzgmRep8vgnsnEcZaJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzagJrPGC2qbWKBLb54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxHjDeV3frwrzpM1e94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhjrgfPFzsgJQAFwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzyFuVvf-pMcyWQcgp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzojerQg36C2DnldiF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"}
]
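A batch response like the one above can be parsed and indexed by comment ID, which is what the "Look up by comment ID" view implies. Below is a minimal sketch in Python; the `DIMENSIONS` vocabulary is assumed from the values visible in this dump (the actual codebook may allow more values), and `parse_batch` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"ai_itself", "elite", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed", "frustration"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if an entry uses a value outside the known vocabulary,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in DIMENSIONS.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded

# Hypothetical one-entry batch, shaped like the raw response above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["emotion"])  # fear
```

Validating against a closed vocabulary before storing is the main point: an LLM coder can emit a misspelled or invented label, and failing loudly per comment ID makes the bad entry easy to re-code.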