Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID

Random samples:
- "Interesting...i'm a retired md, and I guess I kind of use AI like we used the cl…" (ytc_UgxRmkcyS…)
- "13:40 Why the hell would you build a robot to emulate a human using human inputs…" (ytc_UgxxrRai_…)
- "If you pay for ChatGPT it's in the legal documents that confidentiality can be b…" (ytc_Ugwea_6ZN…)
- "4:30 This is done by Indian employees. My brother's been working in that team si…" (ytc_UgzVacibY…)
- "Ai does but it doesn’t understand / It makes based on 1000 of pictures but doesn’t…" (ytc_UgxNoddeN…)
- "AI will take over everything, so then the best skills for the future generation …" (ytc_UgyHSL72Y…)
- "I work a lot with AI -- you really need to know its pitfalls before you can use …" (ytc_Ugy3g0ts8…)
- "So basically, Roger here at no point has mentioned anything about making frickin…" (ytc_UgzXOTdBc…)
Comment
You’d think it’s a logical question, but apparently even the “wise” are rushing ahead like kids in a candy store without thinking things through. It reminds me of the Y2K panic—all that global hysteria over a date change. Why are humans so easily swept up in excitement? Why do we crave controversy and insist on complicating what’s straightforward?
This talk about AI taking over jobs is as feckless as a child trying to drive a car. It’s not going to happen. Think about it: Who builds AI? Humans. So why would we build something to replace ourselves? Who’s in charge? We are—or at least, we should be.
The moment we abandon our responsibility is the day robots could take over. But that’s not a tech problem—it’s a human one. Stay in control, stay accountable, and the machines stay in their lane.
youtube · AI Jobs · 2026-01-16T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxU3CqSuDl3ixrfdAB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybrV1i5ySrl7fBR8R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmhnThEo7ZG9QnK754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-4K3f3mUbmRqmTVN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw69xkZ8vbJfd6xR1V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsRbR9Zyk2TvZv3Zt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyf_jOKliRQxXgv1lV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUNTr_vxDF1I0wgf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyo5rfhbPmlMjlxOGN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2zeV7ERXGRqy7CGd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
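The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a pipeline could parse and validate such a batch and index it by comment ID — the `SCHEMA` vocabularies are inferred only from the values visible in this response, and the function name is hypothetical, not the tool's actual API:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "user", "company", "distributed"},
    "reasoning": {"mixed", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "indifference", "outrage",
                "fear", "mixed", "approval"},
}

def parse_batch(raw):
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the observed vocabulary.
    """
    by_id = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        for dim, allowed in SCHEMA.items():
            value = record.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {value!r}")
        # Keep only the schema dimensions, keyed by comment ID
        by_id[comment_id] = {dim: record[dim] for dim in SCHEMA}
    return by_id

raw = ('[{"id":"ytc_UgxU3CqSuDl3ixrfdAB4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
codings = parse_batch(raw)
print(codings["ytc_UgxU3CqSuDl3ixrfdAB4AaABAg"]["emotion"])  # resignation
```

Validating against a fixed vocabulary at parse time is what makes a table like the "Coding Result" above safe to render: any hallucinated or off-codebook label fails loudly instead of silently entering the dataset.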