Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Ai programmers are like the most dangerous doom cult ever please fix the alignm…" (ytc_UgyclQvkM…)
- "Tax dollars pay for the roads, where was the public vote for AI vehicles? These …" (ytc_UgyRdNLRQ…)
- "i dont think it will be a matter of AI just being smarter than a human. it will …" (ytc_UgwrPMrVl…)
- "@ReactInfo54 There is a huge difference between generative AI and assistive AI …" (ytr_Ugw8vNCEe…)
- "LOL! We are hitting limits with current AI. This is another hype segment. Wonder…" (ytc_Ugxg4Dd-u…)
- "Stop because AI treats me so much nicer than literally any human I know. I ask m…" (ytc_Ugzg-tXgq…)
- "- why is this #2 robot double the price of #1? - It has real-like teeth and a…" (ytc_UgyEHXGXp…)
- "Until operational quantum computing, AI is nothing to worry about. Super AI won'…" (ytc_UgzFmbQmv…)
Comment
This is a distortion. People will still be needed, which means juniors will be needed. I work with AI when coding every day and although it keeps getting better, I always have to intervene at some point, albeit less frequently (both as I get better at how to tell them what to do, and as they improve over time), but I think that asymptote still leaves room for knowledge workers, and the real *insidious* danger is that, lacking full understanding, they will write really insidious bugs that only the smartest humans will be able to solve.
Also, AI's can't pick direction. They have no sense of value or the cost of anything. I have to constantly remind them to watch for O-notation and stop writing to disk (instead of far faster memory) because time and LOC cost money and maintenance.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Posted | 2025-11-23T19:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzuIV4-Q0TtxEoXKEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCsPD0oyBNUBd0ant4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZrfadMbdH3l4W6nl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwsCtAO6fBuhoZrBbt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzrvm4yhXOJEXzrQG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx77tHWcDTp4H-eNwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXVQr7PUKycwO2YBV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwzMWdjXm6YLVItTpZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugy4dpOkhhcDLJd-ckl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyL9d2TfN2BMa0wVHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
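The batch above is a plain JSON array, one object per coded comment. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed label sets are inferred from this single sample and are almost certainly incomplete relative to the real codebook, so treat them as placeholders.

```python
import json

# Allowed labels per coding dimension. NOTE: these sets are inferred from the
# one sample batch shown above; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "company", "government"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels are in the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A row passes only if every dimension carries a known label.
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage with a shortened (hypothetical) batch:
sample = (
    '[{"id":"ytc_example","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
kept = parse_coded_batch(sample)
```

Filtering rather than raising on unknown labels keeps a single malformed row from discarding the whole batch, which matters when the model occasionally drifts off-codebook.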