Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
the token burn on garbage HTML is the part that really hurts. the actual scraping failure is one thing, but feeding that noise into an LLM that happily processes it token by token is where the financial damage happens.
two things that would have saved you here:
1. circuit breaker on your agent loop. set a max token budget per URL and a max retry count. if the extracted text has more HTML tags than actual words, bail out instead of retrying. a simple heuristic like checking the ratio of angle brackets to alphanumeric characters catches something like 90% of captcha/garbage pages before they hit the LLM.
2. pre-filtering before the LLM. tools like trafilatura or readability-lxml can extract clean text from HTML without any AI involvement. run that first, check if you got meaningful content, THEN send to the LLM for structured extraction. it cuts your token costs by like 80% even on pages that do render correctly.
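roughly what i mean, sketched in python. names like `looks_like_garbage` and `scrape_with_budget` are just illustrative, and the `fetch`/`clean`/`llm_extract` callables are placeholders — in practice `clean` would be something like `trafilatura.extract`:

```python
def looks_like_garbage(text, max_ratio=0.1):
    # circuit-breaker heuristic: ratio of angle brackets to alphanumeric
    # characters. captcha walls and un-rendered markup blow way past this.
    brackets = text.count("<") + text.count(">")
    alnum = sum(ch.isalnum() for ch in text)
    return alnum == 0 or brackets / alnum > max_ratio

def scrape_with_budget(url, fetch, clean, llm_extract,
                       max_retries=2, max_tokens=8000, min_words=50):
    # fetch -> garbage check -> non-LLM pre-filter -> LLM, with a retry
    # cap and a per-URL token budget (rough 4-chars-per-token estimate).
    # `clean` is e.g. trafilatura.extract (my assumption, pick your tool).
    for _ in range(max_retries + 1):
        html = fetch(url)
        if not html:
            continue  # transient fetch failure: retry is fine here
        if looks_like_garbage(html):
            return None  # captcha/garbage page: bail, don't retry
        text = clean(html)
        if not text or len(text.split()) < min_words:
            continue  # pre-filter found nothing meaningful
        if len(text) > max_tokens * 4:
            text = text[: max_tokens * 4]  # enforce the token budget
        return llm_extract(text)
    return None  # budget exhausted without clean content: no LLM call

```

the point of injecting the callables is that the budget logic stays testable without hitting the network or an actual model.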
the build-it-yourself trap is absolutely real. i went down the same road and eventually landed on firecrawl for most things. not sponsored or anything, it's just that maintaining custom scrapers vs paying 20 dollars a month for something that handles the edge cases is not even close anymore.
reddit · Viral AI Reaction · timestamp 1777057823 · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_oi0ni22","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"rdc_oi0oy9w","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_oi1bcoz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_oi2fibt","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"rdc_oi3ppej","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```