
> 1. The bots have essentially unlimited memory and CPU. That's the cheapest part of any scraping setup.

Anyone crawling at scale would put bounds on per-request memory and CPU, no? Surely you'd try to minimize resource contention at least a little bit?
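
For illustration, a rough sketch of what that bounding could look like in a Python crawler built on requests (the 2 MB cap and the timeout values are made-up numbers, not anything from the thread): give up on any response that takes too long to connect or grows too large.

    import requests

    MAX_BYTES = 2_000_000        # illustrative cap on response size
    TIMEOUT = (5, 10)            # (connect, read) timeouts in seconds

    def bounded_fetch(url):
        """Fetch a page while capping how much data it can consume."""
        try:
            with requests.get(url, stream=True, timeout=TIMEOUT) as resp:
                resp.raise_for_status()
                body = bytearray()
                for chunk in resp.iter_content(chunk_size=65536):
                    body.extend(chunk)
                    if len(body) > MAX_BYTES:
                        # Suspiciously large (or endless) page: bail out.
                        return None
                return bytes(body)
        except requests.RequestException:
            return None

One caveat: requests' read timeout applies per socket read, not to the whole response, so a page that drips out data slowly can still hold the connection for a long time unless the crawler also bounds total wall-clock time per request.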

Then why generate text at all? You could just run a script that loops forever and never responds. But the bots would have to protect against that (with timeouts, say), or scrapers wouldn't make it very far on the wider internet, would they? Drip-feeding generated text instead costs the server a few microseconds, essentially nothing, and guarantees the scraper's most precious resource (bandwidth) is wasted.
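
A minimal sketch of that kind of trickle tarpit, using only the Python standard library (the word list, chunk size, one-second delay, and port are all illustrative choices, not something described upthread):

    import random
    import time
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    WORDS = ["lorem", "ipsum", "dolor", "sit", "amet"]

    class TarpitHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.end_headers()
            try:
                while True:
                    # A tiny chunk of cheap, meaningless text...
                    chunk = " ".join(random.choices(WORDS, k=20)) + "\n"
                    self.wfile.write(chunk.encode())
                    self.wfile.flush()
                    # ...dripped out slowly so the client stays on the hook.
                    time.sleep(1.0)
            except (BrokenPipeError, ConnectionResetError):
                pass  # client finally gave up

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8080), TarpitHandler).serve_forever()

One trade-off worth noting: this version parks a mostly idle thread per trapped client, which is cheap but not free; an asyncio variant could hold far more connections open for the same server footprint.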


