[RndTbl] slowing httpd access to cgi-bin scripts
sean at ertw.com
Fri Sep 17 16:26:42 CDT 2010
There used to be a third-party module, something like mod_limit or
mod_bwlimit, that let you limit connection rates.
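If none of those fit, a small wrapper script in front of the heavy CGI
programs could fake the progressive delay Gilles describes below. Here's
a minimal, untested sketch in Python -- the state file location, the
thresholds, and the man2html.real rename are all placeholder assumptions:

#!/usr/bin/env python
# throttle.py -- hypothetical CGI front end: counts recent hits per
# client IP and sleeps progressively longer past a threshold before
# handing off to the real script.
import fcntl, os, pickle, sys, time

WINDOW = 5            # look-back window, seconds
FREE_HITS = 10        # requests allowed per window with no delay
STATE = '/tmp/cgi-throttle.db'
REAL = '/usr/lib/cgi-bin/man2html.real'   # assumes real script was renamed

ip = os.environ.get('REMOTE_ADDR', 'unknown')
now = time.time()

fd = os.open(STATE, os.O_RDWR | os.O_CREAT, 0o600)
f = os.fdopen(fd, 'r+b')
fcntl.flock(f, fcntl.LOCK_EX)         # one request updates state at a time
try:
    hits = pickle.load(f)
except Exception:                     # empty or corrupt state file
    hits = {}
recent = [t for t in hits.get(ip, []) if now - t < WINDOW]
recent.append(now)
# drop IPs with no hits in the window so the state file doesn't grow forever
hits = dict((k, v) for k, v in hits.items() if v and now - v[-1] < WINDOW)
hits[ip] = recent
f.seek(0)
f.truncate()
pickle.dump(hits, f)
f.close()                             # releases the lock too

extra = len(recent) - FREE_HITS
if extra > 0:
    time.sleep(min(extra, 30))        # +1s per excess hit, capped at 30s

os.execv(REAL, [REAL] + sys.argv[1:])  # hand off to the real script

One caveat: the sleeping request still ties up an Apache child for the
length of the delay, so this eases the CPU and disk load from the scripts
but doesn't protect your connection slots.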
That said, the developer in me says "cache". :)
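For man2html in particular, the output for a given request hardly ever
changes, so even a dumb cache in front of it should absorb a crawler.
Again just a sketch with made-up paths and TTL, assuming the real script
gets renamed aside and this wrapper installed in its place -- it stores
the script's entire output, CGI headers included, and replays it:

#!/usr/bin/env python
# cache-wrap.py -- hypothetical caching front end for an idempotent CGI
# like man2html. Only safe for GET requests with no side effects.
import hashlib, os, subprocess, sys, time

TTL = 3600                                # seconds before regenerating
CACHE_DIR = '/var/cache/cgi-wrap'         # must exist, writable by httpd
REAL = '/usr/lib/cgi-bin/man2html.real'   # the renamed real script

req = (os.environ.get('PATH_INFO', '') + '?' +
       os.environ.get('QUERY_STRING', ''))
entry = os.path.join(CACHE_DIR, hashlib.md5(req.encode()).hexdigest())

try:
    if time.time() - os.path.getmtime(entry) < TTL:
        with open(entry, 'rb') as f:
            os.write(1, f.read())         # cache hit: replay verbatim
        sys.exit(0)
except OSError:
    pass                                  # no entry yet; fall through

out = subprocess.check_output([REAL])     # env (QUERY_STRING etc.) inherited
with open(entry, 'wb') as f:
    f.write(out)                          # headers + body, as the script sent
os.write(1, out)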
On Fri, Sep 17, 2010 at 4:08 PM, Gilles Detillieux
<grdetil at scrc.umanitoba.ca> wrote:
> Every once in a while, some doofus points a web crawler at our web site
> and, ignoring the disallowed areas in our robots.txt file, starts
> crawling through some of our cgi-bin scripts at a rate of 4 to 8 hits a
> second. This is particularly annoying with some of the more processor
> and disk intensive CGI programs, such as man2html, which also happens to
> generate lots of links back to itself.
> Is there anything I can set up in Apache to throttle back and slow down
> remote hosts when they start hitting hard on cgi-bin? I don't want to
> do anything that would adversely affect legitimate users, nor make
> important things like the manual pages hard to find by removing any
> public links to them. But when a client starts making 10 or more GET
> requests on /cgi-bin in a 5 second period, it would be nice if I could
> get the server to progressively add longer and longer delays before
> servicing these requests, to keep the load down and prevent the server
> from thrashing.
> I'd appreciate any tips.
> Gilles R. Detillieux              E-mail: <grdetil at scrc.umanitoba.ca>
> Spinal Cord Research Centre       WWW:    http://www.scrc.umanitoba.ca/
> Dept. Physiology, U. of Manitoba  Winnipeg, MB  R3E 0J9  (Canada)
> Roundtable mailing list
> Roundtable at muug.mb.ca
Sean Walberg <sean at ertw.com> http://ertw.com/