
Re: Optimize DBI connect to avoid max_user_connections

by InfiniteSilence (Curate)
on May 29, 2014 at 14:29 UTC

in reply to Optimize DBI connect to avoid max_user_connections

Randal Schwartz wrote an article on this subject, "Throttling Your Web Server", which might be useful.

Otherwise, directives in your robots.txt may be sufficient to tell the spider either to slow down (there is a Crawl-delay directive) or to stop spidering your site entirely.
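As a sketch, a robots.txt along these lines asks well-behaved crawlers to pace themselves and tells a specific one to stay away (the bot name and delay value here are placeholders; note that Crawl-delay is a de facto convention honored by some crawlers but not all of them):

```
# Ask all crawlers to wait between requests (seconds)
User-agent: *
Crawl-delay: 10

# Tell one misbehaving crawler to stop entirely
User-agent: BadBot
Disallow: /
```

A crawler that ignores robots.txt altogether won't be slowed by this, which is where blocking at the web server comes in.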

If you are using Apache (or, I would think, any modern HTTP server), you can of course simply deny certain IP addresses.
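For example, with Apache 2.4's mod_authz_core you can allow everyone except a troublesome address range (the directory path and the 192.0.2.0/24 range below are illustrative only):

```apache
<Directory "/var/www/html">
    <RequireAll>
        # Allow everyone by default...
        Require all granted
        # ...except requests from this range (example range)
        Require not ip 192.0.2.0/24
    </RequireAll>
</Directory>
```

On the older Apache 2.2 line the equivalent would use Order/Allow/Deny directives instead.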

Celebrate Intellectual Diversity
