PerlMonks  

Re^5: Web Robot

by schumi (Hermit)
on Jul 17, 2003 at 09:45 UTC ( [id://275160] )


in reply to Re: Re: Re: Re: Web Robot
in thread Web Robot

Quite true, although most major search engines do actually heed the robots file, if it is set up properly.
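For instance, a minimal robots.txt served from the site root might look like the sketch below; the directory name is just a placeholder, and bear in mind the file is purely advisory, so it only keeps out crawlers that choose to obey it:

```
# robots.txt -- well-behaved crawlers fetch this from the site root
# (e.g. http://example.com/robots.txt) and skip the listed paths.
# "/private/" here is a hypothetical directory name.
User-agent: *
Disallow: /private/
```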

I think the easiest way to restrict access to a directory is to set up a proper .htaccess file. You could even restrict access by IP address...
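As a rough sketch, an .htaccess file in the directory to be protected could combine password authentication with an IP restriction; this assumes Apache with AllowOverride enabled for that directory, and the paths and addresses below are placeholders:

```
# .htaccess -- placed inside the directory to protect
# (assumes Apache with "AllowOverride AuthConfig Limit")

# Password-protect the directory (path to .htpasswd is hypothetical):
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpasswd
Require valid-user

# Alternatively, restrict by IP address (example range only):
Order deny,allow
Deny from all
Allow from 192.168.1.
```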

On the other hand, using a robots file (in addition to the above, note!) decreases the number of 404s in your error logs... ;-)

--cs

There are nights when the wolves are silent and only the moon howls. - George Carlin
