Re: goose.pl

by hacker (Priest)
on Jul 18, 2002 at 12:49 UTC ( [id://182783] )


in reply to goose.pl

Since nobody else has commented, I will add mine. WONDERFUL!

I've got a few ideas for this, and it's actually very close to the spider I've been working on for Plucker in my spare time. Bear with me while I brain-dump them:

  1. LWP::Parallel::UserAgent to fetch content asynchronously (a sketch combining this with the cookie jar follows this list).
  2. HTTP::Cookies for storing client-side cookies in the "jar".
  3. Link rewrite rules, so the gathered/spidered content can be browsed locally after fetching with relative and absolute links intact, without breaking anything. (URI::URL can help here; see the sketch below.)
  4. Ability to forge the Referer and User-Agent strings through --referer and --useragent options. Trivial to add (sketched below).
  5. More verbose progress reporting (use LWP::Debug qw(+);).
  6. Options for staying on the same host, staying on the same domain, or staying below a certain fragment of the URI (see the filter sketch below). Something like:
    # url http://www.domain.com/foo/bar/blort/quux --staybelow http://www.domain.com/foo/bar/ --stayonhost www.domain.com --stayondomain domain.com
  7. Ability to update the "cache" on multiple runs: compare the remote file with the local copy and fetch only if it is newer (the mirror() sketch below does exactly this).
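For (1) and (2), here's roughly what I have in mind. This is an untested sketch; the cookie file name and the 30-second timeout are just placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Request;
    use HTTP::Cookies;
    use LWP::Parallel::UserAgent;

    # Persistent cookie "jar" on disk; the filename is a placeholder
    my $jar = HTTP::Cookies->new(
        file     => 'goose-cookies.txt',
        autosave => 1,
    );

    my $pua = LWP::Parallel::UserAgent->new;
    $pua->cookie_jar($jar);    # inherited from LWP::UserAgent

    # Queue up one GET per URL; nothing is sent until wait()
    for my $url (@ARGV) {
        if (my $err = $pua->register(HTTP::Request->new(GET => $url))) {
            warn "register failed for $url: ", $err->message, "\n";
        }
    }

    # Fetch everything in parallel, with a 30-second overall timeout
    my $entries = $pua->wait(30);

    for my $entry (values %$entries) {
        my $res = $entry->response;
        printf "%3d %s (%d bytes)\n",
            $res->code, $res->request->uri, length $res->content;
    }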
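For (4) and (5), the option handling really is trivial. Something along these lines; the option names and the default agent string are just what I'd pick:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Getopt::Long;
    use HTTP::Request;
    use LWP::UserAgent;
    #use LWP::Debug qw(+);   # uncomment for (5): full request/response tracing

    my $referer   = '';
    my $useragent = 'Mozilla/4.0 (compatible; goose)';
    GetOptions(
        'referer=s'   => \$referer,
        'useragent=s' => \$useragent,
    ) or die "usage: $0 [--referer URL] [--useragent STRING] URL\n";

    my $ua  = LWP::UserAgent->new(agent => $useragent);
    my $req = HTTP::Request->new(GET => shift @ARGV);
    $req->header(Referer => $referer) if $referer;

    my $res = $ua->request($req);
    print $res->status_line, "\n";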
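And for (6), the stay-in-bounds check could be as simple as the sketch below. in_bounds() and the %$opts hash are hypothetical names; the values would come straight from Getopt::Long:

    use strict;
    use warnings;
    use URI;

    # Return true if a discovered link should be followed, given the
    # --stayonhost / --stayondomain / --staybelow option values in %$opts.
    sub in_bounds {
        my ($link, $opts) = @_;
        my $uri = URI->new($link);
        return 0 unless $uri->scheme && $uri->scheme =~ /^https?$/;

        if (my $host = $opts->{stayonhost}) {
            return 0 unless lc($uri->host) eq lc($host);
        }
        if (my $domain = $opts->{stayondomain}) {
            return 0 unless $uri->host =~ /(^|\.)\Q$domain\E$/i;
        }
        if (my $below = $opts->{staybelow}) {
            return 0 unless index($uri->as_string, $below) == 0;
        }
        return 1;
    }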
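Finally, (3) and (7) mostly come for free with URI::URL and LWP::UserAgent's mirror(), which sends If-Modified-Since based on the local file's timestamp. The resolve() and refresh() helpers here are just illustrative names:

    use strict;
    use warnings;
    use URI::URL;
    use LWP::UserAgent;

    my $ua = LWP::UserAgent->new;

    # (3) Resolve a link found in a page against that page's own URL, so
    # relative and absolute links both end up as full URLs before fetching.
    sub resolve {
        my ($link, $base) = @_;
        return URI::URL->new($link, $base)->abs->as_string;
    }

    # (7) mirror() only rewrites the local file when the server says the
    # remote copy is newer; a 304 response means the cache is still fresh.
    sub refresh {
        my ($url, $cache_dir) = @_;
        my ($name) = (URI::URL->new($url)->path =~ m{([^/]+)$});
        $name ||= 'index.html';
        my $res = $ua->mirror($url, "$cache_dir/$name");
        print "$url: ", $res->status_line, "\n";
        return $res;
    }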

Expect some patches from me to come flying in within the next few weeks on this one. Great work!
