Since nobody else has commented, I'll add mine: WONDERFUL!
I've got a few ideas for this, and it actually is very close to the spider I've been working on for Plucker in my spare time. Bear with me while I brain-dump these out:
- LWP::Parallel::UserAgent to fetch content asynchronously.
- HTTP::Cookies for storing client-side cookies in a cookie "jar".
- Link rewrite rules, so the spidered content can be browsed locally after fetching, with relative/absolute links resolved and nothing broken. (URI::URL can help here)
- Ability to forge the Referer and User-Agent headers via --referer and --useragent options. Trivial to add.
- More verbose progress reporting (use LWP::Debug qw(+);)
- Options for staying on the same host, same domain, or staying below a certain fragment of the URI. Something like:
# url http://www.domain.com/foo/bar/blort/quux
--staybelow http://www.domain.com/foo/bar/
--stayonhost www.domain.com
--stayondomain domain.com
- Ability to update the local "cache" across multiple runs: compare the remote file with the local copy, and fetch only if the remote one is newer.
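To make the first few points concrete, here is a minimal sketch of parallel fetching with a cookie jar and forged headers. The jar filename, agent string, and Referer value are placeholders standing in for whatever --useragent/--referer would supply; the module calls themselves (register/wait, cookie_jar) are the documented LWP::Parallel::UserAgent and LWP::UserAgent API.

```perl
#!/usr/bin/perl
use strict;
use warnings;

use LWP::Parallel::UserAgent;
use HTTP::Cookies;
use HTTP::Request;

my $ua = LWP::Parallel::UserAgent->new;
$ua->cookie_jar(HTTP::Cookies->new(
    file     => 'cookies.txt',   # hypothetical jar location
    autosave => 1,               # write cookies back on destruction
));
$ua->agent('ForgedAgent/1.0');   # would come from --useragent
$ua->max_hosts(5);               # polite cap on simultaneous hosts
$ua->max_req(5);                 # and on requests per host

for my $url (@ARGV) {
    my $req = HTTP::Request->new(GET => $url);
    $req->referer('http://www.example.com/');  # would come from --referer
    $ua->register($req);                       # queue it; nothing fetched yet
}

# Fire off all queued requests in parallel and collect the results.
my $entries = $ua->wait;
for my $entry (values %$entries) {
    my $res = $entry->response;
    printf "%s => %s\n", $res->request->url, $res->code;
}
```

The register/wait split is what buys the asynchrony: registration is cheap, and wait() multiplexes all the connections at once.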
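The --staybelow/--stayonhost/--stayondomain checks could be a single filter built on URI. This is just a sketch with the option values hard-coded from the example above; allowed() is a hypothetical helper name.

```perl
use strict;
use warnings;
use URI;

# Hypothetical option values, matching the examples above.
my $staybelow    = 'http://www.domain.com/foo/bar/';
my $stayonhost   = 'www.domain.com';
my $stayondomain = 'domain.com';

sub allowed {
    my ($url) = @_;
    my $uri = URI->new($url);

    # --staybelow: the URL must start with the given prefix.
    return 0 if $staybelow && index($uri->as_string, $staybelow) != 0;

    # --stayonhost: exact (case-insensitive) host match.
    return 0 if $stayonhost && lc($uri->host) ne lc($stayonhost);

    # --stayondomain: host is the domain itself or ends in ".domain".
    if ($stayondomain) {
        my ($host, $dom) = (lc $uri->host, lc $stayondomain);
        return 0 unless $host eq $dom || $host =~ /\.\Q$dom\E$/;
    }
    return 1;
}

print allowed('http://www.domain.com/foo/bar/blort/quux') ? "ok\n" : "skip\n";
```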
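The cache-update idea mostly falls out of LWP for free: LWP::UserAgent's mirror() method sends If-Modified-Since based on the local file's mtime, so the body is only transferred when the remote copy is newer. A sketch (the URL and cache path are placeholders):

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;

# mirror() compares the local file's mtime against the server's
# Last-Modified and skips the download on a 304 Not Modified.
my $res = $ua->mirror('http://www.domain.com/foo/bar/', 'cache/bar.html');

if ($res->code == 304) {
    print "cache is up to date\n";
} elsif ($res->is_success) {
    print "fetched new copy\n";
} else {
    print "error: ", $res->status_line, "\n";
}
```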
Expect some patches from me to come flying in within the next few weeks on this one. Great work!