PerlMonks |
Re: Best way to recursively grab a website
by gjb (Vicar) on Mar 29, 2005 at 11:53 UTC ( #443108=note )
If you don't mind one system call, you could go with wget, an excellent tool for downloading an entire website. Command-line options let you restrict downloads to a single site, limit the recursion depth, and so on. All in all, a very valuable tool. It can be found at http://www.gnu.org/software/wget/wget.html. Did I mention it's free software (a GNU project, to be precise)?

Hope this helps, -gjb-
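As a sketch of that system call from Perl (the target URL and depth here are illustrative, not from the original post; the flags are standard wget options: -r to recurse, -l to limit depth, -np to stay below the starting directory, -k to rewrite links for local viewing):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $url   = 'http://www.gnu.org/software/wget/wget.html';  # example target
my $depth = 2;                                             # example depth limit

# Build the command as a list so the shell never mangles the URL
my @cmd = ('wget', '-r', '-l', $depth, '-np', '-k', $url);
print "@cmd\n";

# To actually run the download, uncomment:
# system(@cmd) == 0 or die "wget exited with status $?";
```

Passing the command as a list to system() avoids shell quoting issues with URLs containing special characters.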
In Section: Seekers of Perl Wisdom