Re^3: What is the fastest way to download a bunch of web pages?
by BrowserUk (Patriarch) on Mar 03, 2005 at 13:45 UTC (#436204)
The difference seems to be that you restricted yourself to three threads. Just add -THREADS=10 to the command line. Try varying the number (2/3/5/10) and see what works best for you. With my connection, the throughput is purely down to the download speed, but if you are on broadband, network latency may come into play. Choosing the right balance of simultaneous requests versus bandwidth is a suck-it-and-see equation; it will depend on a lot of things, including time of day, location, and so on. You can also use -PATH=tmp/ to tell it where to put the files. You really need to be fetching more than 10 sites for a reasonable test anyway.
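For reference, here is a minimal sketch of the kind of threaded downloader being discussed. The script itself isn't reproduced in this node, so everything below is an assumption matched to the options mentioned above (-THREADS, -PATH), built from the stock threads, Thread::Queue, and LWP::Simple modules; it also assumes a Thread::Queue new enough to have end(), i.e. Perl 5.14 or later.

<code>
#!/usr/bin/perl
# Minimal threaded-downloader sketch. NOT the original script from
# this thread; the -THREADS/-PATH handling is guessed from the post.
use strict;
use warnings;
use threads;
use Thread::Queue;
use LWP::Simple qw( getstore );

# Crude parsing of -THREADS=n / -PATH=dir/ switches; anything else
# on the command line is treated as a URL to fetch.
my %opt = ( THREADS => 3, PATH => '' );
my @urls;
for ( @ARGV ) {
    if ( /^-(\w+)=(.*)/ ) { $opt{ uc $1 } = $2 }
    else                  { push @urls, $_ }
}

my $q = Thread::Queue->new( @urls );
$q->end;    # workers get undef once the queue drains

my @workers = map {
    threads->create( sub {
        while ( defined( my $url = $q->dequeue ) ) {
            ( my $file = $url ) =~ s/\W+/_/g;   # flatten URL to a filename
            my $status = getstore( $url, "$opt{PATH}$file.html" );
            print "$status $url\n";             # HTTP status per fetch
        }
    } );
} 1 .. $opt{THREADS};

$_->join for @workers;
</code>

Invoked as, say, perl pget.pl -THREADS=10 -PATH=tmp/ url1 url2 ... (pget.pl is a made-up name), it fetches the URLs across N worker threads. Note the trailing slash on -PATH matters here, since the value is simply prepended to the generated filename.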
Examine what is said, not who speaks.
Silence betokens consent.
Love the truth but pardon error.