Re^2: parallel web get with lwp::simple

by biosysadmin (Deacon)
on Jul 26, 2004 at 14:38 UTC


in reply to Re: parallel web get with lwp::simple
in thread parallel web get with lwp::simple

Another very simple way to fork processes with LWP is to use Parallel::ForkManager. It's shockingly simple to make your code parallel this way; here's how you could apply it to the loop in your code:
    use LWP::Simple;                    # provides get()
    use Parallel::ForkManager;

    my $max_forks   = 20;
    my $forkmanager = Parallel::ForkManager->new( $max_forks );

    for ( $count = 0; $count <= $max; $count++ ) {
        $forkmanager->start and next;   # forks; the parent skips to the next iteration

        my $content;
        unless ( defined( $content = get $URL ) ) {
            die "could not get $URL\n";
        }

        if ( $content =~ /Test1/i ) {
            print ".";
        }
        elsif ( $content =~ /Test2/i ) {
            print "Fetched page from Server2 \n";
            $count++;
            &result;
        }
        else {
            print "Page not retrieved \n";
        }

        $forkmanager->finish;           # terminates the child process
    }
    $forkmanager->wait_all_children;    # parent waits for every child to exit
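One thing to keep in mind: once a child is forked, changes it makes to variables (the $count++ above, for example) are not seen by the parent. If the parent needs to know how each fetch turned out, Parallel::ForkManager's run_on_finish hook can collect each child's exit status. Here's a minimal sketch; the example.com URLs are placeholders for your own, and the exact callback arguments vary a little between module versions:

    use strict;
    use warnings;
    use LWP::Simple;
    use Parallel::ForkManager;

    # Hypothetical list of pages to fetch; substitute your own URLs.
    my @urls = map { "http://www.example.com/page$_" } 1 .. 10;

    my $failed = 0;
    my $pm     = Parallel::ForkManager->new(20);

    # Runs in the parent each time a child finishes; $exit_code is
    # whatever the child passed to finish().
    $pm->run_on_finish( sub {
        my ( $pid, $exit_code, $ident ) = @_;
        $failed++ if $exit_code;
    } );

    for my $url (@urls) {
        $pm->start($url) and next;                  # parent: note which URL this child handles
        my $content = get $url;
        $pm->finish( defined $content ? 0 : 1 );    # child: report success (0) or failure (1)
    }
    $pm->wait_all_children;
    print "$failed of ", scalar @urls, " requests failed\n";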
This kind of testing also probably depends on the maximum number of clients your server supports, which may not be high enough to stress your gateway. Check the MaxClients parameter in your httpd.conf for more information. Depending on what you're testing, it may be more fruitful to use LWP to download larger files rather than simply making more requests.
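For reference, MaxClients lives in the prefork MPM section of httpd.conf. A minimal sketch of what that stanza typically looks like; the values here are illustrative, not recommendations:

    <IfModule prefork.c>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients          150   # raise this to allow more simultaneous connections during the test
        MaxRequestsPerChild   0
    </IfModule>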

Best of luck. :)
