Second question today, please nobody shoot me!
I have a subroutine that fetches a URL and saves it to the named file.
I use LWP::UserAgent and HTTP::Request. As you can see from the code, the
subroutine tries three times to fetch the URL (sleeping for 5 seconds between fetches)
and, if all three attempts fail, returns 0 (leaving the caller to handle the failure appropriately).
My question though is: Is there a better way to do this?
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;
#################################################################
# Returns the success of storing $url into newly created $output
sub get_page {
    my ( $url, $output ) = @_;

    my $ua = LWP::UserAgent->new;
    $ua->agent( "$0/0.5 " . $ua->agent );
    $ua->timeout( 30 );

    my $req = HTTP::Request->new( GET => $url );
    $req->header( 'Accept' => 'text/html, image/gif' );

    for my $i ( 1 .. 3 ) {
        # Send the request
        my $res = $ua->request( $req );
        if ( $res->is_success ) {
            # Three-arg open with a lexical filehandle
            open( my $out, '>', $output )
                or die "Couldn't open $output: $!";
            binmode $out;    # the response may be binary (e.g. image/gif)
            print $out $res->content;
            close( $out ) or die "Couldn't close $output: $!";
            return 1;
        }
        print "Couldn't get $url on attempt $i/3\n";
        if ( $i < 3 ) {      # don't sleep after the final failure
            print "Sleeping 5 seconds...\n";
            sleep 5;
            print "Gonna try again...\n";
        }
    }
    return 0;
}
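
As for a "better way": one sketch worth considering (untested against a real server, so treat it as a starting point) is to let LWP::UserAgent's own mirror() method handle the save-to-file part, since it writes the body to a temporary file and renames it into place only on success. The with_retries() helper below is my own generic addition, not part of LWP; it just factors the retry loop out of the fetch logic:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Generic retry helper: calls $code up to $tries times, sleeping
# $delay seconds between failed attempts; returns the first true
# result, or 0 if every attempt returned false.
sub with_retries {
    my ( $tries, $delay, $code ) = @_;
    for my $attempt ( 1 .. $tries ) {
        my $result = $code->($attempt);
        return $result if $result;
        sleep $delay if $delay && $attempt < $tries;
    }
    return 0;
}

sub get_page {
    my ( $url, $output ) = @_;
    my $ua = LWP::UserAgent->new( timeout => 30 );
    return with_retries( 3, 5, sub {
        my ($attempt) = @_;
        # mirror() saves the body to $output itself; if $output already
        # exists it sends If-Modified-Since, and a 304 means the local
        # copy is still current, which I count as success here.
        my $res = $ua->mirror( $url, $output );
        return 1 if $res->is_success or $res->code == 304;
        warn "Attempt $attempt/3 for $url failed: ", $res->status_line, "\n";
        return 0;
    } );
}
```

The helper also makes it easy to swap the fixed 5-second delay for exponential backoff later without touching the fetch code.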