http://qs321.pair.com?node_id=1000500

Uree has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,

First of all, since this is my first post here: hi, everyone.
Although I've been relying on this site for quite some time now, this is the first question I've had to ask (which, let me say, should speak well of the community).

Now, I'm using LWP::UserAgent to download large XML files. The files download successfully and their content is as expected; the problem is the high memory usage caused by their size (>100MB).
To work around this, I tried the ":content_file" option of LWP::UserAgent's get() method.
Here's my code:

#!/usr/bin/env perl

use strict;
use warnings;

use File::Temp;
use LWP::UserAgent;
#use HTTP::Request;

do_task();

sub do_task {
    my $ua = LWP::UserAgent->new(
        'ssl_opts' => { 'verify_hostname' => 0 }
    );
    $ua->show_progress(1);

    my @urls = (
        "http://linkToAFat.xml",
    );

    foreach my $url (@urls) {
        my ($fh, $path) = File::Temp::tempfile( DIR => '/tmp/my_tmp' );
        $ua->get( $url, ":content_file" => $path );

        #my $request  = HTTP::Request->new( GET => $url );
        #my $response = $ua->request( $request, $path );
    }
}
(The commented-out lines are additional ways I've tried to work around the high memory usage.)
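As far as I can tell from the docs, $ua->mirror is yet another interface that saves straight to a named file, roughly like this (the path is a placeholder):

    my $response = $ua->mirror( $url, '/tmp/my_tmp/fat.xml' );

though I haven't checked whether it behaves any differently memory-wise.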

My problem is that, although I'm using the ":content_file" option, the files' content apparently DOES still get loaded into memory.
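For reference, I understand from the LWP::UserAgent docs that there is also a ":content_cb" option, which hands each chunk to a callback as it arrives, so one could print the chunks straight to a file handle instead. A minimal sketch of what I mean (untested on my side; the URL and output path are placeholders):

#!/usr/bin/env perl

use strict;
use warnings;

use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $url = "http://linkToAFat.xml";       # placeholder URL

# placeholder output path
open my $out, '>', '/tmp/my_tmp/fat.xml' or die "open: $!";
binmode $out;

# ":content_cb" is invoked once per chunk as the body arrives,
# so the whole document never has to sit in memory at once.
my $response = $ua->get(
    $url,
    ':content_cb' => sub {
        my ($chunk, $res, $proto) = @_;
        print {$out} $chunk;
    },
    ':read_size_hint' => 8192,           # suggested chunk size
);

close $out or die "close: $!";
die $response->status_line unless $response->is_success;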

I'm quite stuck on this one, so I'd appreciate the Monks' almighty support.
Thanks in advance!