<p>I have some code which reads from a file (sometimes 100+ GB) and has to combine rows to create a consolidated output. I used to process the entire file into a hash and then dump the hash at the end of the program.
<p>The problem with that, of course, was that with the very large files the hash would grow enormous and the program would consume all of the memory in the system, causing it to crash.
<p>So, to solve this problem, I changed the code to output the data as it went: once I had all of the row data for a consolidated record, I printed it and then did a delete on that hash key, thinking I was freeing up memory. But this does not appear to be the case.
Example code:
<code>
use strict;
use warnings;
use JSON;

my %pairs;    # partial records, keyed by query id

while (my $l = <>) {
    chomp $l;
    my @vals = split /;/, $l;
    if ($vals[0] =~ /Query/) {
        $pairs{$vals[1]}{$vals[2]} = $vals[3];
    } elsif ($vals[0] =~ /Answer/) {
        $pairs{$vals[1]}{$vals[2]} = $vals[3];
        # record is complete: emit it, then drop it from the hash
        my $json = encode_json $pairs{$vals[1]};
        print $json . "\n";
        delete $pairs{$vals[1]};
    }
}
</code>
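<p>For reference, I run it like this (the script and file names here are just examples):
<code>
perl consolidate.pl input.log > output.json
</code>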
Example data:
<code>
Query;1;host;www.example.com
Answer;1;ip;1.2.3.4
Query;2;host;www.cnn.com
Query;3;host;www.google.com
Answer;2;ip;2.3.4.5
Answer;3;ip;3.4.5.6
</code>
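<p>With that data, the code should print one JSON object per completed id, something like this (key order may vary, since encode_json does not sort keys):
<code>
{"host":"www.example.com","ip":"1.2.3.4"}
{"host":"www.cnn.com","ip":"2.3.4.5"}
{"host":"www.google.com","ip":"3.4.5.6"}
</code>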
<p>Does delete actually remove the storage from the hash?
<p>Does the memory the hash is using actually get reduced after delete?
<p>Is there a better way to do this?
<p><b>Code updated above per the first reply.</b>