perlquestion
solegaonkar
Hello Monks
I have a strange problem here. I read a huge file (1+ GB) 1000 records at a time. I read the records into a hash, process them, and then try to free the memory with undef. Then I process the next 1000 records, and so on.
But strangely, the code runs out of memory after a while. It looks like garbage collection is not being triggered in time. Is there some way I can force garbage collection?
Have you faced such a problem?
The code is something like this...
<code>
my %map;
my $count = 0;
while (sysread(CSV, $record, 66)) {
    $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
    if ($count++ > 1000) {
        &process();
        undef %map;
        $count = 0;   # reset, or process() fires on every record after the first chunk
    }
}
</code>
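For background: Perl uses reference counting rather than a tracing garbage collector, so memory is reclaimed as soon as the last reference goes away; there is no separate collection pass to trigger. A minimal runnable sketch (the Tracker class and @events log are made up for illustration) showing that a lexical hash's contents are released the moment the hash leaves scope:
<code>
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical class whose DESTROY makes deallocation visible by
# logging to @main::events.
package Tracker;
sub new     { bless {}, shift }
sub DESTROY { push @main::events, 'freed' }

package main;
our @events;
for my $chunk (1, 2) {
    my %map;                    # fresh lexical hash each iteration
    $map{obj} = Tracker->new;
    push @events, "chunk $chunk";
}   # %map leaves scope here, so its Tracker is destroyed immediately
print "@events\n";              # chunk 1 freed chunk 2 freed
</code>
Note that even when Perl frees memory internally this way, the interpreter usually keeps the pages in its own pool for reuse rather than returning them to the OS, so the process footprint may not visibly shrink.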
Update:
The problem was solved by using scope rather than delete/undef. I updated the code to the following, and it performs much better!
<code>
while (sysread(CSV, $record, 66)) {
    my %map;                  # lexical: freed automatically when each chunk ends
    my $count = 0;
    $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
    while ($count++ < 1000 && sysread(CSV, $record, 66)) {
        $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
    }
    &process(\%map);          # one call per chunk, including the final partial one
}
</code>
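The scoping trick above can be exercised end-to-end without a 1 GB file by reading from an in-memory filehandle. This is a hedged, self-contained sketch, not the poster's actual data: it invents 6-byte records (2-byte key, 2-byte subkey, 2-byte value), uses a chunk size of 3 instead of 1000, and uses read() rather than sysread() because buffered reads are the safe choice on in-memory handles; counting chunks stands in for process(\%map).
<code>
#!/usr/bin/perl
use strict;
use warnings;

# Seven hypothetical fixed-width records packed into a scalar, then
# opened as an in-memory file.
my $data = join '', map { sprintf '%02d%02d%02d', $_, $_ + 1, $_ + 2 } 1 .. 7;
open my $fh, '<', \$data or die $!;

our $chunks = 0;
while (read($fh, my $record, 6)) {
    my %map;                                  # fresh hash per chunk, freed at scope exit
    my $count = 0;
    do {
        $map{ substr $record, 0, 2 }{ substr $record, 2, 2 }
            = substr $record, 4, 2;
    } while (++$count < 3 && read($fh, $record, 6));
    $chunks++;                                # stand-in for process(\%map)
}
print "$chunks chunks\n";                     # 7 records / 3 per chunk = 3 chunks
</code>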
Thank you all for your help!!