http://qs321.pair.com?node_id=1090101

solegaonkar has asked for the wisdom of the Perl Monks concerning the following question:

Hello Monks, I have a strange problem here. I read a huge file (1+ GB) 1000 records at a time. I read the records into a hash, process them, and then try to free the memory using undef. Then I process the next 1000 records, and so on. But strangely, the code runs out of memory after a while. It looks like the garbage collection is not triggered in time. Is there some way I can force garbage collection? Have you faced such a problem? The code is something like this...
my %map;
my $count = 0;
while (sysread(CSV, $record, 66)) {
    $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
    if ($count++ > 1000) {
        &process();
        undef %map;
    }
}
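For reference, here is a minimal, self-contained sketch of that approach (the file name records.dat, the open call, and the stub process() routine are assumptions for illustration, and the batch counter is reset after each full batch):

use strict;
use warnings;

sub process {
    my ($batch) = @_;
    # placeholder for the real per-batch processing
}

open my $csv, '<', 'records.dat' or die "open records.dat: $!";  # hypothetical file name
binmode $csv;

my %map;
my $count = 0;
my $record;
while (sysread($csv, $record, 66)) {
    $map{ substr($record, 18, 14) }{ substr($record, 3, 15) }
        = substr($record, 36, 29);
    if (++$count >= 1000) {
        process(\%map);   # hand the full batch to the processing routine
        undef %map;       # release the batch's keys and values
        $count = 0;       # start counting the next batch
    }
}
process(\%map) if $count; # handle the final partial batch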
Update: The problem was solved by using lexical scope rather than delete/undef... I updated the code to the following, and it performs much better!!
while (sysread(CSV, $record, 66)) {
    my %map;
    my $count = 0;
    $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
    while (sysread(CSV, $record, 66)) {
        $map{substr($record, 18, 14)}->{substr($record, 3, 15)} = substr($record, 36, 29);
        if ($count++ > 1000) {
            &process(\%map);
            last;
        }
    }
    &process(\%map);   # process any records left over when the file ends mid-batch
}
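The same idea can also be written with a single read loop, declaring %map inside an outer block so that each batch is freed automatically when it goes out of scope (again, the file name and the process() stub are only assumptions for illustration):

use strict;
use warnings;

sub process {
    my ($batch) = @_;
    # placeholder for the real per-batch processing
}

open my $csv, '<', 'records.dat' or die "open records.dat: $!";  # hypothetical file name
binmode $csv;

my $record;
while (1) {
    my %map;                       # fresh hash for every batch
    my $count = 0;
    while (sysread($csv, $record, 66)) {
        $map{ substr($record, 18, 14) }{ substr($record, 3, 15) }
            = substr($record, 36, 29);
        last if ++$count >= 1000;  # batch is full
    }
    last unless $count;            # nothing was read, so we are at end of file
    process(\%map);                # %map is released when it goes out of scope here
}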
Thank you all for your help!!