TRoderic has asked for the wisdom of the Perl Monks concerning the following question:
I am currently working on a document processor that reads most of a directory into memory, parses the files hidden therein, and then dumps the resultant data to several different files on disc (the reason is obscure and distressing).
I have found, however, that the log file writes are becoming a bottleneck, given that they go to several different files (and at some point soon probably over network shares). This is an issue because there are about 800k files of ~1 MB each to do on a more-than-daily basis (while waiting for the DBA to get on with it).
Therefore I would like to store some quantity of the output in memory until a limit is reached, then dump it to disc, viz.:
    <<loop>> {
        if ($hashmatch{$scrutiny}) {
            $$bufmem = $$bufmem . "<<secret parsing output goes here!>>\n";
            if (length($$bufmem) > 1024) {
                print("buffer dump\n");
                $outfh->print($$bufmem);
                $$bufmem = "";
            }
        }
    }
    $outfh->print($$bufmem);

I am sure there is a better/more portable way of doing this, but I can't seem to describe it to the search engine daemons so as to get the right sort of answer. Can anyone describe, direct, or otherwise enlighten me in this regard?
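The pattern above can be sketched as a small self-contained script: accumulate lines in a scalar buffer and flush it to the real filehandle once it passes a size threshold, with a final flush at the end. The names here (`$buffer`, `$limit`, `flush_buffer`, `out.log`) and the generated records are illustrative, not from the original post; the threshold is whatever trade-off between memory use and write frequency suits the workload.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $limit  = 64 * 1024;   # flush threshold in bytes (illustrative)
my $buffer = '';

open my $outfh, '>', 'out.log' or die "open: $!";

# Write the buffer out and empty it.
sub flush_buffer {
    my ($fh, $buf_ref) = @_;
    print {$fh} $$buf_ref;
    $$buf_ref = '';
}

# Stand-in for the parsing loop: generate some records.
for my $line (map { "record $_\n" } 1 .. 10_000) {
    $buffer .= $line;
    flush_buffer($outfh, \$buffer) if length($buffer) > $limit;
}

flush_buffer($outfh, \$buffer);   # flush whatever is left over
close $outfh or die "close: $!";
```

Note that perl's stdio layer already buffers writes per filehandle, so an alternative is simply to leave autoflush off and let PerlIO batch the writes; an explicit buffer like this mainly helps when you want direct control over when and how much hits the disc.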
yrs,
TR
Replies are listed 'Best First'.
Re: temp hold logfiles in memory?
by Eliya (Vicar) on Mar 19, 2011 at 16:14 UTC
Re: temp hold logfiles in memory?
by ELISHEVA (Prior) on Mar 19, 2011 at 20:35 UTC
    by deibyz (Hermit) on Mar 21, 2011 at 09:19 UTC
    by TRoderic (Novice) on Mar 20, 2011 at 12:32 UTC
Re: temp hold logfiles in memory?
by graff (Chancellor) on Mar 19, 2011 at 17:04 UTC
    by TRoderic (Novice) on Mar 19, 2011 at 17:22 UTC
    by flexvault (Monsignor) on Mar 20, 2011 at 17:41 UTC