http://qs321.pair.com?node_id=894160

TRoderic has asked for the wisdom of the Perl Monks concerning the following question:

Hello!

Currently I am working on a document processor that reads most of a directory into memory, parses the files hidden therein, and then dumps the resultant data to several different files on disc (the reason is obscure and distressing).

I have found, however, that the log file writes are becoming a bottleneck, given they go to several different files (and at some point soon, probably over network shares). That is an issue, since there are about 800k files of roughly 1 MB each to process on a more than daily basis (while waiting for the DBA to get on with it).

Therefore I would like to accumulate some quantity of the output in memory until a limit is reached, then dump it to disc, viz:

<<loop>> {
    if ($hashmatch{$scrutiny}) {
        $$bufmem .= "<<secret parsing output goes here!>>\n";
        if (length($$bufmem) > 1024) {
            print("buffer dump\n");
            $outfh->print($$bufmem);
            $$bufmem = "";
        }
    }
}
$outfh->print($$bufmem);
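
For reference, something along these lines, generalised to one buffer per destination file, is roughly what I have in mind. This is only a rough sketch; the $BUFFER_LIMIT, write_buffered, flush_one and flush_all names are made up, not anything I actually have:

    use strict;
    use warnings;

    my $BUFFER_LIMIT = 64 * 1024;   # flush a buffer once it exceeds 64 KB
    my %buffer;                     # output path => pending text
    my %handle;                     # output path => open filehandle

    # append text destined for a given output file, flushing if the buffer is full
    sub write_buffered {
        my ($path, $text) = @_;
        $buffer{$path} = ($buffer{$path} // '') . $text;
        flush_one($path) if length($buffer{$path}) > $BUFFER_LIMIT;
    }

    # write out and clear one buffer, opening its handle lazily
    sub flush_one {
        my ($path) = @_;
        return unless length($buffer{$path} // '');
        $handle{$path} //= do {
            open my $fh, '>>', $path or die "cannot open $path: $!";
            $fh;
        };
        print { $handle{$path} } $buffer{$path};
        $buffer{$path} = '';
    }

    # flush whatever is still pending, e.g. at end of run
    sub flush_all {
        flush_one($_) for keys %buffer;
        close $_ for values %handle;
    }

    # usage:
    #   write_buffered('parsed_a.log', "<<secret parsing output goes here!>>\n");
    #   ...
    #   flush_all();   # so the tail of each buffer reaches disc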
I am sure there is a better or more portable way of doing this, but I can't seem to describe it to the search engine daemons so as to get the right sort of answer. Can anyone describe, direct, or otherwise enlighten me in this regard?

yrs,

TR