http://qs321.pair.com?node_id=598981

loris has asked for the wisdom of the Perl Monks concerning the following question:

Hello Knowledgeable Ones,

I have around 40 logfiles of about 15 MB each. Around 30 processes write willy-nilly into these files, and each line contains text that identifies the process that wrote it. I would like to untangle the log files to produce a single file for each process.

Naively I could try to slurp all the logfiles, then use grep or split to get the process ID from each line, and write the line to the appropriate new log file. However, I suspect I might run into memory issues slurping all the data, and in any case I would like to know what a more scalable approach would be.
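A streaming version of that idea might look something like the sketch below: read each logfile line by line (so only one line is ever in memory), pull out the process identifier, and append the line to a per-process output file, caching the output filehandles in a hash. The regex in `extract_pid` is purely an assumption that each line starts with a tag like `[procname]` — it would need adjusting to the real log format, and the file and directory names are likewise hypothetical.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# ASSUMPTION: each log line begins with "[processname]".
# Adjust this regex to match the real tagging convention.
sub extract_pid {
    my ($line) = @_;
    my ($pid) = $line =~ /^\[(\w+)\]/;
    return $pid;    # undef if the line doesn't match
}

# One pass over each input file; per-process output handles are
# opened lazily and cached so we open each output file only once.
sub split_logs {
    my ($out_dir, @logfiles) = @_;
    my %out_fh;
    for my $logfile (@logfiles) {
        open my $in, '<', $logfile or die "Can't read $logfile: $!";
        while ( my $line = <$in> ) {
            my $pid = extract_pid($line) // next;   # skip untagged lines
            unless ( $out_fh{$pid} ) {
                open $out_fh{$pid}, '>>', "$out_dir/$pid.log"
                    or die "Can't append to $out_dir/$pid.log: $!";
            }
            print { $out_fh{$pid} } $line;
        }
        close $in;
    }
    close $_ for values %out_fh;
}

# Hypothetical usage: split_logs('untangled', glob 'logs/*.log');
```

With ~30 processes the filehandle cache stays well under typical OS open-file limits, and memory use is bounded by the longest single line rather than the 600 MB of combined logs.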

Any ideas?

Thanks,

loris


"It took Loris ten minutes to eat a satsuma . . . twenty minutes to get from one end of his branch to the other . . . and an hour to scratch his bottom. But Slow Loris didn't care. He had a secret . . ." (from "Slow Loris" by Alexis Deacon)