PerlMonks |
Hello Knowledgeable Ones,

I have around 40 logfiles of about 15 MB each. Around 30 processes write willy-nilly into these files; each line contains text which identifies the process. I would like to untangle the log files and produce a single file for each process.

Naively, I could slurp all the logfiles, use a regex or split to extract the process ID from each line, and then write the line to the appropriate new log file. However, I suspect that slurping all the data might cause memory issues, and apart from that I would like to know what a more scalable approach would look like.

Any ideas?

Thanks,

loris

"It took Loris ten minutes to eat a satsuma . . . twenty minutes to get from one end of his branch to the other . . . and an hour to scratch his bottom. But Slow Loris didn't care. He had a secret . . ." (from "Slow Loris" by Alexis Deacon)

In reply to Untangling Log Files by loris
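For what it's worth, slurping isn't needed at all: reading line by line and keeping one open output filehandle per process keeps memory bounded regardless of file size. Here is a minimal sketch of that idea. It assumes, purely for illustration, that each line begins with a process identifier followed by a colon (e.g. `proc7: did something`); the regex would need adjusting to the real log format.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %fh;    # cache of output filehandles, keyed by process ID

# The diamond operator reads every file named on the command line,
# one line at a time, so memory use stays constant.
while ( my $line = <> ) {

    # Hypothetical line format: "procname: message ..."
    my ($proc) = $line =~ /^(\w+):/ or next;

    # Open the per-process output file on first sight, then reuse it.
    unless ( $fh{$proc} ) {
        open $fh{$proc}, '>>', "$proc.log"
            or die "Cannot open $proc.log: $!";
    }

    print { $fh{$proc} } $line;
}

close $_ for values %fh;
```

Run as e.g. `perl untangle.pl *.log`. With only ~30 processes the filehandle cache stays tiny; at most one line of input is in memory at a time, so 40 files of 15 MB each pose no problem.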