PerlMonks |
Clearly, multithreading is not an option in this case. (Nine women can’t make a baby in one month, etc.) Since your logic appears to consist of reading one enormous file and routing its lines to a few others, with no real processing in between, the only ruling constraint here appears to be file-buffering behavior. You need to cue the operating system to read and write these files in very large gulps, to relieve the strain on the disk drive’s hardware mechanisms.

The following post on StackOverflow.com appears to discuss this general issue directly. From the first response (in 2009):

    You can affect the buffering, assuming that you're running on an O/S that supports setvbuf. See the documentation for IO::Handle. You don't have to explicitly create an IO::Handle object as in the documentation if you're using perl 5.10; all handles are implicitly IO::Handles since that release.

Their specific recommendation was:
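For Monks reading along, here is a minimal sketch of what that advice looks like in practice. The filenames and the 1 MB buffer size are hypothetical, chosen only for illustration, and the whole thing is guarded with eval because IO::Handle’s setvbuf exists only where the platform’s stdio supports it (many modern PerlIO builds do not compile it in at all, which itself bears on the question of relevance below):

```perl
use strict;
use warnings;
use IO::Handle qw(_IOFBF);   # fully-buffered mode constant, exported on request

# Hypothetical filenames, for illustration only.
open my $in,  '<', 'huge_input.log'  or die "open input: $!";
open my $out, '>', 'split_part1.log' or die "open output: $!";

my $size   = 1024 * 1024;    # ask stdio for 1 MB buffers on each handle
my $inbuf  = ' ' x $size;    # setvbuf requires pre-allocated scalars that
my $outbuf = ' ' x $size;    # stay in scope for the life of the handle

# setvbuf is only available if perl was built with support for it,
# so probe inside eval rather than dying on older/newer builds.
eval {
    $in->setvbuf($inbuf,   _IOFBF, $size);
    $out->setvbuf($outbuf, _IOFBF, $size);
    1;
} or warn "setvbuf not available on this build; using default buffering\n";

# Route lines from the big input file to the output file.
while (my $line = <$in>) {
    print {$out} $line or die "write: $!";
}
close $out or die "close: $!";
```

Note that the large pre-allocated scalars must not go out of scope while the handles are in use, per the IO::Handle documentation’s warning.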
Having found this, I now turn the question over to other Monks: is this still relevant?

In reply to Re: Multithreading a large file split to multiple files
by sundialsvc4