http://qs321.pair.com?node_id=548208


in reply to parallel reading

Here's another,

open my $out, '>', 'ABC' or die $!;
{
    local $_;
    open my $A, '<', 'A' or die $!;
    open my $B, '<', 'B' or die $!;
    open my $C, '<', 'C' or die $!;
    no warnings 'uninitialized';
    while ($_ = <$A> . <$B> . <$C>) {
        s/\n//g;
        print $out $_, "\n";
    }
}
close $out or warn $!;
That will let the files have different numbers of lines. Memory use is small and independent of file size.

Update: Repaired the thinko blazar++ spotted. Empty lines are not a problem - we don't chomp, so they retain newlines until we s/// them gone. I like blazar's extension to different numbers of files.

After Compline,
Zaxo

Re^2: parallel reading
by blazar (Canon) on May 09, 2006 at 14:42 UTC

    Nice approach. And may be merged with mine, e.g.:

    #!/usr/bin/perl -l
    use strict;
    use warnings;

    my @fh = map {
        open my $fh, '<', $_ or die "Can't open `$_': $!\n";
        $fh
    } @ARGV;

    no warnings 'uninitialized';
    print while $_ = join '', map { chomp(my $line = <$_>); $line } @fh;

    __END__

    However:

    • you should s/undefined/uninitialized/;
    • it may not be fully reliable if empty lines are to be expected in the files.

    Update: the second point was a thinko as Zaxo pointed out.
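    The merged version takes its file names from @ARGV, so it can be exercised on any number of inputs (a sketch, assuming two throwaway files A and B in the current directory):

    ```shell
    # Scratch input files of different lengths
    printf 'x1\nx2\n' > A
    printf 'y1\n'     > B

    # blazar's approach: one map reads a line from every handle,
    # chomps it, and joins the pieces; -l restores the newline on print.
    perl -l -e '
        use strict;
        use warnings;
        my @fh = map {
            open my $fh, "<", $_ or die "Cannot open $_: $!\n";
            $fh
        } @ARGV;
        no warnings "uninitialized";
        print while $_ = join "", map { chomp(my $line = <$_>); $line } @fh;
    ' A B > merged.txt

    cat merged.txt    # x1y1 / x2
    ```

    The loop stops when every handle returns undef and the join produces an empty string, which is false.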