An order of magnitude is not actually correct, although the point is *of course* {sigh} valid. For small files the difference is negligible; the gains increase with file size. Nonetheless, faster is faster, and faster is better.
#!/usr/bin/perl -w
use strict;
my $file = shift;
my $iter = shift;
my $size = -s $file;
print "File size $size Bytes\n";
my $start = time;
# slurp by joining the list of lines read in list context
for (1..$iter) {
    open FILE, $file or die $!;
    my $stuff = join '', <FILE>;
    close FILE;
}
printf "Join took %d seconds for %d iterations\n", (time-$start), $iter;
$start = time;
# slurp the whole file in one read by locally undefining $/
for (1..$iter) {
    {
        local $/;
        open FILE, $file or die $!;
        my $stuff = <FILE>;
        close FILE;
    }
}
printf "Undef \$/ took %d seconds for %d iterations\n", (time-$start), $iter;
__END__
C:\>perl test.pl bigfile.txt 2000
File size 47753 Bytes
Join took 72 seconds for 2000 iterations
Undef $/ took 21 seconds for 2000 iterations
C:\>perl test.pl test.pl 2000
File size 878 Bytes
Join took 8 seconds for 2000 iterations
Undef $/ took 7 seconds for 2000 iterations
C:\>
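For what it's worth, the same undef-`$/` slurp reads a little more cleanly wrapped in a sub with a lexical filehandle and three-arg open; a minimal sketch (the `slurp` name is just for illustration):

```perl
use strict;
use warnings;

# Slurp a whole file in one read by locally undefining the input
# record separator $/ -- the same trick benchmarked above, but with
# a lexical filehandle and three-arg open.
sub slurp {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    local $/;                  # undef => read to EOF in one go
    my $contents = <$fh>;
    close $fh;
    return $contents;
}
```

With that, `my $stuff = slurp($file);` replaces the open/local/close dance at each call site, and `local $/` is scoped to the sub so the rest of the program still reads line by line.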
cheers
tachyon
s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print