Best way is to use File::Copy. The FAQ section you're referring to is dealing with tee-ing filehandles, so that anything one prints goes into three files (if I'm understanding which section you're talking about correctly). If you're simply copying files, it's easiest to copy each one in turn. Using File::Copy, you could do something like:
use strict;
use File::Copy;
my $source = 'src1';
foreach my $dest ( 'dest_1', 'dest_2', 'dest_3' ) {
copy($source, $dest);
}
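Note that copy() returns true on success and false on failure (setting $!), so it's worth checking the return value. A minimal error-checked sketch (the file names are hypothetical, and the source file is created here just so the example is self-contained):

```perl
use strict;
use warnings;
use File::Copy;

# hypothetical source file, created here so the sketch runs as-is
my $source = 'src1';
open my $out, '>', $source or die "Cannot create $source: $!\n";
print $out "example data\n";
close $out;

foreach my $dest ( 'dest_1', 'dest_2', 'dest_3' ) {
    copy( $source, $dest )
        or die "Copy of $source to $dest failed: $!\n";
}
```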
stephen
Depending on the size of the file, it may not fit in the memory the machine allocates
to disk buffering, which means that if you use File::Copy you will physically
reread the same disk sectors over and over again, once per destination. If this is
true, the most efficient method of writing to multiple files would be to read the
input once, a line at a time, and write each line out to all the output files.
Something like:
my %fd = (
'file1' => 'fd1',
'/tmp/foo' => 'fd2',
'/home/me/file' => 'fd3',
);
my $in = shift || die "No file specified.\n";
open IN, $in or die "Cannot open $in for input: $!\n";
foreach( keys %fd ) {
open $fd{$_}, ">$_" or die "Cannot open $_ for output: $!\n";
}
while( <IN> ) {
foreach my $fd( keys %fd ) {
print $fd $_;
}
}
foreach( keys %fd ) {
close $fd{$_};
}
-- g r i n d e r
This line:
foreach my $fd( keys %fd ) {
should be:
foreach my $fd( values %fd ) {
And instead of reading it in line by line, it would be more efficient to read a chunk at a time, so instead of
while( <IN> ) {
use
while( read(IN, $_, 4096) ) {
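Putting both corrections together, a complete sketch might look like the following (the input and output file names are hypothetical, and the input file is generated here so the example runs standalone; the original took the input name from @ARGV):

```perl
use strict;
use warnings;

# hypothetical input file, created here so the sketch is self-contained
my $in = 'sample_input';
open my $make, '>', $in or die "Cannot create $in: $!\n";
print $make 'x' x 10_000;    # big enough to exercise several 4K chunks
close $make;

# hypothetical output file names, mapped to lexical filehandles
my %fd = map { $_ => undef } ( 'dest_1', 'dest_2', 'dest_3' );

open my $src, '<', $in or die "Cannot open $in for input: $!\n";
foreach my $file ( keys %fd ) {
    open $fd{$file}, '>', $file
        or die "Cannot open $file for output: $!\n";
}

# read a 4K chunk at a time and write it to every output handle
my $buf;
while ( read( $src, $buf, 4096 ) ) {
    foreach my $fh ( values %fd ) {    # values, not keys
        print $fh $buf;
    }
}

close $src;
close $fd{$_} for keys %fd;
```

This reads each disk block of the source exactly once, no matter how many output files there are.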
You will probably want to take a look at the module
File::Copy which is included in current distributions.