swares has asked for the wisdom of the Perl Monks concerning the following question:
This code seems to work, but it runs out of memory on large files. I would think the output needs to be written to the local file periodically while the transfer runs, instead of being cached in memory until the transfer completes or fails, but I don't know quite how to go about it. We would use this to back up directories on remote machines without creating a tarball on the remote machine.
Does anyone know how to make it work with very large files?
use Net::SSH::Perl;

# vars to set: $user, $password, $host, $local_fqn, $rmt_dir

open STDOUT, ">$local_fqn" or die "Can't redirect stdout: $!";
select STDOUT;
$| = 1;    # make unbuffered

my $cmd = "tar -czf - $rmt_dir";
my $ssh = Net::SSH::Perl->new($host);
$ssh->login($user, $password);

# cmd() collects the command's entire output in memory before
# returning it -- this is where large transfers blow up.
my($stdout, $stderr, $exit) = $ssh->cmd($cmd);
close STDOUT;
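One way to avoid buffering the whole stream is to register output handlers instead of relying on the return values of cmd(). Under SSH-2, Net::SSH::Perl's register_handler() accepts the special names "stdout" and "stderr" and calls your sub with a channel object and a buffer object for each chunk as it arrives, so you can write each chunk straight to disk. Here is a sketch along those lines; the host, credentials, and paths are placeholders, and I haven't run this against your setup:

    use strict;
    use warnings;
    use Net::SSH::Perl;

    # Hypothetical settings -- substitute your own values.
    my ($user, $password, $host) = ('backup', 'secret', 'remote.example.com');
    my ($local_fqn, $rmt_dir)    = ('/backups/remote.tar.gz', '/etc');

    open my $out, '>', $local_fqn or die "Can't open $local_fqn: $!";
    binmode $out;    # the gzipped tar stream is binary data

    # protocol => 2: the named "stdout"/"stderr" handlers are an
    # SSH-2 feature in Net::SSH::Perl.
    my $ssh = Net::SSH::Perl->new($host, protocol => 2);
    $ssh->login($user, $password);

    # Write each chunk to disk as it arrives instead of letting
    # cmd() accumulate the whole stream in memory.
    $ssh->register_handler('stdout', sub {
        my ($channel, $buffer) = @_;
        print $out $buffer->bytes;
    });
    $ssh->register_handler('stderr', sub {
        my ($channel, $buffer) = @_;
        warn $buffer->bytes;
    });

    my (undef, undef, $exit) = $ssh->cmd("tar -czf - $rmt_dir");
    close $out or die "Can't close $local_fqn: $!";
    die "remote tar exited with status $exit\n" if $exit;

Because the handlers consume the data, the $stdout return value from cmd() stays empty and memory use stays flat regardless of how big the remote directory is. This also leaves STDOUT alone, which is tidier than redirecting it.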