PerlMonks |
SFTP more than 50+ files
by msk_0984 (Friar) on Jan 31, 2008 at 16:52 UTC [id://665385]
msk_0984 has asked for the wisdom of the Perl Monks concerning the following question:
Hi Respected Monks,

I have been using Net::SSH2 in one of our critical projects: a web-based interface for distributing files onto remote systems. The project was initially implemented with Net::FTP, and the client was very happy with the product, but he wanted to make it secure and asked us to use an SSH-based module (we have to stay with the SSH protocol in any case).

We first went with Net::SSH::Perl, but faced a lot of dependencies, and even after successfully installing all the modules we could see that the application was using very high CPU on our Solaris platform. So we switched to the newer Net::SSH2 module with its SFTP support, which is handy and easy to use. We developed the application with it, and it works fine at our end.

But later the client, as usual, came back and said he wants to transfer more than 50-60 files to 30+ remote systems. We used Net::SSH2 and threads to connect to the remote systems in parallel. When he tried this, the application took around 15 minutes with CPU utilization reaching maximum levels (90%+), and as expected the browser also timed out, which brought the product to a halt.

My main concern is: can we transfer 50-60+ files to remote systems with less CPU utilization? We checked the Perl modules available and could not find a better one for this scenario. Is there any module that would be handy for this requirement? This is the part of the code which transfers the files to the remote systems:
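[The original code sample did not survive in this copy of the post. As a hedged sketch of the approach described above (one thread per host, one Net::SSH2 session per host reused for all files), where the host list, credentials, and paths are placeholders, not the poster's actual values:]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Net::SSH2;
use Fcntl qw(O_WRONLY O_CREAT O_TRUNC);

# Placeholder hosts, credentials, and paths -- illustrative only.
my @hosts = qw(host1.example.com host2.example.com);
my @files = glob('/var/staging/*');       # files to distribute
my ($user, $pass) = ('deploy', 'secret');

sub push_files {
    my ($host) = @_;

    # One SSH session per host, reused for every file, so we do not
    # pay the connect/handshake cost 50+ times per host.
    my $ssh2 = Net::SSH2->new();
    $ssh2->connect($host)              or die "connect $host failed";
    $ssh2->auth_password($user, $pass) or die "auth on $host failed";

    my $sftp = $ssh2->sftp() or die "sftp channel on $host failed";

    for my $local (@files) {
        my ($name) = $local =~ m{([^/]+)$};
        open my $in, '<:raw', $local or die "open $local: $!";

        # Create/truncate the remote file with mode 0644.
        my $remote = $sftp->open("/var/incoming/$name",
                                 O_WRONLY | O_CREAT | O_TRUNC, 0644)
            or die "remote open of $name on $host failed";

        # Stream the file in 64 KB chunks.
        while (read($in, my $buf, 64 * 1024)) {
            $remote->write($buf);
        }
        close $in;
    }
    $ssh2->disconnect();
    return 1;
}

# One worker thread per remote host, as described in the question.
my @workers = map { threads->create(\&push_files, $_) } @hosts;
$_->join() for @workers;
```

This sketch requires a live remote host and valid credentials, so it is not runnable standalone; it only illustrates the session-reuse and per-host threading structure the question describes.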
Hope you do the needful.
Thanks in advance, Sushil Kumar