Net::FTP speed

by Anonymous Monk
on Mar 29, 2007 at 14:35 UTC ( [id://607273] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello Fellow Monks!
I am completely new at this.
I have a script that makes use of Net::FTP to transfer files via FTP.
Given the bandwidth I have available, it seems the transfers are not as fast as they could be.
Is there anything (additional modules, sub-modules, or parameters) that I could add to the script to speed things up?
Any and all advice would be greatly appreciated.
Thank you in advance for your help.

Replies are listed 'Best First'.
Re: Net::FTP speed
by Ionitor (Scribe) on Mar 29, 2007 at 14:37 UTC
    If you use a standard FTP client, is it faster? Many FTP servers out there are unable/unwilling to provide a single user with the bandwidth necessary to max out a modern broadband connection.
Re: Net::FTP speed
by moklevat (Priest) on Mar 29, 2007 at 14:47 UTC
    Hello Anonymonk,

    Have you actually compared the rate of your script's file transfers to file transfers using ftp from the command line? Regardless of the size of your local pipe (it's all pipes, isn't it?), the rate of the transfer may be limited by a high load on the remote server or bandwidth constraints somewhere upstream.

    Having said that, if the local and remote machines use the same type of line ending, you may see some speed-up from using binary mode (i.e. no transformation of the data being transmitted).

    $ftp->binary() or die "Cannot change to binary mode";

    Perhaps some other monk has some insight into whether futzing with the block size would make any difference.
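
    Since you say you're completely new at this, here is roughly how the binary-mode call above fits into a complete transfer. This is only a minimal sketch; the host, credentials, and file names are placeholders, not anything from your post:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;

    # Placeholder host/credentials/file names -- substitute your own.
    my $ftp = Net::FTP->new('ftp.example.com', Passive => 1, Timeout => 120)
        or die "Cannot connect: $@";
    $ftp->login('user', 'password') or die "Cannot login: ", $ftp->message;

    # Binary (image) mode: the data is sent untouched, with no
    # line-ending translation on either end.
    $ftp->binary() or die "Cannot change to binary mode: ", $ftp->message;

    $ftp->put('local_file.dat', 'remote_file.dat')
        or die "put failed: ", $ftp->message;
    $ftp->quit;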

Re: Net::FTP speed
by nimdokk (Vicar) on Mar 30, 2007 at 12:08 UTC
    As others have pointed out, FTP speed will depend on a lot of other factors, and "not as fast as they could be" is somewhat subjective. Have you timed the transfers, and then timed them again using other methods (e.g. command-line FTP, a GUI FTP client, etc.)? Keep everything else the same (transfer the same file several times with each client) and average the times and rates. Things outside your control could also be affecting these speeds (time of day, traffic on the networks between you and the remote server). There may also be some throttling going on between the client and server that you aren't even aware of (I've seen it happen). If, at the end of the day, it's something outside of your control, there's nothing you can do about it apart from burning some offerings to appease the network gods :-)

    Hope that helps a bit.
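
    If you'd rather do that timing from Perl itself, something along these lines would do. Again, only a sketch: the host, login, and test file name are placeholders, and you'd adjust the number of runs to taste.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;
    use Time::HiRes qw(time);

    my $runs = 5;                     # average over several transfers
    my $file = 'testfile.dat';        # placeholder test file
    my $size = -s $file or die "Cannot stat $file";

    my $total = 0;
    for my $i (1 .. $runs) {
        my $ftp = Net::FTP->new('ftp.example.com', Passive => 1)
            or die "Cannot connect: $@";
        $ftp->login('user', 'password') or die "Cannot login: ", $ftp->message;
        $ftp->binary();

        my $start   = time();
        $ftp->put($file) or die "put failed: ", $ftp->message;
        my $elapsed = time() - $start;
        $ftp->quit;

        printf "run %d: %.2f s (%.1f KB/s)\n", $i, $elapsed, $size / 1024 / $elapsed;
        $total += $elapsed;
    }
    printf "average: %.2f s (%.1f KB/s)\n",
        $total / $runs, $size / 1024 / ($total / $runs);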

      I've just had a similar problem with FTP between two gigabit-connected boxes that wouldn't transfer at more than 100KB/sec. It took a while, but eventually I found that the default BlockSize in Net::FTP is 10240 bytes (compared to 4096 bytes in the command-line FTP.exe). By tweaking the BlockSize, the transfer rate went from 100KB/sec to 2000+KB/sec. The next exercise is to optimise the blocksize for each server we use, depending on network, firewalls, throttling, etc...
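
      For anyone wanting to experiment with that, BlockSize can be passed to the Net::FTP constructor. A rough sketch for trying a few candidate sizes might look like this (the host, login, file name, and candidate sizes are placeholders, not values from the post):

      #!/usr/bin/perl
      use strict;
      use warnings;
      use Net::FTP;
      use Time::HiRes qw(time);

      my $file = 'testfile.dat';           # placeholder test file
      my $size = -s $file or die "Cannot stat $file";

      # Default BlockSize is 10240 bytes; the best value depends on the
      # network, firewalls, throttling, etc., so these are just candidates.
      for my $blocksize (4096, 10240, 65536, 262144) {
          my $ftp = Net::FTP->new('ftp.example.com',
                                  Passive   => 1,
                                  BlockSize => $blocksize)
              or die "Cannot connect: $@";
          $ftp->login('user', 'password') or die "Cannot login: ", $ftp->message;
          $ftp->binary();

          my $start   = time();
          $ftp->put($file) or die "put failed: ", $ftp->message;
          my $elapsed = time() - $start;
          $ftp->quit;

          printf "BlockSize %6d: %.1f KB/s\n", $blocksize, $size / 1024 / $elapsed;
      }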
