One option is to use Net::FTP, and specifically set the Timeout value (see the section in the documentation titled CONSTRUCTOR).
It's possible that you may not be able to transfer such a huge file all at once, in which case another option is to split it into multiple pieces, and reassemble it at the other end (assuming you have login or ssh capabilities at the remote end). If that's an option, look at the Unix split command.
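If the Unix split command isn't available at one end, the same idea can be sketched in Perl. This is a minimal, untested-against-your-setup sketch; the chunk size and the `.NNN` suffix scheme are illustrative choices, not anything standard:

```perl
use strict;
use warnings;

# Split a file into fixed-size binary chunks (like `split -b`),
# returning the list of chunk filenames created.
sub split_file {
    my ($src, $chunk_size, $prefix) = @_;
    open my $in, '<:raw', $src or die "open $src: $!";
    my (@chunks, $buf);
    my $i = 0;
    while (read $in, $buf, $chunk_size) {
        my $name = sprintf '%s.%03d', $prefix, $i++;
        open my $out, '>:raw', $name or die "open $name: $!";
        print {$out} $buf;
        close $out or die "close $name: $!";
        push @chunks, $name;
    }
    close $in;
    return @chunks;
}

# Reassemble the chunks into one file (the remote end could just as
# well do `cat prefix.* > file`).
sub join_files {
    my ($dst, @chunks) = @_;
    open my $out, '>:raw', $dst or die "open $dst: $!";
    for my $chunk (@chunks) {
        open my $in, '<:raw', $chunk or die "open $chunk: $!";
        print {$out} do { local $/; <$in> };
        close $in;
    }
    close $out or die "close $dst: $!";
}
```

Transfer the chunks one at a time and you also get crude resumability for free: if a transfer dies, you only re-send the chunk that failed.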
s''(q.S:$/9=(T1';s;(..)(..);$..=substr+crypt($1,$2),2,3;eg;print$..$/
Net::FTP should be able to handle 1.5 GB files, but the problem may be elsewhere: does the server accept such large files and/or long transmissions? Do you have a file quota that allows 1.5 GB files? Note also that many FTP programs can resume interrupted transfers.
Most modern FTP servers/clients have the ability to restart uploads at a given byte position in the file. Check your manuals.
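Net::FTP exposes the protocol's REST command through its restart method; whether STOR actually honors the restart point depends on the server. A sketch of a resume helper, untested against a real server (host, credentials, and filenames are placeholders you'd supply):

```perl
use strict;
use warnings;
use Net::FTP;

# Resume an interrupted upload: ask the server how much arrived,
# seek past that in the local file, and restart the STOR there.
# Assumes the server supports REST with STOR and SIZE.
sub resume_put {
    my ($ftp, $local, $remote) = @_;
    $ftp->binary;
    my $done = $ftp->size($remote) // 0;   # bytes already on the server
    open my $fh, '<:raw', $local or die "open $local: $!";
    seek $fh, $done, 0 or die "seek: $!";  # skip what already arrived
    $ftp->restart($done);                  # issue REST before the STOR
    $ftp->put($fh, $remote) or die "put: ", $ftp->message;
    close $fh;
}
```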
You might like to watch the progress. See Tk and FTP with progress bars.; that code does downloads, but it could be converted to do uploads. There is also curl, a command-line interface to libcurl. You can set up a curl transfer with a callback for every chunk transferred; there are a few tutorials on the net and at the libcurl homepage.
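For a lighter-weight progress display, Net::FTP has a built-in hash method that prints a mark for every N bytes transferred. A sketch with placeholder connection details:

```perl
use strict;
use warnings;
use Net::FTP;

# Upload a file, printing a '#' to STDERR for each megabyte sent --
# a cheap progress indicator built into Net::FTP itself.
# Host, user, and password are placeholders.
sub put_with_progress {
    my ($host, $user, $pass, $file) = @_;
    my $ftp = Net::FTP->new($host, Timeout => 240)
        or die "connect: $@";
    $ftp->login($user, $pass) or die "login: ", $ftp->message;
    $ftp->binary;
    $ftp->hash(\*STDERR, 1024 * 1024);   # one mark per megabyte
    $ftp->put($file) or die "put: ", $ftp->message;
    $ftp->quit;
}
```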
Hi Anonymous Monk.
I agree with the other monks that Perl should be fully capable of receiving/transferring a 1.5 GB file. If an issue arises, it might be on the ISP side. The following code is untested and was modelled on the perldoc example.
#!/usr/bin/perl -w
use strict;
use Net::FTP;
my $ftp = Net::FTP->new("my_host", Timeout => 240, Debug => 0)
    or die "Error: $@";
$ftp->login('id', 'password')
    or die "Error: ", $ftp->message;
$ftp->binary;    # large files must go in binary mode
$ftp->cwd("/folder")
    or die "Error with cwd command: ", $ftp->message;
$ftp->put('filename to transfer', 'filename it will be called on the remote server')
    or die "Error in transfer: ", $ftp->message;
$ftp->quit;
Additional information can be found at: http://perldoc.perl.org/Net/FTP.html.
Hope this helps,
~Katie
Are you sure it's an issue with the client?
Do other binary files of a smaller size get there?
If no data is getting there (0 byte file created), then it's likely an issue with needing to use passive FTP. If no file is created on the remote host, it could be any of a number of issues (authentication, permissions), but you should've seen an error message.
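If passive mode turns out to be the culprit, Net::FTP can be told to open the data connection from the client side. A one-line sketch (the host is a placeholder):

```perl
use strict;
use warnings;
use Net::FTP;

# Force passive mode: the client opens the data channel, which gets
# through most NAT/firewall setups that break active FTP.
sub passive_ftp {
    my ($host) = @_;
    return Net::FTP->new($host, Passive => 1, Timeout => 240);
}
```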
If the file is getting partially written, it could be a few different things: disk quotas (possible); the link losing sync due to an excessive run of 0s (very unlikely, but it can happen under some signaling standards over T1s); or the connection dropping due to flow-control commands (unlikely, unless you're coming over a modem using software flow control ... but you wouldn't be pushing 1.5 GB over that).
...
Anyway, I don't think this is the problem you think it is, and I'd suggest you contact the helpdesk of your hosting provider.