emilford has asked for the wisdom of the Perl Monks concerning the following question:

I was wondering what the best solution for uploading larger files is. I have a script that is capable of uploading smaller sized files, but when I try to upload a 1.8MB pdf file, it times out and everything screws up. Here is the code that I am using to upload the file:
    open (LOCAL, ">$uploadLocation/$schemedName")
        or dieNicely("Unable to upload: $!");
    while (read($file, $i, 1024)) {
        $size += 1024;
        print LOCAL $i;
    }
    close (LOCAL);
Again, this works nicely with smaller files, but I'm not sure how to get it to work nicely with larger files. Thanks for the help.

Replies are listed 'Best First'.
Re: large file uploads - timeout
by perlplexer (Hermit) on Mar 22, 2003 at 17:34 UTC
    Are you positive that timeouts are to blame?
    When you say "a script that is capable of uploading smaller sized files", do you mean to say that you tested it on small text files and it worked?
    If so, then the problem is probably related to you not binmode()ing your filehandle...
    Try doing this:
    open (LOCAL, ">$uploadLocation/$schemedName")
        or dieNicely("Unable to upload: $!");
    binmode LOCAL;
    # ... rest of your code


    And before I forget... make sure you binmode() both filehandles, the one that you opened for writing and the one that you're reading from; i.e., LOCAL and $file. The open() statement for $file is not shown in your code; make sure you call binmode() right after that open()...
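For reference, here is a minimal, self-contained sketch of the suggestion above. The sub name save_upload is mine, and in the real script the read handle would come from the upload machinery; the key points are binmode() on both handles and counting the bytes read() actually returns (the OP's `$size += 1024` overcounts on the final short chunk):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch: copy an already-open read handle to a destination path in
# binary-safe 1K chunks. save_upload() is a made-up name for illustration.
sub save_upload {
    my ($in, $dest) = @_;
    open(my $out, '>', $dest) or die "Unable to upload: $!";
    binmode($in);     # source handle (the upload filehandle)
    binmode($out);    # destination file
    my ($buf, $size) = ('', 0);
    while (my $len = read($in, $buf, 1024)) {
        $size += $len;        # count actual bytes read, not a flat 1024
        print $out $buf;
    }
    close($out) or die "close: $!";
    return $size;
}
```

Without the two binmode() calls, binary data such as a PDF gets mangled on platforms that translate line endings; with them, the same loop works for text and binary alike.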
      I thought I remembered reading somewhere that binmode was only needed on Windows systems, not Unix ones. I have tested the script on smaller (< 70KB) txt, doc, and pdf files and everything works fine; the error only appears when I try to upload the 1.8MB pdf file. I'll try your suggestion. Thanks.

        With the new input disciplines, utf8 and whatnot in Perl 5.8 even Unixers should get used to binmode(). (And I should reread the docs to be able to tell what exactly happens if they don't.)

        Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
           -- Rick Osborne


Re: large file uploads - timeout
by dakkar (Hermit) on Mar 22, 2003 at 16:29 UTC

    Ehm... what about giving some details?

    Like, are you talking about a CGI script? Are you using CGI.pm?

    Or is it a daemon, and you're just reading from a socket?

            dakkar - Mobilis in mobile
      It is a CGI script. I am using CGI.pm to get the file handle ($file).

        Are you sure you are not hitting the CGI::POST_MAX limit? Have you checked cgi_error()? Have you tried dumping STDIN to STDOUT (i.e., echoing your uploaded file back to the browser) to see what is happening? CGI::Simple has much better file upload error reporting than CGI.pm and uses the same interface, so it may be an easy way to diagnose the problem.
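A minimal sketch of the POST_MAX / cgi_error() check suggested above, assuming CGI.pm; the field name 'file' and the 10MB limit are placeholders, not values from the OP's script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Cap the POST body size up front; requests larger than this are discarded
# by CGI.pm and reported via cgi_error() instead of failing silently.
$CGI::POST_MAX = 10 * 1024 * 1024;   # 10MB -- placeholder limit

sub get_upload_fh {
    my $q = CGI->new;
    if (my $err = $q->cgi_error) {
        # e.g. "413 Request entity too large" when POST_MAX is exceeded
        print $q->header(-status => $err), "Upload failed: $err\n";
        return;
    }
    return $q->upload('file');   # 'file' is a placeholder field name
}
```

If the 1.8MB upload were tripping POST_MAX, cgi_error() would say so immediately, which rules that cause in or out in one run.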




Re: large file uploads - timeout
by emilford (Friar) on Mar 22, 2003 at 17:45 UTC
    BTW, here is the error message that I get:

    Could not open the page "..." after trying for 60 seconds.

      Are you using Safari as your browser? (the form of the error message seems to imply so)

      I'd say to try with another browser. Say, lynx from the command line, or IE, or somesuch.

      It could be (warning: WAG) that Safari is starting its timeout timer before having sent the whole file, and that more than 60 secs are needed to transfer that big a file.

      Try with another browser, to get a different timeout detection, or from a machine on the same LAN as the server, to get a faster transfer.

      WAG: Wild Assed Guess...

              dakkar - Mobilis in mobile
        Yeah, I am using Safari as my browser. I just tried it with IE (Mac 5.2) and it worked with no problem. It took a while to execute, but never crashed and the file is uploaded on my server. Is there anything I can do? I'd like the script to work with the majority of browsers, but if it's a local timeout setting that is causing the problem, I'm not sure if there is a solution.

      Is there any chance your web server is set to kill the CGIs if they run for more than 60s? If so you'll probably have to increase the limit.
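For what it's worth, if the server were Apache, the relevant knob would look something like this (a sketch: Timeout is a real Apache directive, but the thread never confirms the server software or its current setting):

```apache
# httpd.conf -- illustrative value, not the server's actual setting.
# Timeout governs how long Apache waits on network I/O (including a
# slowly-arriving POST body) before aborting the request.
Timeout 300
```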

      Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
         -- Rick Osborne


        I don't think the problem is on the server side, but rather a local issue. Besides, I unfortunately have no control over server settings.