Ganlron has asked for the wisdom of the Perl Monks concerning the following question:
I'm trying to create a web frontend for an FTP client. It currently works fine for any file upload under 20 MB, but at about 20.8 MB it fails. Results vary by browser: Firefox simply truncates the file and, if there's another file queued, moves on to the next one. In IE, at the 20.8 MB mark it throws a 500 server error telling me "Malformed multipart POST".
I thought it might be a limitation of HTTP, but if I set up something similar with an ASP file upload, the whole file transfers without issue.
I'm at a loss as to what may be causing it. I'm guessing it's something I've overlooked; any thoughts or suggestions as to what it might be?
Thanks in advance.
Re: File Upload
by matija (Priest) on Apr 22, 2004 at 22:04 UTC
- Check $CGI::POST_MAX: it regulates the maximum size of the upload.
- Check if your webserver has any limits in that area
- Not applicable to you, since it works with a different server, but worth mentioning anyway: if you're accessing the website through a web proxy, that may limit the maximum upload size too. I think Squid ships with a configuration file that limits uploads to around 10 MB.
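A minimal sketch of the first check, assuming CGI.pm is available; the 100 MB cap is an illustrative value, not a recommendation:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Raise the upload cap to 100 MB (the value is in bytes); -1 disables
# the limit entirely.  It must be set before the CGI object parses the
# incoming request.
$CGI::POST_MAX = 100 * 1024 * 1024;

# In a real script this would be plain CGI->new; the empty initializer
# here keeps the sketch runnable outside a web server.
my $q = CGI->new('');

if (my $err = $q->cgi_error) {
    # cgi_error() returns "413 Request entity too large" when the POST
    # body exceeds $CGI::POST_MAX
    print $q->header(-status => $err), "Upload rejected: $err\n";
} else {
    print "POST_MAX is $CGI::POST_MAX bytes\n";
}
```

If the cap is the culprit, cgi_error() is also the place to report it to the user instead of silently dropping the file.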
Re: File Upload
by sgifford (Prior) on Apr 23, 2004 at 02:05 UTC
Probably the best way to diagnose this is to make sure you have error checking after everything that can fail, and then look in your error logs to see what failed.
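A sketch of that check-everything style, self-contained so it runs without a web server; in a real CGI script the source filehandle would come from $q->upload(...), and both filenames here are hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $src = 'incoming.dat';    # stand-in for the uploaded temp file
my $dst = 'stored.dat';      # hypothetical destination

# Create a small stand-in for the uploaded file so the sketch runs.
open my $seed, '>', $src or die "open $src failed: $!";
print {$seed} 'x' x 1024   or die "write to $src failed: $!";
close $seed                or die "close $src failed: $!";

# Copy it, checking every operation that can fail and logging why.
open my $in,  '<', $src or die "open $src failed: $!";
open my $out, '>', $dst or die "open $dst failed: $!";
binmode $_ for $in, $out;

while (1) {
    my $n = read $in, my $buf, 64 * 1024;
    die "read from $src failed: $!" unless defined $n;
    last if $n == 0;    # EOF
    print {$out} $buf or die "write to $dst failed: $!";
}
close $in  or die "close $src failed: $!";
close $out or die "close $dst failed: $!";

print "copied ", -s $dst, " bytes\n";
```

The die messages end up in the server's error log, which is exactly where you'd look to see which step is failing at the 20.8 MB mark.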
Re: File Upload
by asdfgroup (Beadle) on Apr 23, 2004 at 00:47 UTC
This doesn't look like the CGI::POST_MAX limit (judging by the CGI.pm docs, that would produce error 413 "Request entity too large").
More likely it's simply a timeout in your browser :)
And to solve that you'd have to upgrade your bandwidth ;)
That seems unlikely... browsers don't time out when they're actively uploading something.
Oops, sorry. I meant an Apache timeout :).
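If the server side really is the problem, the relevant Apache directives are Timeout and LimitRequestBody; a sketch for httpd.conf, with illustrative values:

```apache
# Seconds Apache will wait on network I/O before aborting the request
Timeout 600

# Reject request bodies larger than 100 MB (0 means unlimited)
LimitRequestBody 104857600
```

A LimitRequestBody set somewhere around 20 MB would match the symptom described above more closely than a timeout would.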