Interesting ... from the CGI docs:
$CGI::POST_MAX
If set to a non-negative integer, this variable puts a
ceiling on the size of POSTings
...
An attempt to send a POST larger than $POST_MAX bytes
will cause param() to return an empty CGI parameter list.
You can test for this event by checking cgi_error().
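For reference, the documented pattern looks something like this (assuming CGI.pm is available; the 100 KB ceiling is just an example value):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Reject request bodies larger than 100 KB (example value)
$CGI::POST_MAX = 100 * 1024;

my $q = CGI->new;

# Per the docs, an oversized POST leaves param() empty and
# sets cgi_error() to "413 Request entity too large".
if (my $err = $q->cgi_error) {
    print $q->header(-status => $err);
    exit;
}
```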
However, looking at the CGI code ($CGI::VERSION = 3.05), I see the following check:
METHOD: {
    # avoid unreasonably large postings
    if (($POST_MAX > 0) && ($content_length > $POST_MAX)) {
        # quietly read and discard the post
        my $buffer;
        my $max = $content_length;
        while ($max > 0 &&
               (my $bytes = $MOD_PERL
                   ? $self->r->read($buffer, $max < 10000 ? $max : 10000)
                   : read(STDIN, $buffer, $max < 10000 ? $max : 10000)
               )) {
            $self->cgi_error("413 Request entity too large");
            last METHOD;
        }
    }
I think that's a bug in CGI.pm (anyone want to tell Lincoln?). The docs say "non-negative", but with $CGI::POST_MAX set to 0 the ($POST_MAX > 0) test is false, so the ceiling is silently disabled instead of enforced. It seems to me the line should be
if (($POST_MAX > -1 ) && ($content_length > $POST_MAX))
but then again ... setting $CGI::POST_MAX to 0 is kinda silly in a real-world context, since with the fix it would reject every POST that has a body.
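To make the difference concrete, here's a self-contained sketch of the two conditions (the helper names are mine, not CGI.pm's):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The test as shipped in CGI.pm 3.05: a POST_MAX of 0 never triggers it.
sub rejects_shipped {
    my ($post_max, $content_length) = @_;
    return ($post_max > 0) && ($content_length > $post_max);
}

# The proposed fix: a POST_MAX of 0 rejects any non-empty POST,
# while a negative value still means "no limit".
sub rejects_fixed {
    my ($post_max, $content_length) = @_;
    return ($post_max > -1) && ($content_length > $post_max);
}

# A 100-byte POST with $CGI::POST_MAX = 0:
printf "shipped: %s\n", rejects_shipped(0, 100) ? "413" : "accepted";  # accepted
printf "fixed:   %s\n", rejects_fixed(0, 100)   ? "413" : "accepted";  # 413
```

For any positive ceiling the two conditions behave identically; they only diverge at exactly 0, which is where the code contradicts the documentation.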