Getting HTTP 400 Bad Request after a few page reloads

by Justudo (Novice)
on Oct 18, 2002 at 08:09 UTC ( [id://206241] )

Justudo has asked for the wisdom of the Perl Monks concerning the following question:

Hi monastery mates, I was coding a simple search engine where about 10 results are displayed per page. When the user moves to subsequent pages of the same search string, a cookie is used instead of repeating the whole search. But after a few different search strings have been entered, the page goes to Bad Request (HTTP 400). For that session, all pages from the server then return the 400 error.
my $start_dir     = $q->param("dir");
my @search_string = split / /, $q->param("search_string");
my $search_type   = $q->param("search_type");
my $start_num     = $q->param("start_number");
if ( ! defined($start_num) ) {
    $start_num = 0;
}

@search_string = getWords(@search_string);
my $search_url = $c{url}{find_cgi}
    . "?dir=$start_dir&search_type=$search_type&search_string="
    . fmtSS(@search_string);

# The cookie name is derived from the raw search string and directory.
my $cookie_name = $q->param("search_string") . "_dir_" . $start_dir;
my $cookie_data = $q->cookie($cookie_name);

if ( defined($cookie_data) ) {
    # Repeat visit: thaw the word-score hash stored in the cookie.
    my %cookie_hash   = %{ thaw $cookie_data };
    my @search_result = computeRank($search_type, %cookie_hash);
    printScore($start_num, $search_url, $c{search_list_length}, @search_result);
    exit;
}

my %word_score = getWordScore(...........);
if ( %word_score ) {
    my @search_result = computeRank($search_type, %word_score);
    # Freeze the whole hash and hand it back to the browser as a cookie.
    $cookie_data = freeze \%word_score;
    my $search_cookie = $q->cookie(-name => $cookie_name, -value => $cookie_data);
    printScore($start_num, $search_url, $c{search_list_length}, @search_result);
} else {
    print "<p>No results found";
}
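Given that the failure only shows up after several different searches, the cookies themselves are a likely culprit: every search string gets its own cookie holding a frozen copy of %word_score, the browser sends all of them back on every request, and once the Cookie header outgrows the server's limits the whole session answers 400. One way around it is to keep the data server-side and put only a short key in the cookie. Here is a minimal sketch under that assumption; the cache directory, the search_key cookie name, and the MD5 key are illustrative choices, not part of the original code:

use strict;
use warnings;
use CGI;
use Storable qw(store retrieve);
use Digest::MD5 qw(md5_hex);

my $q         = CGI->new;
my $cache_dir = '/tmp/search_cache';          # assumed location
-d $cache_dir or mkdir $cache_dir;
my $key = $q->cookie('search_key');

if ( defined $key && $key =~ /\A[0-9a-f]{32}\z/ && -e "$cache_dir/$key" ) {
    # Repeat visit: restore the word-score hash stored on the first request.
    my $word_score = retrieve("$cache_dir/$key");
    # ... computeRank(...) / printScore(...) as in the original code ...
}
else {
    my %word_score = ();   # result of the real search, e.g. getWordScore(...)
    $key = md5_hex( $q->param('search_string') . '_dir_' . $q->param('dir') );
    store( \%word_score, "$cache_dir/$key" );  # the large data stays on the server
    my $cookie = $q->cookie( -name => 'search_key', -value => $key );
    print $q->header( -cookie => $cookie );    # only the short key travels
    # ... computeRank(...) / printScore(...) as in the original code ...
}

A session module such as CGI::Session or Apache::Session would do the same job with less hand-rolling.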

Replies are listed 'Best First'.
Re: Getting HTTP 400 Bad Request after a few page reloads
by zakzebrowski (Curate) on Nov 18, 2003 at 19:14 UTC
    You may not need this now, but if you haven't figured it out... I got a 400 Bad Request response when my Referer field was very long. (It was a GET request to a form that was very long.) Changing the form from a GET request to a POST got it to work like a charm... :)
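    In CGI.pm terms that switch is a one-argument change; a hedged sketch (the field name and target script here are made up for illustration):

    use CGI qw(:standard);

    # With method GET the whole query string rides in the URL and reappears in
    # the Referer header of the next request; past the server's header-size
    # limit that earns a 400.  POST moves the data into the request body.
    print start_form( -method => 'POST', -action => 'find.cgi' ),   # was 'GET'
          textfield( -name => 'search_string' ),
          submit( -value => 'Search' ),
          end_form;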


    ----
    Zak
Re: Getting HTTP 400 Bad Request after a few page reloads
by beebware (Pilgrim) on Oct 19, 2002 at 10:52 UTC
    Which server software are you using? Is there anything in the error log at all?
      I'm using this on the Apache web server. There are no errors in the web server logs, though. Any clue where else to start? I have a feeling it's something to do with 'freeze' and 'thaw'.
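      That hunch is easy to test by measuring how big the cookie value actually gets. A rough sketch with placeholder data: browsers cap a single cookie at about 4 KB, and Apache answers 400 once any request header line exceeds LimitRequestFieldSize (8190 bytes by default), so a frozen and URL-escaped hash can pass both limits after only a few searches.

      use strict;
      use warnings;
      use Storable qw(freeze);
      use CGI::Util qw(escape);

      # Placeholder data standing in for the real word-score hash.
      my %word_score = map { "word$_" => rand() } 1 .. 200;

      my $raw    = freeze(\%word_score);   # binary Storable image
      my $cooked = escape($raw);           # CGI::Cookie URL-escapes the value,
                                           # which can roughly triple binary data
      printf "frozen: %d bytes, escaped: %d bytes\n", length($raw), length($cooked);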
