PerlMonks

WWW:Mechanize Error[getting Junks] fetching a Page

by Anonymous Monk
on Dec 12, 2013 at 08:45 UTC ( [id://1066777] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks, I am getting junk when I try to fetch the page below. Please can you tell me what the problem could be?

#!/usr/bin/perl -w
use strict;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();
$mech->get('https://thebigword-careers.irecruittotal.com/');
print $mech->content();
#my $filename = 'OUTPUT.html';
#$mech->save_content( $filename );
exit;

Is this a www::mech issue?

Thanks Monks

Replies are listed 'Best First'.
Re: WWW:Mechanize Error[getting Junks] fetching a Page
by Anonymous Monk on Dec 12, 2013 at 08:51 UTC

    Is this a www::mech issue?

    No, it never is

      Can you please tell me the reason...? Only Mechanize returns junk; no other browser does. I don't understand it.

        Can you please tell me the reason...? Only Mechanize returns junk; no other browser does. I don't understand it.

        Maybe

        Did you read the WWW::Mechanize documentation for the content method, for your installed version of WWW::Mechanize, to understand what it does?

        The same for the save_content method?

        Did you examine the response headers? Did you compare them to what other browsers receive?

        I believe that is what you would have to do to arrive at a satisfactory answer
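A minimal offline sketch of that diagnostic step (not posted in the thread; it uses the standard HTTP::Message behavior that WWW::Mechanize's responses inherit): build a gzip-encoded HTTP::Response locally, then compare the raw body with decoded_content, which honors the Content-Encoding header. The raw bytes look like "junk"; the decoded form is the HTML.

```perl
use strict;
use warnings;
use HTTP::Response;
use IO::Compress::Gzip qw(gzip);

# Simulate a server that gzips its body, like the site in the question.
gzip \(my $plain = "<html>hello</html>"), \my $gzipped
    or die "gzip failed";

my $res = HTTP::Response->new(
    200, 'OK',
    [ 'Content-Type' => 'text/html', 'Content-Encoding' => 'gzip' ],
    $gzipped,
);

# The raw body is binary; decoded_content undoes the Content-Encoding.
print "Content-Encoding: ", $res->header('Content-Encoding'), "\n";
print "raw length: ", length($res->content), "\n";
print $res->decoded_content, "\n";
```

With a real fetch, the same check would be $mech->response->header('Content-Encoding') versus $mech->response->decoded_content.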


Re: WWW:Mechanize Error[getting Junks] fetching a Page
by Gangabass (Vicar) on Dec 13, 2013 at 10:56 UTC
Re: WWW:Mechanize Error[getting Junks] fetching a Page
by Anonymous Monk on Dec 12, 2013 at 17:13 UTC

    I also tried Compress::Zlib qw( uncompress )... still no improvements... :(

    Can anyone please tell me the root cause?

    Is that due to gzip encoding? (www::mech won't unzip gzip?)

    Please help if you can... Thank you.
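One possible reason the Compress::Zlib attempt above failed (an assumption, since the exact code was not posted): Compress::Zlib's uncompress() expects a raw zlib stream, while a gzip-encoded HTTP body is in the gzip format, which memGunzip() from the same module handles. A small sketch of the difference:

```perl
use strict;
use warnings;
use Compress::Zlib qw(memGzip memGunzip uncompress);

my $plain   = "response body";
my $gzipped = memGzip($plain);      # gzip format, like a gzipped HTTP body

my $wrong = uncompress($gzipped);   # zlib-only; returns undef on gzip data
my $right = memGunzip($gzipped);    # understands the gzip format

print defined $wrong ? "uncompress: ok\n" : "uncompress: undef\n";
print "memGunzip: $right\n";
```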

      Can anyone please tell me the root cause?

      No, but you can maybe discover the root cause through testing, by successfully performing the diagnostic steps outlined in Re^5: WWW:Mechanize Error[getting Junks] fetching a Page

      Is that due to gzip encoding?

      Possibly

      (www::mech won't unzip gzip?)

      Will anything else unzip it?
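One hedged way to answer that concretely (a sketch, not from the thread): check whether the fetched bytes start with the gzip magic bytes, and if so, run them through IO::Uncompress::Gunzip, which ships with modern Perls. Here a locally gzipped string stands in for $mech->content().

```perl
use strict;
use warnings;
use IO::Compress::Gzip     qw(gzip);
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

# Stand-in for $mech->content() from the question.
gzip \(my $plain = "<html>ok</html>"), \my $raw or die "gzip failed";

my $html;
if (substr($raw, 0, 2) eq "\x1f\x8b") {    # gzip magic bytes
    gunzip \$raw => \$html
        or die "gunzip failed: $GunzipError";
    print $html, "\n";
} else {
    print "not gzip data\n";
}
```

If the real response passes this check, the fetched bytes are gzipped and any "junk" output is simply undecoded compressed data.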

Node Type: perlquestion [id://1066777]
Approved by Corion