
Re^2: WWW:Mechanize Error[getting Junks] fetching a Page

by Anonymous Monk
on Dec 13, 2013 at 11:36 UTC


in reply to Re: WWW:Mechanize Error[getting Junks] fetching a Page
in thread WWW:Mechanize Error[getting Junks] fetching a Page

Obviously you need WWW::Mechanize::GZip

Did you try it? Did it work for you? WWW::Mechanize itself has had the same gzip handling (see https://metacpan.org/changes/distribution/WWW-Mechanize) since about the time that module was released ... which is why you don't need it
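
A minimal sketch of what that means in practice, assuming a reasonably recent WWW::Mechanize and a placeholder URL: plain WWW::Mechanize already hands you the decompressed body, because it sits on top of LWP and HTTP::Response->decoded_content.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Placeholder URL; substitute the page you are actually fetching.
    my $url = 'http://example.com/';

    my $mech = WWW::Mechanize->new( autocheck => 1 );
    $mech->get($url);

    # content() returns the decoded (un-gzipped) body on any recent
    # WWW::Mechanize, since it is populated from decoded_content.
    print $mech->content;

    # If an older installation hands back raw gzipped bytes, decode the
    # response explicitly instead:
    # print $mech->response->decoded_content;

If that prints readable HTML, the "junk" you were seeing was just the compressed response body, and no extra module is needed.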

