Re: Gargantuan memory consumption issues

by Corion (Patriarch)
on Dec 22, 2008 at 23:12 UTC [id://732216]


in reply to Gargantuan memory consumption issues

You might be unaware that WWW::Mechanize keeps a history of every page it visits. If your script never exits and keeps a single WWW::Mechanize object around, that history accumulates indefinitely and consumes more and more memory (a sketch of the failure mode follows below).

Of course, without seeing code, it's hard to tell.
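A minimal sketch of that failure mode, assuming a long-running polling loop (the URL and sleep interval are placeholders, not from the original post):

    use strict;
    use warnings;
    use WWW::Mechanize;

    # One long-lived object: by default every successful get() pushes
    # the fetched page onto an internal history stack, so memory grows
    # on every iteration of a loop that never ends.
    my $mech = WWW::Mechanize->new();

    while (1) {
        $mech->get('http://example.com/status');  # placeholder URL
        # ... process $mech->content() here ...
        sleep 60;
    }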


Replies are listed 'Best First'.
Re^2: Gargantuan memory consumption issues
by runrig (Abbot) on Dec 22, 2008 at 23:53 UTC
    This is the likely culprit. Either set stack_depth() on the Mech object, or periodically destroy the object and create a new one (both are shown in the sketch after these replies).
      Thank you, this was indeed the problem.
Re^2: Gargantuan memory consumption issues
by p2409 (Initiate) on Dec 23, 2008 at 07:04 UTC
    +1 on that one. Mechanize is great, but a hog.
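
Both remedies suggested above are small changes; a hedged sketch (stack_depth is documented in WWW::Mechanize, the surrounding code is illustrative):

    use WWW::Mechanize;

    # Remedy 1: cap the history. A stack_depth of 0 keeps no history
    # at all; it can be set in the constructor or on a live object.
    my $mech = WWW::Mechanize->new( stack_depth => 0 );
    $mech->stack_depth(0);    # same effect, after construction

    # Remedy 2: periodically discard the object and start fresh,
    # releasing everything the old one accumulated.
    $mech = WWW::Mechanize->new();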
