PerlMonks  

Re: Loading Large files eats away Memory

by dws (Chancellor)
on May 26, 2005 at 06:51 UTC [id://460521]


in reply to Loading Large files eats away Memory

Well, yes. Loading a big file into memory takes memory. Note that the way you're loading the file,

@l_Data = <FILE>;
is going to incur overhead for each line's scalar, plus the array that holds them all. Depending on your line lengths, that overhead could be substantial.
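That per-line overhead can be measured directly. A minimal sketch, assuming the CPAN module Devel::Size is installed (it is not core); the 10,000 synthetic 61-byte "lines" are just a stand-in for real file data:

```perl
use strict;
use warnings;
use Devel::Size qw(total_size);   # assumption: Devel::Size from CPAN

# Same bytes two ways: an array of 10,000 short lines vs. one big scalar.
my @lines  = ("x" x 60 . "\n") x 10_000;
my $string = join '', @lines;

printf "array of 10,000 lines: %d bytes\n", total_size(\@lines);
printf "one string:            %d bytes\n", total_size(\$string);

# Each element pays for its own scalar header (plus the array's pointer
# to it), so the array total comes out well above the string total.
```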

Compare that to

my $data = do { local $/; <FILE> };
which slurps the entire file into a single string. It'll still take a fair chunk of memory, with (possibly significantly) less overhead.
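Wrapping the `local $/` in a `do` block keeps the change to the input record separator from leaking into surrounding code. A minimal runnable sketch; it writes a small throwaway file via File::Temp purely so the example is self-contained:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Throwaway input file, just so the sketch runs as-is.
my ($fh, $path) = tempfile(UNLINK => 1);
print {$fh} "first line\nsecond line\n";
close $fh;

open my $in, '<', $path or die "open $path: $!";
my $data = do { local $/; <$in> };   # undef $/ => read the whole file at once
close $in;

print length($data), " bytes slurped\n";
# After the do-block, $/ is back to its normal "\n".
```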

However, if your process involves morphing the text of the existing file, you might be better off holding an array of lines, since the incremental cost of copying a line is considerably less than copying a 25 MB string.

What can you say about your processing needs? Perhaps there's a better way yet.
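For instance, if each line can be transformed independently, you don't need to hold the file in memory at all: read a line, process it, write it out. A minimal sketch of that streaming pattern; the temp files and the uppercasing transform are placeholders for real input and real processing:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Throwaway input file so the sketch is self-contained.
my ($fh, $in_path) = tempfile(UNLINK => 1);
print {$fh} "alpha\nbeta\n";
close $fh;

my ($outfh, $out_path) = tempfile(UNLINK => 1);
open my $in, '<', $in_path or die "open $in_path: $!";
while (my $line = <$in>) {       # one line in memory at a time
    $line = uc $line;            # placeholder per-line transformation
    print {$outfh} $line;
}
close $in;
close $outfh;
# Memory stays flat no matter how large the input file grows.
```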
