Re: Loading Large files eats away Memory

by dws (Chancellor)
on May 26, 2005 at 06:51 UTC ( #460521=note )


in reply to Loading Large files eats away Memory

Well, yes. Loading a big file into memory takes memory. Note that the way you're loading the file,

@l_Data = <FILE>;
is going to incur overhead for each line and for the array that holds references to all of the lines. Depending on your line lengths, that overhead could be substantial.
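To make the per-line cost concrete, here is a minimal, self-contained sketch (using a small temporary file as a hypothetical stand-in for your 25MB file): reading via `<$fh>` in list context allocates one scalar per line, plus the array that holds them all.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# A small stand-in for the large file (hypothetical data).
my ($fh, $tmp) = tempfile();
print $fh "line $_\n" for 1 .. 1000;
close $fh;

open my $in, '<', $tmp or die "open: $!";
my @l_Data = <$in>;          # one scalar allocated per line,
close $in;                   # plus the array slots that hold them

print scalar(@l_Data), " scalars allocated\n";   # prints "1000 scalars allocated"
```

Each of those thousand scalars carries its own bookkeeping (scalar header, string buffer, array slot), which is the overhead being described.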

Compare that to

local $/; $data = <FILE>;
which slurps the entire file into a single string. It'll still take a fair chunk of memory, with (possibly significantly) less overhead.
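A fleshed-out version of the slurp idiom, with `local $/;` confined to a block so the change to the input record separator can't leak into surrounding code (again using a hypothetical temporary file in place of the real one):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical stand-in for the large file.
my ($fh, $tmp) = tempfile();
print $fh "line $_\n" for 1 .. 1000;
close $fh;

my $data;
{
    open my $in, '<', $tmp or die "open: $!";
    local $/;                # undef $/ only inside this block
    $data = <$in>;           # whole file arrives as one string
    close $in;
}
print length($data), " bytes in one scalar\n";
```

One string means one scalar header and one buffer, rather than one of each per line.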

However, if your process involves morphing the text of the existing file, you might be better off holding an array of lines, since the incremental cost of making a copy of a line is considerably less than making a copy of a 25MB string.

What can you say about your processing needs? Perhaps there's a better way yet.
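If the processing can be done a line at a time, one such "better way" is to never hold the whole file at all: stream it through a `while` loop, so only the current line is in memory. A sketch, with a hypothetical uppercasing transformation standing in for the real work:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical stand-in for the large file.
my ($fh, $tmp) = tempfile();
print $fh "line $_\n" for 1 .. 1000;
close $fh;

open my $in, '<', $tmp or die "open: $!";
my $count = 0;
while (my $line = <$in>) {   # only one line in memory at a time
    $line = uc $line;        # hypothetical per-line transformation
    $count++;                # ...write $line out, tally, etc.
}
close $in;
print "processed $count lines\n";
```

Memory use stays flat regardless of file size, at the cost of not being able to look backward or forward in the data.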
