
Re: Reading huge file content

by KurtSchwind (Chaplain)
on Dec 06, 2007 at 14:00 UTC ( #655405=note )

in reply to Reading huge file content

As with others, I'm going to have to ask 'why'. We need the 'why' to give you a good solution.

If we take the premise that you 'just need to', then the solution is to get a machine with 10 GB of memory and then you won't get that error. Running out of system memory isn't Perl-specific: you'd hit the same error in C or any other language. You've hit a physical limit.

If it turns out that you really don't NEED to have it all slammed into memory at once and passed around, you have other options, as people have pointed out. Passing file handles is a good solution. Another good solution is to use Sys::Mmap to memory-map your file. You can also use PerlIO with the :mmap layer. At any rate, give us some more info and we can provide a solution to the problem, even if it's one you won't like. :)
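A minimal sketch of the two approaches named above (the `count_lines` subroutine and the temp-file setup are illustrative, not from the thread; the `:mmap` open assumes a perl built with mmap support):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# 1) Pass a filehandle around instead of the file's contents,
#    so the whole file never has to sit in memory at once.
sub count_lines {
    my ($fh) = @_;
    my $n = 0;
    $n++ while <$fh>;
    return $n;
}

# Demo on an in-memory "file": a scalar reference opened as a handle.
open my $in, '<', \"alpha\nbeta\ngamma\n" or die "open: $!";
print count_lines($in), "\n";    # prints 3

# 2) Let PerlIO memory-map the file with the :mmap layer.
#    A small temp file stands in for the "huge" one here.
my ($tmp, $path) = tempfile(UNLINK => 1);
print {$tmp} "line $_\n" for 1 .. 5;
close $tmp or die "close: $!";

open my $mapped, '<:mmap', $path or die "open: $!";
print count_lines($mapped), "\n";    # prints 5
```

Either way, the code that consumes the data only ever sees one line at a time, which is what keeps the memory footprint flat.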

I used to drive a Heisenbergmobile, but every time I looked at the speedometer, I got lost.
