This is perl, v5.8.8 built for x86_64-linux-thread-multi
Why, when I store my 5 GB file of about 7 million two-column records, and build two hashes from two different files of the same format and size, do I run out of memory even with a large amount of RAM (50 GB)?
Assuming that your OS and Perl allow you full access to the whole 50GB, you should not be running out of memory.
On a 64-bit system, a HoA (hash of arrays) with 7 million keys and an average of 10 numbers per array requires ~3.5 GB. For two, reckon on 10 GB max.
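If the CPAN module Devel::Size is available, that estimate can be checked empirically by building a scaled-down sample and extrapolating. A minimal sketch (the key format and the 10 values per key are assumptions for illustration):

use strict;
use warnings;
use Devel::Size qw(total_size);

# Build a sample HoA: 100k keys, 10 numbers per array.
my %hoa;
for my $key ( 1 .. 100_000 ) {
    $hoa{"key_$key"} = [ map { int rand 1e6 } 1 .. 10 ];
}

# Measure the whole structure, then scale up to 7 million keys (x70).
my $bytes = total_size( \%hoa );
printf "100k keys: %.1f MB; extrapolated to 7M keys: ~%.1f GB\n",
    $bytes / 2**20, $bytes * 70 / 2**30;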
I'm not aware of any restrictions or limits on the memory a 64-bit Perl can address, which leaves your OS. Linux can apply per-process (and per-user?) limits to memory and CPU usage. I don't know offhand what the commands are for discovering this information, but maybe that is somewhere you should be looking.
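One Linux-specific way to inspect those limits, assuming procfs is mounted (the shell command ulimit -a reports the same numbers), is to read /proc/self/limits:

use strict;
use warnings;

# Print the kernel's per-process resource limits relevant to memory.
open my $fh, '<', '/proc/self/limits'
    or die "Cannot read /proc/self/limits: $!";
while ( my $line = <$fh> ) {
    print $line if $line =~ /^Limit|address space|data size|resident set/i;
}
close $fh;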
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
I have no idea, since I can access the whole memory (all 50 GB).
Do you think it has something to do with my code?
perl -e' $x = chr(0) x ( 1024**3 * 12 ) '
That will attempt to allocate a single 12GB lump of memory. If it fails, try lowering the 12 until you discover how much memory Perl can actually allocate in one chunk.
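A small sketch that automates that hunt, assuming an allocation failure kills the child perl with a nonzero exit status (on Linux it dies with "Out of memory!"):

use strict;
use warnings;

# Re-run the one-liner above in a child process with decreasing sizes;
# system() returns 0 only when the child allocated successfully.
for my $gb ( reverse 1 .. 12 ) {
    my $status = system( $^X, '-e', "\$x = chr(0) x ( 1024**3 * $gb )" );
    if ( $status == 0 ) {
        print "Largest single allocation that succeeded: ${gb}GB\n";
        last;
    }
    print "${gb}GB failed\n";
}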