PerlMonks |
Re^3: write hash to disk after memory limit — by FloydATC (Deacon)
on Mar 14, 2015 at 07:02 UTC [id://1120031]
And when his laboratory's output expands to 170 GB, is he supposed to run out and buy 10 times more RAM? Like I said, what if the data set were 10 times bigger?

I don't disagree with anything you say; it's just that having a working data set of 17 GB that simply isn't suitable for anything other than keeping it all in RAM is not unheard of in this day and age. Assuming for a moment that this isn't a problem that needs to scale to an entire datacenter, and that we're not talking about reprogramming a deep space probe launched 20 years ago, simply buying more RAM can also reduce the need for throwing man hours at the problem.

If it turns out that in this particular case the data set wasn't really 17 GB after all, but only expanded to that size as it was read into memory, that's great :-) I was merely trying to illustrate why replacing OS swapping with home-baked swapping would probably not be worth the effort.
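For reference, the usual Perl shape of such "home-baked swapping" is a disk-backed tied hash. Here is a minimal sketch using the core SDBM_File module (the filename and keys are illustrative; note that SDBM imposes a roughly 1 kB limit per key/value pair, so a real 17 GB data set would need DB_File, DBM::Deep, or a proper database instead):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;        # for O_RDWR, O_CREAT
use SDBM_File;    # core module: simple disk-backed DBM

# Tie the hash to files on disk ("bigdata.pag"/"bigdata.dir");
# stores and lookups then go through the DBM instead of living in RAM.
my %data;
tie %data, 'SDBM_File', 'bigdata', O_RDWR | O_CREAT, 0666
    or die "Cannot tie hash: $!";

$data{sample} = 'value';     # written through to disk
print "$data{sample}\n";     # read back through the tie

untie %data;
```

Every access now pays a disk (or page cache) round trip, which is exactly the trade-off being weighed above against simply letting the OS do the swapping.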
-- FloydATC
Time flies when you don't know what you're doing
In Section: Seekers of Perl Wisdom