PerlMonks |
Re^4: Perl solution for storage of large number of small files
by andye (Curate) on Apr 30, 2007 at 12:06 UTC ( [id://612754] )
Hi isync and salva, interesting topic.
Anyway, if you need to access 2GB of data randomly, there is probably nothing you can do to stop disk thrashing other than adding more RAM to your machine, so that all the disk sectors used for the database remain cached.

In this situation - more data than memory, but not loads more - I've found memory mapping works well. In my case the data accesses were randomly scattered but with a non-uniform distribution, if that makes sense: although the access wasn't sequential, some data was accessed more often than other data. So memory mapping meant that the often-accessed data stayed cached in RAM.

Any decent database should be able to do pretty much the same thing - as long as you configure it with a big query cache - although disk access will be slower than for memory mapping. The real problem comes if you're making a lot of changes to the data, which busts your cache...

Best wishes,
andye
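As a minimal sketch of the memory-mapping approach above: Perl's built-in `:mmap` PerlIO layer (available where perl was built with mmap support) maps the file into memory, so the OS page cache keeps the frequently-read pages resident. The fixed-width record file and the 16-byte record size here are made up purely for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical setup: write a small fixed-width "record" file so the
# example is self-contained; in practice this would be your 2GB file.
my ($out, $path) = tempfile(UNLINK => 1);
print {$out} sprintf("%-16s", "record$_") for 0 .. 99;
close $out or die "close: $!";

# Open via the :mmap layer instead of plain buffered I/O. Reads then
# go through the page cache, so hot records stay in RAM.
open my $fh, '<:mmap', $path or die "open: $!";

# Fetch record $n (16 bytes each) at a random offset.
sub read_record {
    my ($handle, $n) = @_;
    seek $handle, $n * 16, 0 or die "seek: $!";
    read $handle, my $buf, 16;
    $buf =~ s/\s+\z//;    # strip the padding
    return $buf;
}

print read_record($fh, 42), "\n";    # non-sequential access
print read_record($fh, 7),  "\n";
```

With a non-uniform access pattern, the pages holding the popular records are the ones the kernel keeps cached, which is the effect described above.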
In Section: Seekers of Perl Wisdom