large data structures
by genehack (Beadle) on Apr 28, 2000 at 08:29 UTC
genehack has asked for the wisdom of the Perl Monks concerning the following question:
I'm struggling with implementing a large two-level data structure, and would like to get some input from the mass mind.
Conceptually the structure is a hash of arrays -- about 7000 different arrays, each with 65536 (4**8) elements. Figuring 1 byte per element gives a total size of 458,752,000 bytes -- so that's not going to work. (And, yes, I do need random access at the upper level of the structure, so working on one array at a time isn't going to cut it.)
The arrays are pretty sparsely populated -- maybe ~25% filled on average -- and actually map to strings rather than numbers, so I considered using a hash of hashes... but I also need this to be persistent (i.e., written out to disk), and according to the *DBM_File docs, multi-level structures are a no-go.
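To show what the docs mean: a plain *DBM_File tie can only store flat strings, so a nested reference gets stringified on the way in and the data is lost. A minimal sketch (the file name and keys here are arbitrary examples):

```perl
use strict;
use warnings;
use SDBM_File;
use Fcntl;

my %db;
tie %db, 'SDBM_File', "/tmp/demo_db_$$", O_RDWR | O_CREAT, 0644
    or die "Couldn't tie SDBM file: $!";

# Try to store a second-level hash directly...
$db{AAAA} = { 12 => 'ACGT' };

# ...but the DBM layer stringifies the reference, so what comes back
# is the text "HASH(0x...)", not the nested data.
my $got = $db{AAAA};
print "fetched: $got\n";

untie %db;
```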
So, at this point, I'm looking at using pack to scrunch the second-level hashes into blobs, which can then be unpacked on the fly as they are accessed. Of course, that means I've got to keep a second structure telling me how many key/value pairs are in each packed hash (because each will differ)...
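One way to sidestep that bookkeeping: if each pair is packed as a 16-bit index plus a length-prefixed string, unpacking simply runs until the blob is exhausted, so no separate pair count is needed. A sketch, assuming the indices fit in 16 bits and a perl whose unpack supports `()` group templates:

```perl
use strict;
use warnings;

# Pack a sparse { index => string } hash into one blob.  Each pair is
# a 16-bit index ('n') followed by a length-prefixed string ('n/a*'),
# so the blob is self-describing -- no pair count required.
sub pack_sparse {
    my ($h) = @_;
    my $blob = '';
    for my $idx (sort { $a <=> $b } keys %$h) {
        $blob .= pack 'n n/a*', $idx, $h->{$idx};
    }
    return $blob;
}

sub unpack_sparse {
    my ($blob) = @_;
    # The (...)* group repeats until the blob runs out, yielding a
    # flat index/string list that drops straight back into a hash.
    my %h = unpack '(n n/a*)*', $blob;
    return \%h;
}

my %sparse = ( 12 => 'ACGT', 40000 => 'TTGA' );
my $blob   = pack_sparse(\%sparse);
my $again  = unpack_sparse($blob);
print "$again->{40000}\n";
```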
Someone on c.l.p.misc suggested MLDBM, which looks like it might be a solution -- but the inability to directly modify the structure could end up being a PITA in the long run.
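For the record, the MLDBM gotcha looks like this: each fetch thaws a fresh copy of the stored structure, so modifying a nested element in place is silently lost, and every update has to go fetch/modify/store. A sketch (assumes MLDBM and Storable are installed; the file name is arbitrary):

```perl
use strict;
use warnings;
use MLDBM qw(SDBM_File Storable);   # serialize values via Storable
use Fcntl;

my %db;
tie %db, 'MLDBM', "/tmp/mldbm_demo_$$", O_RDWR | O_CREAT, 0644
    or die "Couldn't tie MLDBM: $!";

$db{AAAA} = { 12 => 'ACGT' };

# The PITA: this modifies a throwaway copy -- MLDBM thaws a fresh
# structure on every fetch, so the change never reaches disk.
$db{AAAA}{40000} = 'TTGA';

# The workaround: fetch the whole thing, modify it, store it back.
my $tmp = $db{AAAA};
$tmp->{40000} = 'TTGA';
$db{AAAA} = $tmp;

my $final = $db{AAAA};
untie %db;
```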
So, here I am. Can the Monks come up with a more flexible idea than the c.l.p.misc group?