Was the cell table composed of 100,000 or 1,000,000 rows? If 100,000, then most assuredly keep it in memory as a hash (i.e., read the whole thing only once). That might not be a bad idea even if it's a million. And obviously do the same with the tiny lc and dist tables; a sketch of the read-once hash follows below.

Consider storing the data in each table as a blob: pack it first, then store the packed entity as the blob (also sketched below). IIRC, SQLite keeps data as strings, so it's more voluminous than the actual data, and then you pay the cost of all those atoi() calls to decode it.

If your data can be organized by lat-long, consider a small database for each lat-long region. Also, consider the CPAN module Cache::Memcached (a last sketch below).
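A minimal sketch of the read-once idea, assuming DBI with DBD::SQLite and an invented cell(id, lat, lon, value) schema; the real file and column names aren't shown in the thread:

    use strict;
    use warnings;
    use DBI;

    # Hypothetical database file and schema.
    my $dbh = DBI->connect("dbi:SQLite:dbname=cells.db", "", "",
                           { RaiseError => 1 });

    # Read the whole table exactly once, keyed by id. Every later
    # lookup is a hash access instead of a per-row SELECT.
    my %cell;
    my $sth = $dbh->prepare("SELECT id, lat, lon, value FROM cell");
    $sth->execute;
    while (my $row = $sth->fetchrow_arrayref) {
        my ($id, @rest) = @$row;    # copy: fetchrow_arrayref reuses its buffer
        $cell{$id} = [@rest];
    }

    # Later, in the inner loop, e.g. for id 42:
    my $rec = $cell{42};            # no database round trip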
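And a sketch of the pack-then-blob idea, assuming (purely for illustration) that each row is an integer id plus three doubles:

    use strict;
    use warnings;

    my @row = (42, 51.5074, -0.1278, 3.14);

    # Pack into a compact binary string: one 32-bit integer plus
    # three doubles = 28 bytes, versus their much longer text form.
    my $blob = pack('l d3', @row);

    # Store $blob in a BLOB column; when it comes back, one unpack
    # replaces a string split plus several atoi()-style conversions.
    my @decoded = unpack('l d3', $blob);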
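For Cache::Memcached, a minimal sketch assuming a memcached daemon on localhost:11211 and an invented key scheme:

    use strict;
    use warnings;
    use Cache::Memcached;

    my $memd = Cache::Memcached->new({ servers => ['127.0.0.1:11211'] });

    # Cache a row (references are serialized for you); expire in an hour.
    $memd->set('cell:42', { lat => 51.5074, lon => -0.1278 }, 3600);

    # Later lookups hit the cache instead of the database.
    my $rec = $memd->get('cell:42');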
In reply to Re^3: speeding up row by row lookup in a large db
by xcramps