in reply to DBD::CSV eats memory
I'm not sure what your test is supposed to illustrate, or what you think memory has to do with speed in this case, but as the maintainer of DBD::CSV I can assure you that it was not built to handle a million rows quickly. Use SQLite or a full RDBMS for data sets of that size. That said, there are some big speed improvements coming to DBD::CSV soon (a new version of its SQL engine, SQL::Statement). I'll announce it on this site when it's ready.
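For anyone who wants to follow that advice, here is a minimal sketch of the switch: load the CSV into SQLite once, then query it there. It assumes DBD::SQLite is installed and a hypothetical `data.csv` with two columns; the table and column names are made up for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect to (or create) a SQLite database file.
my $dbh = DBI->connect('dbi:SQLite:dbname=data.db', '', '',
                       { RaiseError => 1, AutoCommit => 0 });

$dbh->do('CREATE TABLE IF NOT EXISTS big (id INTEGER, name TEXT)');

# Load the CSV. The naive split below is only for illustration;
# use Text::CSV for real-world quoting and embedded commas.
open my $csv, '<', 'data.csv' or die "data.csv: $!";
my $sth = $dbh->prepare('INSERT INTO big (id, name) VALUES (?, ?)');
while (my $line = <$csv>) {
    chomp $line;
    my ($id, $name) = split /,/, $line, 2;
    $sth->execute($id, $name);
}
close $csv;
$dbh->commit;   # one transaction for the whole load keeps it fast

# SQLite can now answer queries without slurping the file into RAM.
my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM big');
print "$count rows loaded\n";
$dbh->disconnect;
```

Doing the insert loop inside a single transaction matters: with AutoCommit on, SQLite syncs to disk per row, which is what makes naive million-row loads slow.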
Replies:
Re^2: DBD::CSV eats memory by perlmonkdr (Beadle) on Nov 16, 2007 at 14:22 UTC
Re^3: DBD::CSV eats memory by jZed (Prior) on Nov 16, 2007 at 18:02 UTC
Re^4: DBD::CSV eats memory by perlmonkdr (Beadle) on Nov 19, 2007 at 23:28 UTC
In Section: Seekers of Perl Wisdom