http://qs321.pair.com?node_id=651111


in reply to DBD::CSV eats memory

I'm not sure what your test is meant to illustrate, or what you think memory usage has to do with speed in this case, but as the maintainer of DBD::CSV I can assure you that it was not built to handle a million rows quickly. Use SQLite or a full RDBMS for data sets of that size. That said, some big speed improvements are coming to DBD::CSV soon (a new version of its SQL engine, SQL::Statement). I'll announce it on this site when it's ready.
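
For anyone who wants to make that switch, here's a minimal sketch of loading a CSV file into SQLite through the same DBI interface. The file name (data.csv), table name, and column layout are hypothetical, just for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect to an on-disk SQLite database (created if it doesn't exist).
# AutoCommit off so the bulk load runs in a single transaction.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=data.db', '', '',
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do('CREATE TABLE IF NOT EXISTS log (id INTEGER, msg TEXT)');

# Bulk-load rows from a CSV file (hypothetical name and layout).
my $sth = $dbh->prepare('INSERT INTO log (id, msg) VALUES (?, ?)');
open my $fh, '<', 'data.csv' or die "Can't open data.csv: $!";
while ( my $line = <$fh> ) {
    chomp $line;
    my ( $id, $msg ) = split /,/, $line, 2;
    $sth->execute( $id, $msg );
}
close $fh;
$dbh->commit;

# Query it just as you would with DBD::CSV.
my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM log');
print "$count rows loaded\n";
$dbh->disconnect;
```

Note the naive split /,/ doesn't handle quoted or embedded commas; for real-world CSV, parse each line with Text::CSV_XS instead.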