http://qs321.pair.com?node_id=11125097


in reply to Memory efficient way to deal with really large arrays?

If you do not need performance, use a database.

To make a relatively transparent transition: Tie::Array::DBD. The more complex your data, the slower it is, but you can work around the memory limitations of a computer relatively easily (which was the only motivation for this module).
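A minimal sketch of what that transition looks like, based on the module's documented tie interface (the DSN and the in-memory SQLite database here are just illustrative choices; any DBI data source supported by the module should work):

```perl
#!/usr/bin/perl
use strict;
use warnings;

use Tie::Array::DBD;   # requires DBI and a DBD driver, e.g. DBD::SQLite

# Tie a plain Perl array to a database table; elements now live
# in the database instead of in process memory.
tie my @array, "Tie::Array::DBD", "dbi:SQLite:dbname=:memory:";

# From here on, use it like any other array.
push @array, map { $_ * $_ } 1 .. 1000;
print "element 10 is $array[10]\n";
print "array holds ", scalar @array, " elements\n";

untie @array;
```

For a persistent array, point the DSN at a file (e.g. `dbi:SQLite:dbname=db.tsv`) or at a Pg/mysql/MariaDB data source instead of `:memory:`.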

Here's an overview of speed differences on my machine. Higher numbers are better:

  Size  op        perl      SQLite          Pg       mysql     MariaDB         CSV
------  --  ----------  ----------  ----------  ----------  ----------  ----------
    20  rd   6666666.7    227272.7      3916.2      7538.6      9267.8      4444.4
    20  wr   6666666.7    172413.8      2842.1      1759.3      1882.7     11560.7
   200  rd   8333333.3    414078.7     12738.9      8710.4     24330.9       901.9
   200  wr   9523809.5    310559.0     12190.7      7399.7     13898.5     13556.6
   600  rd  24000000.0    371977.7     14399.2     15106.9     24207.2       346.0
   600  wr  20689655.2    376884.4     14260.9     12633.4     23992.3     14080.5
  2000  rd  13986014.0    392310.7     45244.8     23147.1     23431.8       107.1
  2000  wr  51282051.3    405515.0     18964.5     16786.4     31738.5     14729.2
 20000  rd  25380710.7    374995.3     42310.4     22139.2     22889.8           -
 20000  wr  40899795.5    390930.4     33410.4     37265.1     32508.7           -

Enjoy, Have FUN! H.Merijn