aossama has asked for the wisdom of the Perl Monks concerning the following question:

Hello Monks,

I am in charge of refactoring a script that iterates over the result set of a SELECT query against a database, which returns ~1 trillion records; it restructures the items into a certain pattern and produces CSV files. The script works fine, but it takes ~16 days to finish, and I need to decrease its run time. These were the thoughts I had: I would appreciate any advice on any of the three topics: which modules to use, any past experience with such a huge number of records, or any other way to optimize the script's performance. Thanks in advance.
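For context, here is a minimal sketch of the kind of script I mean, assuming DBI plus Text::CSV; the DSN, table, and column names are placeholders, not my real schema. The key point is that it streams rows one at a time rather than slurping everything into an array first:

```perl
use strict;
use warnings;
use DBI;
use Text::CSV;

# Placeholder DSN and credentials -- substitute your own.
my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
    { RaiseError => 1, AutoCommit => 0 } );

my $csv = Text::CSV->new( { binary => 1, eol => "\n" } );
open my $fh, '>', 'out.csv' or die "out.csv: $!";

my $sth = $dbh->prepare('SELECT col_a, col_b, col_c FROM big_table');
$sth->execute;

# Fetch one row at a time instead of building a ~1 trillion element
# array in memory; restructure each row, then write it out immediately.
while ( my $row = $sth->fetchrow_arrayref ) {
    # ... restructure @$row into the required pattern here ...
    $csv->print( $fh, $row );
}

close $fh or die "out.csv: $!";
$dbh->disconnect;
```

This is only an outline of the current approach, not the real code, but it should make clear where the time is going.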