Slightly off-topic, but: where are $run_id, $start_account and $end_account coming from? Should you perhaps be using placeholders in the query statement, and passing those variables to the $sth->execute call?
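A minimal sketch of what that would look like, using an in-memory DBD::SQLite handle as a stand-in for the OP's MSSQL connection (the table name `results` and the sample data are invented for illustration; the three variables are the ones from the original post):

```perl
use strict;
use warnings;
use DBI;

# DBD::SQLite ':memory:' stands in for the real MSSQL connection here.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1 } );
$dbh->do('CREATE TABLE results (run_id INTEGER, account INTEGER, amount REAL)');
$dbh->do('INSERT INTO results VALUES (7, 100, 1.5), (7, 200, 2.5), (8, 100, 9.9)');

my ( $run_id, $start_account, $end_account ) = ( 7, 50, 150 );

# Placeholders (?) instead of interpolating the variables into the SQL
# string: the driver quotes the values for you, and the statement can
# be prepared once and executed many times.
my $sth = $dbh->prepare(
    'SELECT account, amount FROM results
     WHERE run_id = ? AND account BETWEEN ? AND ?'
);
$sth->execute( $run_id, $start_account, $end_account );

while ( my ( $account, $amount ) = $sth->fetchrow_array ) {
    print "$account\t$amount\n";    # prints "100\t1.5"
}
```

Besides avoiding quoting bugs, this also protects you if any of those variables ever come from user input.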
More on-topic: the question becomes "what do you need to do with these millions of rows?" Wouldn't it be possible to just handle each row as it comes, and be done with it before fetching the next row? Rather than pushing all the rows into a single massive HoA structure, do what needs to be done with each row and forget about keeping the row data in a structure. If you think there's some reason why all the rows need to be in a single structure, there's bound to be a way to restructure the process so that you only have to deal with a limited set of rows in memory at any one time.

Apart from that, your other choice is to use one of the DBM modules to tie your hash structure to a disk file. This seems kind of weird, because you end up replicating your MSSQL database in a DBM file. But if it gets the job done... (see AnyDBM_File and the various flavors of DBM modules cited there).

Finally, a slightly irrelevant nit-pick: you don't need to do this:

It turns out that your next line of code knows how to take care of the details for creating an array ref as the hash value when necessary, without further ado:
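The nit-pick is about Perl's autovivification; the original code snippets didn't survive here, but the point can be shown with a hypothetical hash and key:

```perl
use strict;
use warnings;

my %hash;

# No need for:  $hash{apple} = [] unless exists $hash{apple};
# Dereferencing $hash{apple} as an array in the push autovivifies
# the array ref the first time the key is seen.
push @{ $hash{apple} }, 1, 2;
push @{ $hash{apple} }, 3;

print scalar @{ $hash{apple} }, "\n";    # prints 3
```

So the explicit "initialize the hash value to an empty array ref" step can simply be deleted; the push line alone does the right thing.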
In reply to Re: Script die without warning while getting result from DBI
by graff