PerlMonks
There is also Data::Stream::Bulk, which fetches the results in chunks. It has the advantage that you don't need to rewrite your SQL, but you still need to write a loop that fetches and processes the results. Also, since you mention transactions, that module will not help with transactions growing too large: it keeps the transaction open and only fetches one slice of the result at a time for processing. Personally, when doing long-running SQL work, I prefer to keep the state in the SQL itself and limit the amount of data each query returns, like this:
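The query itself appears to have been lost from this copy of the post. A minimal sketch of the pattern described, assuming a hypothetical work_queue table with status and last_updated columns (all names made up for illustration):

```sql
-- Process one bounded batch per statement; rerun the statement
-- until it affects zero rows. Table/column names are hypothetical.
UPDATE work_queue
   SET status = 'processed'
 WHERE id IN (
       SELECT TOP 1000 id
         FROM work_queue
        WHERE status = 'pending'
        ORDER BY last_updated ASC   -- oldest rows first
       );
```

Because each statement touches at most 1000 rows, each transaction stays small, and no client-side cursor state has to survive between batches.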
Depending on the sort criteria, I like to order by some timestamp, so that I update either the oldest or the newest rows first. Depending on your flavour of SQL, TOP 1000 needs to be replaced by a LIMIT 1000 clause at the end of the query.

In reply to Re: Posgres batch read with DBI?
by Corion