http://qs321.pair.com?node_id=11127509


in reply to Postgres batch read with DBI?

There is also Data::Stream::Bulk, which fetches the results in chunks. It has the advantage that you don't need to rewrite your SQL, but you still need to write a loop that fetches and processes each chunk. Also, since you mention transactions, that module will not help with transactions growing too large: it keeps the transaction open and only fetches a slice of the result set at a time for processing.

Personally, when doing long-running SQL stuff, I prefer to have it stateless in the SQL and limit the amount of data returned by the SQL like this:

select top 1000 foo, bar, baz from my_table where baz > ? order by baz

As for the sort criterion, I like to order by some timestamp column, so that I process either the oldest or the newest rows first.

Depending on your flavour of SQL, TOP 1000 may need to be replaced by a LIMIT 1000 clause at the end of the statement; PostgreSQL does not support TOP and uses LIMIT.
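The loop around that query then looks roughly like this. This is a minimal sketch using an in-memory SQLite database as a stand-in for Postgres (so it is self-contained); the table and column names (my_table, foo, bar, baz) are illustrative. The key point is that the last seen value of baz is the only state carried between batches, so the process can stop and resume at any point:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Stand-in database; with Postgres you would connect via DBD::Pg instead.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE TABLE my_table (foo INTEGER, bar INTEGER, baz INTEGER)');
my $ins = $dbh->prepare('INSERT INTO my_table (foo, bar, baz) VALUES (?, ?, ?)');
$ins->execute($_, $_ % 10, $_) for 1 .. 2500;

# Fetch at most 1000 rows newer than the last row we processed.
my $batch = $dbh->prepare(
    'SELECT foo, bar, baz FROM my_table WHERE baz > ? ORDER BY baz LIMIT 1000'
);

my $last_baz = 0;   # resume point; the only state between batches
my $total    = 0;
while (1) {
    $batch->execute($last_baz);
    my $rows = $batch->fetchall_arrayref;
    last unless @$rows;         # no more rows: done
    for my $row (@$rows) {
        my ($foo, $bar, $baz) = @$row;
        # ... process the row here ...
        $total++;
        $last_baz = $baz;       # remember where to continue
    }
}
print "processed $total rows\n";
```

Because each iteration is a fresh, short query, no long-running transaction is held open, and a crashed run can restart from the last processed baz value.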
