http://qs321.pair.com?node_id=619313


in reply to Upgraded SQLite: "database changed" error

From the FAQ entry you linked to:

"Because a prepared statement can be invalidated by another process changing the database schema, all code that uses the sqlite3_prepare()/sqlite3_step()/sqlite3_finalize() API should be prepared to handle SQLITE_SCHEMA errors. An example of one approach to this follows: (snip code example of how to handle and recover this problem in C)"

If the code in the DBD::SQLite Perl module uses these API calls to provide prepared statement handles, it needs to take the above approach.

So I'd say, file a bug on that module, giving pointers to your script above, the relevant FAQ entry and - if you're feeling generous - a patch.


Edit: OK, I looked into it a little more and it's not quite that simple :-(

The suggested code in the FAQ simply re-runs the query. Since the DBI interface exposes prepared statements at the Perl level, perhaps the error (and its handling) needs to be propagated to the Perl level too. It doesn't feel like there's a simple solution to me, since you may have already processed some rows (e.g. printed them out). It might be possible to keep track of how many rows you've returned and skip them on the retry, but I'm not sure whether that would be worse (since with a changing DB you could end up missing some rows or duplicating them).
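For what it's worth, here's a minimal sketch of that retry at the DBI level (the helper and the error-message pattern are my own guesses, not anything DBD::SQLite documents):

    use DBI;

    # Re-prepare and re-execute once if the schema changed underneath us.
    # The pattern matched below is an assumption about how the
    # SQLITE_SCHEMA condition surfaces through $dbh->errstr.
    sub execute_with_retry {
        my ($dbh, $sql, @bind) = @_;
        for my $attempt (1 .. 2) {
            my $sth = $dbh->prepare($sql) or die $dbh->errstr;
            my $ok  = eval { $sth->execute(@bind) };
            return $sth if $ok;
            my $err = $dbh->errstr || $@;
            die $err unless $attempt == 1
                && $err =~ /schema has changed|database changed/i;
        }
    }

Note that this restarts the result set from the first row, which is exactly the already-processed-rows problem described above.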

Maybe the best thing to do is to schedule some downtime for when ANALYZE is going to be run.

Re^2: Upgraded SQLite: "database changed" error
by bsb (Priest) on Jun 05, 2007 at 11:05 UTC
    Thanks for investigating.

    There's code within DBD::SQLite's dbdimp.c that does something like the FAQ snippet, but only at the start of an execute, not after SQLite's finish is called within the execute.

    The SQL string and bind variables are already retained by the DBD C code, so it seems possible to transparently re-prepare. There are even functions in SQLite to help do this (sqlite3_transfer_bindings() and sqlite3_reset()).

    Unfortunately, this application needs an ANALYZE after inserting all the basic data in order to run queries with a sane plan. What's more, it's not only the ANALYZE that causes problems: I also add indexes after bulk loading the data. There are probably ways to paper over the problem, something like re-preparing or pinging after each troublesome action; a sketch of that idea follows.
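    Roughly, that papering-over might look like this (the cache and helper names are hypothetical, not part of DBD::SQLite):

        # After any schema-changing statement, throw away cached statement
        # handles so they are re-prepared from scratch on next use.
        my %sth_cache;

        sub cached_sth {
            my ($dbh, $sql) = @_;
            return $sth_cache{$sql} ||= $dbh->prepare($sql);
        }

        sub run_schema_changing {
            my ($dbh, $sql) = @_;
            $dbh->do($sql);                      # e.g. ANALYZE or CREATE INDEX
            $_->finish for values %sth_cache;    # invalidate stale handles
            %sth_cache = ();
        }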

    If it is indeed a problem at the DBD level and can be addressed there, then that's my preferred solution. I suspect there may be obstacles ahead, though.

    Update: The new version of prepare (sqlite3_prepare_v2) doesn't fail on schema changes. This may fully or partially fix the problem.
      If your app is in control (and knows when the ANALYZE is going to happen), then perhaps the best approach would be to try to control the use of prepared statement handles?

      If you defer loading your Class::DBI modules until after the bulk load has finished (or are you using Class::DBI for the bulk load?) then it seems that might work around the problem.

      One way to defer the load of some modules until run-time is to wrap the 'use' declaration in a string-eval, e.g. eval "use My::Class::DBI::Module;". Another is to use:

      require 'My/Class/DBI/Module.pm'; My::Class::DBI::Module->import();
      Both of these approaches pull in the module at run-time, and so can be used to delay the Class::DBI loading until after your ANALYZE, provided you perform them in a function rather than at the top level. A sketch of how that might fit together follows.
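      Something like this, say (the module name, database file and helpers are all placeholders):

          use DBI;

          # Assumed connection details; adjust for your app.
          my $dbh = DBI->connect('dbi:SQLite:dbname=app.db', '', '',
                                 { RaiseError => 1 });

          sub bulk_load_and_analyze {
              my ($dbh) = @_;
              # ... bulk INSERTs and CREATE INDEX statements here ...
              $dbh->do('ANALYZE');
          }

          sub load_cdbi_classes {
              # Deferred to run-time, so no statement handles exist
              # before the ANALYZE has run.
              require 'My/Class/DBI/Module.pm';
              My::Class::DBI::Module->import();
          }

          bulk_load_and_analyze($dbh);
          load_cdbi_classes();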

      There's probably a better way, though.