http://qs321.pair.com?node_id=619312

bsb has asked for the wisdom of the Perl Monks concerning the following question:

Update: DBD::SQLite 1.14 now uses sqlite3_prepare_v2 and this problem has gone away. MattSargeant++

I just dist-upgraded my system and have since encountered the SQLite "database schema has changed" error in an application that worked prior to the upgrade.

The "change" involved is running ANALYZE which invalidates prepared statement handles (see the SQLite FAQ). Unfortunately, I don't control what is prepared where as I use Class::DBI so I'm looking to either revert to a working set of modules or work around the problem at a lower level.

Does the code below reproduce the problem for other people?
Any tips on sane versions or work-arounds?
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

unlink "a.db";
my $dbh = DBI->connect("dbi:SQLite:dbname=a.db") or die;

print '$DBI::VERSION ', $DBI::VERSION, "\n";
print '$DBD::SQLite::VERSION ', $DBD::SQLite::VERSION, "\n";

$dbh->do("create table a as select 123 as b");
my $sth = $dbh->prepare('select "hello" from a');

$dbh->do("analyze");
print "got ", $dbh->selectrow_array($sth), "\n";

$sth->finish;
#undef $sth; # to stop "closing dbh with active statement handles"
             # http://rt.cpan.org/Ticket/Display.html?id=22688
$dbh->disconnect;

__END__
$ perl analyze_changed.pl
$DBI::VERSION 1.53
$DBD::SQLite::VERSION 1.13
DBD::SQLite::db selectrow_array failed: database schema has changed(1) at dbdimp.c line 421 at analyze_changed.pl line 16.
got
Update: I'm starting to think that the changes which caused this problem to surface during the upgrade were at the Class::DBI/Ima::DBI level. On my laptop I get an error for the script above, yet an older version of the application (which uses ANALYZE) is fine. I'll compare versions tomorrow.
$DBI::VERSION 1.51
$DBD::SQLite::VERSION 1.12
DBD::SQLite::db selectrow_array failed: database schema has changed(1) at dbdimp.c line 416 at - line 16.
got

Re: Upgraded SQLite: "database changed" error
by jbert (Priest) on Jun 05, 2007 at 07:47 UTC
    From the FAQ entry you linked to:

    "Because a prepared statement can be invalidated by another process changing the database schema, all code that uses the sqlite3_prepare()/sqlite3_step()/sqlite3_finalize() API should be prepared to handle SQLITE_SCHEMA errors. An example of one approach to this follows: (snip code example of how to handle and recover this problem in C)"

    If the code in the DBD::SQLite perl module is using these API calls to provide prepared statement handles, it needs to use the above approach.

    So I'd say, file a bug on that module, giving pointers to your script above, the relevant FAQ entry and - if you're feeling generous - a patch.


    Edit: OK, I looked into it a little more and it's not quite that simple :-(

    The suggested code in the FAQ simply re-runs the query. Since the DBI interface exposes prepared statements at the Perl level, perhaps the error (and its handling) needs to be propagated to the Perl level too. There doesn't seem to be a simple solution to me, since you may have already processed some rows (e.g. printed them out). It might be possible to keep track of how many rows you've returned and skip them, but I'm not sure whether that would be worse (since, with a changing DB, I guess you could end up missing some rows or duplicating them).

    Maybe the best thing to do is to schedule some downtime for when ANALYZE is going to be run.

      Thanks for investigating.

      There's code within DBD::SQLite's dbdimp.c which does something like the FAQ snippet, but only at the start of an execute, not after calling sqlite's finish within the execute.

      The SQL string and bind variables are already retained by the DBD C code, so it seems possible to transparently re-prepare. There are even functions in sqlite to help do this (sqlite3_transfer_bindings and sqlite3_reset, or something along those lines).
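      At the Perl level, DBI already keeps enough around to do a crude version of this by hand: $sth->{Statement} holds the original SQL and $sth->{ParamValues} the currently bound values (driver support for ParamValues varies). A rough sketch with a hypothetical helper name, not how dbdimp.c actually does it:

      # Sketch only: rebuild a statement handle from the attributes DBI
      # already retains on the old handle.
      sub reprepare_handle {
          my ($dbh, $old_sth) = @_;
          my $new_sth = $dbh->prepare($old_sth->{Statement});
          my $params  = $old_sth->{ParamValues} || {};
          $new_sth->bind_param($_, $params->{$_}) for keys %$params;
          return $new_sth;
      }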

      Unfortunately, this application needs an ANALYZE after inserting all the basic data in order to run queries with a sane plan. What's more, it's not only the ANALYZE that causes problems: I also add indexes after bulk-loading data. There are probably ways to paper over the problem, something like pinging after each troublesome action.

      If it is indeed a problem at the DBD level and can be addressed there, then that's my preferred solution. I suspect there may be obstacles ahead, though.

      Update: The new version of prepare doesn't fail on schema changes. This may fully or partially fix the problem.
        If your app is in control (and knows when the ANALYZE is going to happen) then perhaps the best approach would be to try and control the use of prepared statement handles?

        If you defer loading your Class::DBI modules until after the bulk load has finished (or are you using Class::DBI for the bulk load?) then it seems that might work around the problem.

        One way to defer the load of some modules until run-time is to wrap the 'use' declaration in a string-eval, e.g. eval "use My::Class::DBI::Module;". Another is to use:

        require 'My/Class/DBI/Module.pm';
        My::Class::DBI::Module->import();
        Both of these approaches pull the module in at run-time and so can be used to delay the Class::DBI loading until after your ANALYZE, provided you perform them in a function rather than at the top level.
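        For instance, a minimal sketch of the deferred-load idea (the module name, the bulk-load helper and the placement of ANALYZE are all placeholders):

        # Sketch only: do the bulk load and ANALYZE with plain DBI first,
        # and only then pull in the Class::DBI-based classes, so their
        # cached statement handles are prepared against the final schema.
        sub bulk_load_then_start {
            my ($dbh) = @_;

            load_bulk_data($dbh);      # hypothetical plain-DBI bulk insert
            $dbh->do("analyze");

            eval "use My::Class::DBI::Module; 1" or die $@;
            # ... from here on Class::DBI/Ima::DBI prepares its handles ...
        }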

        There's probably a better way, though.

Re: Upgraded SQLite: "database changed" error
by ww (Archbishop) on Jun 05, 2007 at 10:46 UTC
    w2k, Activestate: This is perl, v5.8.8 built for MSWin32-x86-multi-thread
    The output varies only slightly:

    >perl dblitetest.pl
    $DBI::VERSION 1.55
    $DBD::SQLite::VERSION 1.13
    DBD::SQLite::db selectrow_array failed: database schema has changed(1) at dbdimp.c line 421 at dblitetest.pl line 16.
    got

    >

Re: Upgraded SQLite: "database changed" error
by bsb (Priest) on Jun 06, 2007 at 01:41 UTC
    It is indeed an interaction between Ima::DBI's use of prepare_cached and SQLite's "database schema has changed" error. Clearing $dbh->{CachedKids} is a passable work-around. I tried, and failed, to globally turn caching off in Ima::DBI::set_sql, but that probably would have caused performance issues anyway. I'm still not sure exactly which change revealed this problem.

    DBD::SQLite seems to be the correct place to address this issue. Does anyone else have an opinion on that matter? Is there a better forum for DBD::SQLite discussions? The only other seems to be RT.
    $dbh->{CachedKids} = {};   # after ANALYZE or similar

    # from the example
    $dbh->prepare_cached('select "hello" from a');
    print Dump $dbh->{CachedKids}, "\n";
    $dbh->{CachedKids} = {};
    $dbh->do("analyze");
    my $sth = $dbh->prepare_cached('select "hello" from a');
    print "got ", $dbh->selectrow_array($sth), "\n";
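    One way to package that work-around is a small helper that runs the schema-changing statement and then drops the cache (the helper name is mine, not something Ima::DBI or DBI provide):

    # Sketch: run a statement that invalidates prepared handles
    # (ANALYZE, CREATE INDEX, ...) and then empty the prepare_cached
    # cache so later prepare_cached calls build fresh handles.
    sub do_and_flush_cache {
        my ($dbh, $sql) = @_;
        $dbh->do($sql);
        $dbh->{CachedKids} = {};
    }

    do_and_flush_cache($dbh, "analyze");
    do_and_flush_cache($dbh, "create index a_b on a (b)");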
    A section of the DBI_TRACE from my main application
      The dbi-users@perl.org mailing list is an appropriate place to discuss changes to DBDs and the most likely place other than rt.cpan.org for DBD authors to see your suggestions. Also, there may be similar issues with other DBDs so posting to the list might pull in some other ideas.