http://qs321.pair.com?node_id=11149642


in reply to Windows Perl with sqlite3

A few things come to mind.

Basically, every time you do anything (select, insert, update), SQLite (just like any other database) rummages around in the datafile(s). Depending on how your filesystem is configured, even a pure read can cause one or more writes to disk, because the OS may be updating the "last access" time (atime) of those files.
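Whether reads on your system really trigger those atime writes depends on the mount options (relatime, noatime, etc.). A quick, hedged way to check from Perl (the temp file and sleep duration are just illustrative):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Create a small scratch file, read it, and see if its atime moved.
my ($fh, $path) = tempfile(UNLINK => 1);
print $fh "x" x 1024;
close $fh;

my $atime_before = (stat $path)[8];
sleep 2;    # make sure a timestamp change would be visible

open my $in, '<', $path or die "open: $!";
local $/;
my $data = <$in>;
close $in;

my $atime_after = (stat $path)[8];
print $atime_after > $atime_before
    ? "reads update atime on this filesystem\n"
    : "atime unchanged (likely relatime window or noatime)\n";
```

If the second line prints, reads on the database file are at least not paying the atime-write tax.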

SQLite can also cause writes during read operations, because it supports concurrent access as far as I remember, so it may need to write some locking information to disk. Here it again depends on your filesystem setup as well as your hardware whether these writes get buffered in RAM and flushed later, or whether the SQLite process is essentially blocked until the system completes the write operation.
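One knob worth knowing here is the journal mode: you can ask SQLite to keep its rollback journal in RAM instead of on disk, which cuts down on write syscalls at the cost of rollback safety if the process crashes mid-transaction. A minimal sketch (the database path is arbitrary):

```perl
use strict;
use warnings;
use DBI;
use File::Temp qw(tempdir);

# Illustrative setup: a throwaway database in a temp directory.
my $dir = tempdir(CLEANUP => 1);
my $dbh = DBI->connect("dbi:SQLite:dbname=$dir/app.db", "", "",
                       { RaiseError => 1 });

# Keep the rollback journal in memory -- fewer disk writes, less crash safety.
my ($mode) = $dbh->selectrow_array("PRAGMA journal_mode = MEMORY");
print "journal_mode is now $mode\n";

# WAL mode is another option that often reduces reader/writer contention:
# $dbh->do("PRAGMA journal_mode = WAL");
```

Whether MEMORY or WAL is the right trade-off depends on how much you care about surviving a crash mid-write.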

If you use DBD::SQLite, you can try running everything with the filename ":memory:", which gives you an in-memory database. That way you can check whether the problem is file IO or a processor-optimization-gone-wrong kind of situation.
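A minimal sketch of that, assuming a trivial table and workload (names and row count are made up):

```perl
use strict;
use warnings;
use DBI;

# ":memory:" tells SQLite to keep the whole database in RAM -- no file IO at all.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)");

my $ins = $dbh->prepare("INSERT INTO t (payload) VALUES (?)");
$ins->execute("row $_") for 1 .. 1000;

my ($count) = $dbh->selectrow_array("SELECT COUNT(*) FROM t");
print "$count rows\n";    # 1000 rows
```

Point your real workload at a connection like this: if it suddenly runs fast, the bottleneck is file IO; if it is still slow, look at the CPU side.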

Write operations are especially "special" if you use some kind of RAID controller or a software RAID: there is checksum generation on top of the whole business of sending everything to multiple disks and waiting for all of them to finish.

Lastly, there may be mitigations for CPU security vulnerabilities slowing down the AMD CPU in the slow system. Quite a few of these have been found since your processor was designed (in 2016, if Google is correct): Meltdown, Spectre, Hertzbleed, and so on. Most of the manufacturers' fixes seem to amount to disabling the very features that made those processors fast. If your copy of Perl or SQLite relies on one of them (which now has to be emulated some other way by the processor or operating system), that can slow things down by a lot. This is another case where testing with an in-memory database really helps, because it lets you distinguish between IO wait and raw data processing.
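To make that IO-wait versus data-processing split concrete, a hedged micro-benchmark sketch: run the same insert workload against a file database and an in-memory one and compare wall-clock times (the row count and schema are arbitrary; per-insert autocommit is deliberate, to expose per-write sync costs):

```perl
use strict;
use warnings;
use DBI;
use Time::HiRes qw(gettimeofday tv_interval);
use File::Temp qw(tempdir);

# Run a small insert workload against the given DSN and return elapsed seconds.
sub run_workload {
    my ($dsn) = @_;
    my $dbh = DBI->connect($dsn, "", "", { RaiseError => 1 });
    my $t0  = [gettimeofday];
    $dbh->do("CREATE TABLE t (n INTEGER)");
    my $sth = $dbh->prepare("INSERT INTO t (n) VALUES (?)");
    # AutoCommit is on, so each insert is its own transaction (and its own sync).
    $sth->execute($_) for 1 .. 200;
    my $elapsed = tv_interval($t0);
    $dbh->disconnect;
    return $elapsed;
}

my $dir  = tempdir(CLEANUP => 1);
my $file = run_workload("dbi:SQLite:dbname=$dir/bench.db");
my $mem  = run_workload("dbi:SQLite:dbname=:memory:");
printf "file: %.3fs  memory: %.3fs\n", $file, $mem;
```

A large gap between the two numbers points at IO wait; nearly identical (and still slow) numbers point at the CPU side.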
