Perl DBI for MariaDB v SQLite in a Single Use SBC

by justin423 (Scribe)
on Mar 18, 2023 at 22:14 UTC ( [id://11151062] )

justin423 has asked for the wisdom of the Perl Monks concerning the following question:

OK, I have a Perl program that runs against a MySQL database on my laptop, but I want to convert it to be the only program running on a single-board computer, with MariaDB started at boot via a command in rc.local. Once I get a prototype finalized, I want to make 20-25 copies on SD cards for a batch of cheap SBCs I bought on eBay, and hand those computers out at trade shows. A user could insert a USB drive with their own data, which is in an industry-standard fixed-width format, and run it through the program, which performs various data checks (and uses two static reference files, both in .csv format, that are updated once or twice a year).

I was intrigued by SQLite, but SQLite does not seem to be able to import fixed-width formats the way LOAD DATA INFILE does through the MariaDB DBI driver, whereas with MariaDB I can issue a system mount command for the USB drive from Perl and simply interpolate a variable into the LOAD DATA statement.

So SQLite seems a non-starter. Once I get the Perl script finalized, what would be the recommended way to take the eMMC card and replicate it 20 times onto MicroSD cards for the other SBCs? And after copying it over, do you need to re-issue the GRANT statement to sync up the hard-coded username/password combination in the Perl script?

Replies are listed 'Best First'.
Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by marto (Cardinal) on Mar 19, 2023 at 11:16 UTC

    I don't understand your SQLite concern: is it purely that it doesn't ship with a tool to import fixed-width data? I often wrap (or have to wrap) such loads from various sources in a Perl script, so that I can apply some logic and error handling when inserting millions of rows into SQLite. This isn't a lot of work and it allows for greater control. Depending on what your code is like, SQLite would likely perform better than MariaDB on lower-end SBCs. For context, I use SQLite with a Mojolicious front end; I recreate the data on a nightly basis, millions of rows, and it performs very well.
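
    A minimal sketch of that kind of wrapper, assuming a hypothetical SQLite database demo.sqlite and a single-column table records(line TEXT); the per-row check is just a placeholder for whatever logic you need:

<code>
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $file = shift @ARGV or die "Usage: $0 input.txt\n";
my $dbh  = DBI->connect("dbi:SQLite:dbname=demo.sqlite", "", "",
                        { RaiseError => 1, AutoCommit => 1 });

$dbh->do("CREATE TABLE IF NOT EXISTS records (line TEXT)");

open my $fh, '<', $file or die "Cannot open $file: $!";
my $sth = $dbh->prepare("INSERT INTO records (line) VALUES (?)");

eval {
    $dbh->begin_work;                 # one big transaction for speed
    while (my $line = <$fh>) {
        chomp $line;
        next if $line =~ /^\s*$/;     # example of per-row logic: skip blank lines
        $sth->execute($line);
    }
    $dbh->commit;
};
if ($@) {
    warn "Import failed, rolling back: $@";
    $dbh->rollback;
}
close $fh;
</code>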

    Regarding how to replicate your system from eMMC to SD card, you could use dd to create an image you'd then write to the target SD cards.
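
    Something along these lines, as a sketch only; the device paths are examples, so check yours with lsblk before running anything destructive:

<code>
#!/usr/bin/perl
# Image the master eMMC once, then write that image to each target card.
use strict;
use warnings;

my $source = '/dev/mmcblk0';   # assumed eMMC device, verify on your board
my $image  = 'master.img';

system('dd', "if=$source", "of=$image", 'bs=4M', 'status=progress') == 0
    or die "imaging failed: $?";

for my $n (1 .. 20) {
    print "Insert target card $n (assumed to appear as /dev/sda) and press Enter...";
    <STDIN>;
    system('dd', "if=$image", 'of=/dev/sda', 'bs=4M', 'status=progress') == 0
        or die "write to card $n failed: $?";
    system('sync');
}
</code>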

Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by Marshall (Canon) on Mar 19, 2023 at 09:38 UTC
    "So SQLite seems a non-starter. Really? I don't understand that conclusion. Normally converting from MySQL on your laptop to SQLite on an SBC would be a good approach.

    This, among other things, gets rid of username/password issues and other admin problems. A short script to import a fixed-width file doesn't sound like any big deal. Most SQLite utilities import .csv files. One approach would be a simple filter to turn your fixed-width input file into a .csv file, and then use an import tool on that file. Another approach is shown here: import-fixed-width-text-file-into-sqlite.
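
    For the filter approach, a minimal sketch (the unpack template below is made up; substitute your real field widths):

<code>
#!/usr/bin/perl
# fixed2csv.pl - read fixed-width records on STDIN, write CSV on STDOUT
use strict;
use warnings;
use Text::CSV;

my $csv = Text::CSV->new({ binary => 1, eol => "\n" });

while (my $line = <STDIN>) {
    chomp $line;
    # Example layout: 1-char record type, 10-char id, 20-char name, 8-char amount
    my @fields = unpack 'A1 A10 A20 A8', $line;
    $csv->print(\*STDOUT, \@fields);
}
</code>

    Run it as, e.g., perl fixed2csv.pl < input.txt > output.csv and import the resulting .csv file.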

    As long as the SQL features being used in the MySQL program are available in SQLite, SQLite should provide a smaller footprint solution and work very well.

    Update:
    You say that this file is in an industry-standard fixed-width format. OK, show us, say, three lines of data and some spec on what the columns should be named (maybe the schema from your MySQL DB?). We could help you with the code to import that data if you are having trouble doing it. If speed matters, the main thing for import is to minimize the number of transactions. I usually run 150K-250K rows per transaction, although even a million is fine. There may be some datatype issues in the port (like the datetime type), but there are solutions to all of those things. You will be impressed with SQLite's performance.
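
    The batching itself is just a matter of where you put begin_work and commit; a sketch with placeholder file, table, and column names:

<code>
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=demo.sqlite", "", "",
                       { RaiseError => 1 });
open my $fh, '<', 'input.txt' or die "Cannot open input.txt: $!";

my $sth        = $dbh->prepare("INSERT INTO stage (data) VALUES (?)");
my $batch_size = 200_000;    # roughly the 150K-250K range mentioned above
my $count      = 0;

$dbh->begin_work;
while (my $line = <$fh>) {
    chomp $line;
    $sth->execute($line);
    if (++$count % $batch_size == 0) {
        $dbh->commit;        # close out this batch...
        $dbh->begin_work;    # ...and start the next one
    }
}
$dbh->commit;                # final partial batch
</code>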

      First character is the record type; there are 4 types: T, W, Q, C. I usually load into a staging table, set rec_typ = substring(data,1,1), and then do the insert-select of substrings, where rec_typ = 'Q', into the other tables. The file is 1017 characters per line; the first two lines are always T and W, then the Q records (anywhere from a few dozen to a few million), and then a single C (which is the reconciliation of the Q records). So the speed is needed for the Q records. (The fake Q records below show what kind of data each field holds, e.g. numeric (always integers with no decimals) and text.)
      <code>
[sample data elided for readability: one 1017-character T header line, one W line, and several fake Q records showing the mix of fixed-width integer and text fields]
      </code>

        Please wrap that data in <code>...</code> tags.

        As it stands, your post has made viewing the entire thread extremely difficult: on my 27" screen, I need to scroll horizontally about a dozen times; this will likely be even more difficult for those with laptops or other smaller devices.

        For future posts, kindly ensure you "Preview" first, edit to fix any problems such as this, repeat as necessary, and only "Create" when all is well. Thank you.

        — Ken

        So, to condense my prior post: how do I convert the following DBI instructions for MariaDB to SQLite?

        print "Enter filename"; $filename=<STDIN>; chomp $filename; $path='/media/usb-drive'; $file=$path.$filename; $dbh->do("LOAD DATA INFILE '$file' into TABLE STAGE SET REC_TYP=SUBSTRING(DATA,1)"); $dbh->do("INSERT INTO A(A,B,C) SELECT SUBSTRING(DATA,2,10,SUBSTRING(DATA,13,1),SUBSTRING(DATA,14,20) FROM STAGE WHERE REC_TYP='q'");

Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by karlgoethebier (Abbot) on Apr 26, 2023 at 14:43 UTC

    Did you take a look at the Command Line Shell For SQLite? It comes with the command .import FILE TABLE. This may be the tool you are looking for. Regards, Karl

    «The Crux of the Biscuit is the Apostrophe»

Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by alexander_lunev (Pilgrim) on Mar 19, 2023 at 14:55 UTC

    Hello!

    SQLite is exactly the database for your needs, and, as has already been said, you have a perfect tool for any data processing, such as loading your data into SQLite: Perl! I've done many almost-desktop apps with Mojolicious+SQLite that run in a desktop browser on localhost, and if I have to hide the black console windows on the Win platform, wperl.exe helps. So SQLite IS the starting point for such apps!

Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by cavac (Parson) on Mar 20, 2023 at 07:49 UTC

    Besides the other posts telling you how to import the data into SQLite, what's wrong with running MariaDB (or PostgreSQL, for that matter) on the SBC?

    I've had PostgreSQL running on the old Raspberry Pi 3 in conjunction with a webserver. It will need a bit of tuning to run smoothly on limited resources, but modern single-board computers are often quite capable. You didn't post the specs of your SBCs, so I can only take a stab in the dark, but if it has at least 512MB of RAM and runs Linux, a "grown up" database should run on it when configured correctly.

    PerlMonks XP is useless? Not anymore: XPD - Do more with your PerlMonks XP

      "It will need a bit of tuning to run smoothly on limited resources"

      If SQLite can do the job, why bother? On SBCs, something tiny that performs well out of the box seems more sensible than something hundreds of megabytes in size that requires extra effort to run smoothly.

        That really depends on the OP's requirements. If they already have a working application, changing the database backend (or going further and making it configurable) can be a huge pain. That starts with simple stuff like timestamp handling between different databases and support for different basic features (foreign keys, typecasting, upper/lowercase conversion functions, Unicode handling, etc.). But even slightly more complex SELECT statements can run into trouble; SQL has many different dialects.

        Also, SQLite doesn't support partitioning, XML and JSON handling (other than "as text") and server side scripting. If an existing application relies on any of that, a major rewrite would be required.

        As for "running smoothly out of the box", SQLite is a bit problematic as well. For one, if your application uses more than a single table, it pretty much has to manually activate correct handling of foreign keys everytime it opens the database:

        PRAGMA foreign_keys = ON;

        This will increase memory requirements and slow things down a bit. But running a database without proper foreign key support is just asking for trouble.
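
        In DBI terms that just means issuing the pragma right after connecting; a minimal sketch (database name is a placeholder):

<code>
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=demo.sqlite", "", "",
                       { RaiseError => 1 });
$dbh->do("PRAGMA foreign_keys = ON");   # has to be repeated for every new connection
</code>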

        I'm not saying SQLite is the wrong choice per se; it really depends a lot on the circumstances. If the OP's project is a blank-slate/green-field project that runs a single process, SQLite is a good choice (I use SQLite for similar stuff). But if it's "porting existing, multi-process/multi-threaded software to different hardware", the situation is a lot more complicated. We don't know the design of the OP's software, and we don't know how complex and mature it is. But since they plan to hand demo systems out at trade shows (to potential customers, probably?), I have to assume that the demo project has more than "trivial" complexity, has been tested a lot already, and has to work without any major new bugs.

        PerlMonks XP is useless? Not anymore: XPD - Do more with your PerlMonks XP
Re: Perl DBI for MariaDB v SQLite in a Single Use SBC
by Marshall (Canon) on Apr 26, 2023 at 05:52 UTC
    justin423 sent me a private message. I think it has to do with this thread, so I will answer it here publicly.

    justin423 says: Is there a module, or do I use REGEXP with the Perl DBD, to import CSV files into SQLite? See this one: https://www.irs.gov/pub/irs-utl/FFIListExample.csv

    We are talking past each other. Strictly speaking, NO. The Perl DBI is for Perl to talk to a SQLite DB. There is no command line I/F included with the standard Perl distribution. Almost certainly at the beginning of your Perl program, use DBI; is all you need to start writing Perl code to talk to an SQLite DB. That is because almost certainly the SQLite drivers were included with your Perl distribution. I have provided one example at: Example Perl import code. You did not specify the format of your file very well, but my example should get you started. To import a CSV file, I would recommend Text::CSV to parse the input file.

    If you want a command-line utility to import a CSV file, that task has nothing whatsoever to do with Perl; it is a completely separate subject. Go to the SQLite FAQ page, go to the Download link, and download the appropriate tools for your platform (I assume Windows of some flavor). You want the command line program sqlite3. Read the online documentation to find out about the CSV import utility. I haven't used it myself, but there is such a thing. Again, this is independent of Perl and its DBI. However, Perl will be able to read the SQLite DB created with the command line tools.

    Writing your own import program will give you more flexibility and is not hard to do. This is the way that I do it so that nothing besides Perl needs to be installed. I would encourage you to attempt this yourself based on my example code. If you have trouble, then start a new thread with your new question.

    Update:
    OK, oh well, I went ahead and wrote a simple importer for your IRS example file. It makes one table with as many columns as the CSV data has, and assumes all data is textual. In SQLite it is not necessary to spec, say, varchar(24); you can just say TEXT as the data type. I think you are better off with a special-purpose importer that doesn't have to build confusing SQL command strings as I did, but feel free to use this if you want. Of course, there are issues if some of these values are INTEGER or other types. Note that this CSV file does contain embedded commas, so Text::CSV is mandatory.

    Updated code with a different way of generating the SQL statements.

    <code>
use strict;
use warnings;
use Text::CSV;
use DBI;

my $dbName = "YourDBname.sqlite";
my %attr   = ( RaiseError => 1 );   # auto die with error printout
my $dbh    = DBI->connect("dbi:SQLite:dbname=$dbName", "", "", \%attr)
    or die "Couldn't connect to database $dbName: " . DBI->errstr;

my $csv = Text::CSV->new({ binary => 1 });

my $columnDefLine = <DATA>;
chomp $columnDefLine;

$dbh->do("DROP TABLE IF EXISTS Irs");

my $ColDataTypes = $columnDefLine;
$ColDataTypes =~ s/(\w+)/$1 TEXT/g;   # all cols are TEXT
my $createStatement = "CREATE TABLE Irs ($ColDataTypes)";
print "Create SQL Stmt=>$createStatement\n";
$dbh->do("$createStatement");

my $qMarks = $columnDefLine;          # placeholders for each column's data
$qMarks =~ s/(\w+)/?/g;
my $insertSQL = "INSERT INTO Irs ($columnDefLine) VALUES ($qMarks)";
print "insertSQL STMT=>$insertSQL\n";
my $insert_row = $dbh->prepare("$insertSQL");

$dbh->begin_work;   # Single transaction for all of the table's data
                    # (a million rows is just fine)
while (my $row_ref = $csv->getline(*DATA)) {
    $insert_row->execute(@$row_ref);
}
$dbh->commit;

=Prints:
Shows the CREATE TABLE and INSERT SQL statements that were generated

Create SQL Stmt=>CREATE TABLE Irs (GIIN TEXT,FINm TEXT,CountryNm TEXT)
insertSQL STMT=>INSERT INTO Irs (GIIN,FINm,CountryNm) VALUES (?,?,?)

=cut

__DATA__
GIIN,FINm,CountryNm
"98Q96B.00000.LE.250","Test Bank One, LLP","France"
"98Q96B.00000.BR.826","Branch","United Kingdom"
"98Q96B.00000.BR.036","Branch","Australia"
"98Q96B.00000.BR.076","Branch","Brazil"
"98Q96B.00000.BR.818","Branch","Egypt"
"98Q96B.00001.ME.276","Test Bank Two","Germany"
"98Q96B.00001.BR.826","Branch","United Kingdom"
"8124H8.00000.SP.208","Test Bank Three","Denmark"
"C54S47.99999.SL.276","Test Bank Four","Germany"
"C54S47.99999.BR.208","Branch","Denmark"
"126BM7.00000.LE.826","Test Bank Five","United Kingdom"
"126BM7.00000.BR.250","Branch","France"
"126BM7.00001.ME.208","Test Bank Nine","Denmark"
"126BM7.00002.ME.276","Test Bank Ten, LLP","Germany"
"126BM7.00002.BR.344","Branch","Hong Kong"
"SE19K4.99999.SL.250","Test Bank Eleven","France"
"76GHU9.00000.BR.036","Branch","Australia"
"AAAAAA.00000.SP.001","Test Bank, 150-character name1 23456-40 123456-50x123456-60x123456-70x123456-80x123456-90x12345-100 12345-110 12345-120 12345-130 12345-140 123456-150","Afghanistan"
"AAAAAA.00000.SP.002","Test Bank, 150-character name2 (parenthesis) [square brackets] {curly brackets}-60x123456-70x123456-80x123456-90x12345-100 12-130 12345-140 123456-150","France"
"AAAAAA.00000.SP.003","Test Bank, 150-character name3 (3456-4) [23456-50x123456-60x123456-70x123456-80x123456-90x12345-10] {2345-11} 12345-120 12345-130 12345-140 123456-150","Egypt"
    </code>
    Old code...
