http://qs321.pair.com?node_id=1223435


in reply to [SOLVED]: How to insert 30-50K rows into MySQL DB using DBI?

I am pretty sure that there is a better way to do this than how I am doing it here.

yea

#!/usr/bin/perl -w
use warnings;
use strict;
use DBI;
use Data::Dumper;
use POSIX qw( strftime );

my $database    = 'db_name';
my $db_user     = 'user';
my $db_password = 'pwd';
my $db_hostname = 'db_hostname';

my $dbh2 = DBI->connect(
    "DBI:mysql:database=$database;host=$db_hostname",
    $db_user, $db_password,
    { RaiseError => 1, AutoCommit => 1, mysql_auto_reconnect => 1 }
    # Added AutoCommit => 1, mysql_auto_reconnect => 1 while trying to make it work
);
die "unable to connect to server $DBI::errstr" unless $dbh2;

my $sql2_1 = q^
    INSERT INTO table_name
    (col_1, col_2, col_3, col_4, col_5)
    VALUES (?,?,?,?,?)
^;
my $sth2_1 = $dbh2->prepare($sql2_1);

# $sth1 is assumed to be the already-prepared and executed SELECT handle
# on the source data, as in the original post
my $grp = 0;
$dbh2->begin_work;
while (my @row = $sth1->fetchrow_array) {
    $grp++;
    if ($grp > 1000) {                  # commit every 1000 rows, then start a new transaction
        $dbh2->commit;
        $dbh2->begin_work;
        $grp = 0;
    }
    $row[3] = undef unless $row[3];     # store NULL rather than an empty/false value
    $sth2_1->execute($row[0], $row[1], $row[2], $row[3], $row[4]);
}
$dbh2->commit;
$sth1->finish();
$dbh2->disconnect();

notice the begin_work and commit calls, and the batching of inserts: every 1000 rows the transaction is committed and a new one is started with begin_work
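
Distilled down to just that pattern (a sketch only, with a hypothetical @rows array standing in for the source query, and the same $dbh2 and $sth2_1 handles as above): open a transaction, run single-row inserts, and commit once per 1000 rows so the server flushes once per batch instead of once per row.

$dbh2->begin_work;
my $grp = 0;
for my $row (@rows) {            # @rows is a stand-in for the source rows
    if (++$grp > 1000) {
        $dbh2->commit;           # flush the previous batch in one go
        $dbh2->begin_work;       # start the next transaction
        $grp = 1;
    }
    $sth2_1->execute(@$row);     # same prepared 5-column insert as above
}
$dbh2->commit;                   # commit the final partial batch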

as for why the placeholders (?): your version did not take proper MySQL quoting into account, and it leaves you open to the problem described at http://bobby-tables.com/
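
To make that concrete, here is a minimal sketch (with a hypothetical $name value): interpolating the value straight into the SQL breaks as soon as it contains a quote and invites injection, while a placeholder lets the driver do the quoting.

my $name = "O'Brien";     # hypothetical value with an embedded quote

# unsafe: the quote in $name breaks the statement, and hostile input can inject SQL
# $dbh2->do("INSERT INTO table_name (col_1) VALUES ('$name')");

# safe: the value is bound separately and the driver handles the quoting
my $sth = $dbh2->prepare('INSERT INTO table_name (col_1) VALUES (?)');
$sth->execute($name);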

But even this may be too slow for 50K rows; I'll be back.

Re^2: How to insert 30-50K rows into MySQL DB using DBI?
by huck (Prior) on Oct 03, 2018 at 00:17 UTC

    I'll be back

    When I needed to insert tens of thousands of records across a network interface, I found the above method still too slow, since there is overhead for each SQL call. So I went with something like this instead.

    #!/usr/bin/perl -w
    use warnings;
    use strict;
    use DBI;
    use Data::Dumper;
    use POSIX qw( strftime );

    my $database    = 'db_name';
    my $db_user     = 'user';
    my $db_password = 'pwd';
    my $db_hostname = 'db_hostname';

    my $dbh2 = DBI->connect(
        "DBI:mysql:database=$database;host=$db_hostname",
        $db_user, $db_password,
        { RaiseError => 1, AutoCommit => 1, mysql_auto_reconnect => 1 }
        # Added AutoCommit => 1, mysql_auto_reconnect => 1 while trying to make it work
    );
    die "unable to connect to server $DBI::errstr" unless $dbh2;

    # one statement that inserts 1000 rows at a time
    my $sql2_1000 = q^
        INSERT INTO table_name
        (col_1, col_2, col_3, col_4, col_5)
        VALUES
    ^;
    $sql2_1000 .= join(',', ('(?,?,?,?,?)') x 1000);   # comma-separated row groups
    my $sth2_1000 = $dbh2->prepare($sql2_1000);

    # $sth1 is assumed to be the already-prepared and executed SELECT handle
    # on the source data, as in the original post
    my $grp  = 0;
    my @list = ();
    $dbh2->begin_work;
    while (my @row = $sth1->fetchrow_array) {
        $grp++;
        if ($grp > 1000) {                  # a full batch of 1000 rows is queued
            $sth2_1000->execute(@list);     # send all 1000 rows in a single statement
            $dbh2->commit;
            $dbh2->begin_work;
            @list = ();                     # edit/add
            $grp  = 1;                      # current row becomes the first of the next batch
        }
        $row[3] = undef unless $row[3];     # store NULL rather than an empty/false value
        push @list, $row[0], $row[1], $row[2], $row[3], $row[4];
    }

    # insert whatever is left over with a statement of exactly the right length
    if (@list) {
        my $n = scalar(@list) / 5;
        my $sql2_n = q^
            INSERT INTO table_name
            (col_1, col_2, col_3, col_4, col_5)
            VALUES
        ^;
        $sql2_n .= join(',', ('(?,?,?,?,?)') x $n);
        my $sth2_n = $dbh2->prepare($sql2_n);
        $sth2_n->execute(@list);
    }
    $dbh2->commit;
    $sth1->finish();
    $dbh2->disconnect();
    notice the batching of 1000 rows per SQL call, and the final creation of an insert of exactly the right length for the leftover rows.
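
    For a sense of what actually gets sent to the server, this is roughly the statement text the same construction produces for a batch of three rows (table and column names as in the sketch above):

        my $sql = 'INSERT INTO table_name (col_1, col_2, col_3, col_4, col_5) VALUES ';
        $sql .= join(',', ('(?,?,?,?,?)') x 3);
        # $sql is now one statement covering three rows:
        #   INSERT INTO table_name (col_1, col_2, col_3, col_4, col_5)
        #   VALUES (?,?,?,?,?),(?,?,?,?,?),(?,?,?,?,?)
        # execute() then gets 15 bind values, so the per-statement overhead is paid
        # once per batch instead of once per row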

    Both of these are untested code, for example's sake only; your mileage may vary.