http://qs321.pair.com?node_id=1223435


in reply to [SOLVED]: How to insert 30-50K rows into MySQL DB using DBI?

I am pretty sure that there is a better way to do this than how I am doing it here.

yea

#!/usr/bin/perl
use warnings;
use strict;
use DBI;
use Data::Dumper;
use POSIX qw( strftime );

my $database    = 'db_name';
my $db_user     = 'user';
my $db_password = 'pwd';
my $db_hostname = 'db_hostname';

my $dbh2 = DBI->connect(
    "DBI:mysql:database=$database:host=$db_hostname",
    $db_user, $db_password,
    { RaiseError => 1, AutoCommit => 1, mysql_auto_reconnect => 1 }
    # Added AutoCommit => 1, mysql_auto_reconnect => 1 while trying to make it work
);
die "unable to connect to server $DBI::errstr" unless $dbh2;

my $sql2_1 = q^
    INSERT INTO table_name
    (col_1, col_2, col_3, col_4, col_5)
    VALUES (?,?,?,?,?)
^;
my $sth2_1 = $dbh2->prepare($sql2_1);

# $sth1 is assumed to be the already-executed source SELECT handle from the original post
my $grp = 0;
$dbh2->begin_work;
while ( my @row = $sth1->fetchrow_array ) {
    $grp++;
    if ( $grp > 1000 ) {           # commit every 1000 rows, then start a new transaction
        $dbh2->commit;
        $dbh2->begin_work;
        $grp = 0;
    }
    unless ( $row[3] ) { $row[3] = undef; }    # store NULL instead of an empty value
    $sth2_1->execute( $row[0], $row[1], $row[2], $row[3], $row[4] );
}
$dbh2->commit;
$sth1->finish();
$dbh2->disconnect();

Notice the begin_work and commit calls: the inserts are batched inside a transaction, committed every 1000 rows, and then a new transaction is started with begin_work.

As for why the placeholders (?): your original code did not take proper MySQL quoting into account, which leaves you open to the problems described at http://bobby-tables.com/.
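A minimal sketch of the difference, assuming a connected $dbh handle and the same table_name as above; the interpolated version breaks (or worse) as soon as a value contains a quote:

# UNSAFE: the value is interpolated straight into the SQL, so a single
# quote in $name produces a syntax error or an injection opportunity.
my $name = "O'Brien";
# $dbh->do("INSERT INTO table_name (col_1) VALUES ('$name')");    # breaks

# SAFE: a placeholder lets the driver handle quoting and escaping.
my $sth = $dbh->prepare('INSERT INTO table_name (col_1) VALUES (?)');
$sth->execute($name);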

But even this may be too slow for 50K rows; I'll be back.
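In the meantime, a rough sketch of one common speed-up (not necessarily what the follow-up will use): send many rows per INSERT statement instead of one execute per row, reusing the $dbh2 handle and $sth1 source handle from above. Watch max_allowed_packet if you make the chunks large.

# Build multi-row INSERT statements: one round trip per chunk of rows
# instead of one execute per row.
my $chunk = 500;                       # rows per statement
my @buf;                               # flat list of bind values, 5 per row

my $flush = sub {
    return unless @buf;
    my $placeholders = join ',', ('(?,?,?,?,?)') x ( @buf / 5 );
    $dbh2->do(
        "INSERT INTO table_name (col_1, col_2, col_3, col_4, col_5) VALUES $placeholders",
        undef, @buf
    );
    @buf = ();
};

while ( my @row = $sth1->fetchrow_array ) {
    $row[3] = undef unless $row[3];    # same NULL handling as above
    push @buf, @row[0..4];
    $flush->() if @buf >= $chunk * 5;
}
$flush->();                            # insert any remaining rows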