When I needed to insert tens of thousands of records across a network interface, I found the above method still too slow, since there is a fixed overhead for each SQL call. So I went with something like this instead: batch many rows into a single multi-row INSERT.
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $database    = 'db_name';
my $db_user     = 'user';
my $db_password = 'pwd';
my $db_hostname = 'db_hostname';

my $dbh2 = DBI->connect(
    "DBI:mysql:database=$database;host=$db_hostname",
    $db_user, $db_password,
    # Added AutoCommit => 1, mysql_auto_reconnect => 1 while trying to make it work
    { RaiseError => 1, AutoCommit => 1, mysql_auto_reconnect => 1 }
) or die "unable to connect to server $DBI::errstr";

# One prepared statement that inserts 1000 rows per call. The
# placeholder groups must be comma-separated to be valid SQL.
my $sql2_1000 = q{
    INSERT INTO table_name (col_1, col_2, col_3, col_4, col_5)
    VALUES } . join(',', ('(?,?,?,?,?)') x 1000);
my $sth2_1000 = $dbh2->prepare($sql2_1000);

my $grp  = 0;
my @list = ();
while (my @row = $sth1->fetchrow_array) {   # $sth1: source statement handle from the code above
    $row[3] = undef unless $row[3];         # store NULL instead of a false value
    push @list, @row[0 .. 4];
    if (++$grp == 1000) {                   # a full batch: insert and commit it
        $dbh2->begin_work;
        $sth2_1000->execute(@list);
        $dbh2->commit;
        $grp  = 0;
        @list = ();                         # edit/add: reset the batch
    }
}

# Insert the leftover rows with a statement of exactly the right length.
if (@list) {
    my $n      = @list / 5;
    my $sql2_n = q{
        INSERT INTO table_name (col_1, col_2, col_3, col_4, col_5)
        VALUES } . join(',', ('(?,?,?,?,?)') x $n);
    my $sth2_n = $dbh2->prepare($sql2_n);
    $dbh2->begin_work;
    $sth2_n->execute(@list);
    $dbh2->commit;
}

$sth1->finish();
$dbh2->disconnect();
Notice the batching of rows per SQL call, and the final INSERT built to exactly the number of leftover rows.
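The comma-separated placeholder list that such a batch statement needs can be built with a small generic helper. This is a sketch for illustration only; `placeholder_groups` is a hypothetical name, not part of the original code:

```perl
use strict;
use warnings;

# Build a multi-row VALUES placeholder clause for $rows rows
# of $cols columns each, e.g. (?,?),(?,?) for 2 rows of 2 columns.
sub placeholder_groups {
    my ($rows, $cols) = @_;
    my $group = '(' . join(',', ('?') x $cols) . ')';
    return join(',', ($group) x $rows);
}

print placeholder_groups(3, 5), "\n";
# prints (?,?,?,?,?),(?,?,?,?,?),(?,?,?,?,?)
```

Building the clause this way avoids the trap of repeating `' (?,?,?,?,?) '` without commas, which MySQL rejects as a syntax error.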
Both of these are untested code for example's sake only; your mileage may vary.