in reply to How to Do multiples SQL Insert at one time?

A few notes:

First, as others have mentioned, you normally don't use SQL for bulk loading. Instead, use the bulk loading tool provided for your database (e.g., BCP for Sybase / MSSQL Server).

However, if you want to use SQL, there are a few tricks you can use to speed things up:
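For instance, a minimal sketch of the usual DBI speedups (prepare the statement once, use placeholders, and commit in batches rather than per row). The table name, column names, and the in-memory SQLite database are stand-ins purely for illustration:

```perl
use strict;
use warnings;
use DBI;

# Illustration only: an in-memory SQLite database stands in for the real server.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1, AutoCommit => 0 });

$dbh->do("CREATE TABLE log (f1 TEXT, f2 TEXT, f3 TEXT)");

# Prepare once, outside the loop; placeholders avoid re-parsing the SQL per row.
my $sth = $dbh->prepare("INSERT INTO log (f1, f2, f3) VALUES (?, ?, ?)");

my $count = 0;
for my $line (map { "a$_:b$_:c$_" } 1 .. 1200) {   # stand-in for log lines
    $sth->execute(split /:/, $line);
    $dbh->commit unless ++$count % 500;            # commit in batches of 500
}
$dbh->commit;                                      # flush the final partial batch

my ($rows) = $dbh->selectrow_array("SELECT COUNT(*) FROM log");
print "$rows\n";   # 1200
```

Batching the commits keeps the transaction log small while still avoiding a round-trip commit per row.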

I hope you find this helpful...

...roboticus

Re^2: How to Do multiples SQL Insert at one time?
by MelaOS (Beadle) on Jan 08, 2008 at 05:55 UTC
    Hi Guys, great advice, really appreciate it.
    As I don't really need everything in Perl and my main concern is speed, I will try to do the bulk insert using BCP first. Again, thanks!!
      If you are using MS SQL Server 2K another option might be to generate a Data Transformation Services (DTS) script to introduce a temp table into the database, then use that temp table for the insert. This DTS script can be saved and used from Perl. You can delete this temp table with a T-SQL command like:
      if exists(select 1 from INFORMATION_SCHEMA.tables where table_name = 'Table_with_data') DROP TABLE Table_with_data;
Re^2: How to Do multiples SQL Insert at one time?
by nikosv (Deacon) on Jan 08, 2008 at 06:40 UTC
    I totally agree. It is better to drop the index first (turn the table into a heap) and then do the inserts, thus avoiding the overhead of updating the indexes. Then, assuming that you are reading from the log file and that each line has the format $line="value1:value2:value3", you can use placeholders:

    open(FILE, "logfile.txt") || die;
    while ($line = <FILE>) {
        ($field1, $field2, $field3) = split(/:/, $line);
        $sth = $dbh->prepare('INSERT INTO tablename (fieldname1, fieldname2, fieldname3) VALUES (?,?,?)');
        $sth->execute($field1, $field2, $field3);
    }
    $sth->finish();
    As roboticus points out, it might be better to group the inserts and commit, say, every 500 records. You might also check for errors when executing the statement and roll back the transaction when one occurs.
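The batch-commit-with-rollback idea can be sketched as follows. The table, the deliberately bad row, and the in-memory SQLite database are all hypothetical, chosen only so the rollback path actually fires:

```perl
use strict;
use warnings;
use DBI;

# Illustration only: in-memory SQLite; RaiseError makes a failed execute die.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1, AutoCommit => 0, PrintError => 0 });
$dbh->do("CREATE TABLE log (f1 TEXT NOT NULL)");

my $sth = $dbh->prepare("INSERT INTO log (f1) VALUES (?)");

my $n = 0;
eval {
    for my $val ("a", "b", undef, "d") {   # undef violates NOT NULL and dies
        $sth->execute($val);
        $dbh->commit unless ++$n % 2;      # commit every 2 rows
    }
    $dbh->commit;                          # flush the last partial batch
};
if ($@) {
    warn "insert failed, rolling back uncommitted batch\n";
    $dbh->rollback;                        # only the uncommitted batch is lost
}

my ($rows) = $dbh->selectrow_array("SELECT COUNT(*) FROM log");
print "$rows\n";   # 2: the committed batch survives the failure
```

Only the rows since the last commit are lost on error, so the batch size trades speed against how much work a failure can undo.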

      I hope you don't really use code like this. If you want DBI to bulk-load faster, you should also write faster code: that prepare should be outside the loop.

      open my $log, "<", $logfile or die "$logfile: $!";
      my $sth = $dbh->prepare("insert into foo values (?,?,?)");
      while (<$log>) {
          chomp;
          $sth->execute(split m/:/, $_);
      }
      $sth->finish;
      $dbh->commit;

      Enjoy, Have FUN! H.Merijn

        Alternatively, let someone else separate preparation and execution!

        use DBIx::Simple;
        my $db = ...;
        while (<$log>) {
            $db->query('INSERT INTO foo VALUES (??)', split /:/);
        }
        This prepares the query only once.

        Juerd # { site => 'juerd.nl', do_not_use => 'spamtrap', perl6_server => 'feather' }