in reply to Perl DBI (sybase) rapidly delete large number of records

You could delete them in chunks instead of one at a time.

my @deletable_ids    = qw( 1 2 3 );
my $delete_at_a_time = 100;

while ( @deletable_ids ) {
    my @delete_now;
    if ( scalar @deletable_ids > $delete_at_a_time ) {
        @delete_now = splice @deletable_ids, 0, $delete_at_a_time;
    }
    else {
        @delete_now    = @deletable_ids;
        @deletable_ids = ();
    }
    my $in_clause = join q{,}, @delete_now;
    $dbh->do( "DELETE FROM t WHERE blah IN ($in_clause)" );
}

You may not be able to shove all of your ids into one DELETE, but with some trial and error you can figure out how many you can delete at once.

Update: If you want to make use of parameters (to get quoting, for instance), you can make the end of the loop look like this:

my $qmarks = join q{,}, map { '?' } @delete_now;
$dbh->do( "DELETE FROM t WHERE blah IN ($qmarks)", undef, @delete_now );

Replies are listed 'Best First'.
Re^2: Perl DBI (sybase) rapidly delete large number of records
by jhourcle (Prior) on May 07, 2007 at 16:23 UTC

    You actually don't need the if/else block: when the requested length runs past the end of the array, splice just returns all of the remaining elements. I use the following:

    (I probably don't need to rebuild the SQL statement on each iteration of the loop, only on the first and last ones, but it's rare that I delete more than 100 at a time.)

    while ( @delete ) {
        my @files = splice( @delete, 0, 100 );
        my $sth_delete = $db->prepare(
            'DELETE FROM SECCHI WHERE filepath IN ('
            . join( ',', ('?') x @files )
            . ')'
        );
        $sth_delete->execute( @files ) or warn "Couldn't delete files";
    }
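    A minimal sketch of the splice behavior this relies on, with made-up data: when the requested length exceeds what remains in the array, splice simply returns the leftover elements, so the final short chunk needs no special case.

    ```perl
    # Drain an array in chunks of 3; the last splice returns only
    # the single remaining element instead of failing.
    my @ids = ( 1 .. 7 );
    my @chunks;
    while ( @ids ) {
        push @chunks, [ splice( @ids, 0, 3 ) ];
    }
    # @chunks now holds [1,2,3], [4,5,6], [7] and @ids is empty.
    ```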