You could delete them in chunks instead of one at a time.
my @deletable_ids    = qw( 1 2 3 );
my $delete_at_a_time = 100;

while ( @deletable_ids ) {
    my @delete_now;
    if ( scalar @deletable_ids > $delete_at_a_time ) {
        @delete_now = splice @deletable_ids, 0, $delete_at_a_time;
    }
    else {
        @delete_now    = @deletable_ids;
        @deletable_ids = ();
    }
    my $in_clause = join q{,}, @delete_now;
    $dbh->do( "DELETE FROM t WHERE blah IN ($in_clause)" );
}
You may not be able to shove all of your ids into one DELETE, but with some trial and error you can figure out how many you can delete at once.
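If you'd rather not manage the splice bookkeeping yourself, the same chunking can be done with natatime from List::MoreUtils. This is just a minimal sketch, assuming that module is installed and that @deletable_ids, $delete_at_a_time, and $dbh are set up as above:

use List::MoreUtils qw( natatime );

my $it = natatime $delete_at_a_time, @deletable_ids;
while ( my @delete_now = $it->() ) {
    # build the comma-separated list for IN (...) and delete this chunk
    my $in_clause = join q{,}, @delete_now;
    $dbh->do( "DELETE FROM t WHERE blah IN ($in_clause)" );
}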
Update: If you want to make use of parameters (to get quoting, for instance), you can make the end of the loop look like this:
my $qmarks = join q{,}, map { '?' } @delete_now;
$dbh->do( "DELETE FROM t WHERE blah IN ($qmarks)", undef, @delete_now );
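For completeness, here is one way the placeholder version might slot into the whole loop. It's only a sketch under the same assumptions as above (a connected $dbh, and "t" and "blah" as stand-in table and column names); it leans on the fact that splice happily returns a short final chunk, so the if/else isn't needed, and it uses DBI's prepare_cached so the statement for the full-sized chunks is only prepared once:

while ( @deletable_ids ) {
    # splice returns whatever is left once fewer than $delete_at_a_time remain
    my @delete_now = splice @deletable_ids, 0, $delete_at_a_time;

    # one placeholder per id in this chunk
    my $qmarks = join q{,}, ('?') x @delete_now;

    my $sth = $dbh->prepare_cached( "DELETE FROM t WHERE blah IN ($qmarks)" );
    $sth->execute( @delete_now );
}

With a chunk size of 100, at most two distinct statements ever get prepared: the 100-placeholder one, plus possibly one shorter one for the final chunk.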
In reply to Re: Perl DBI (sybase) rapidly delete large number of records by kyle
in thread Perl DBI (sybase) rapidly delete large number of records by simm42