in reply to Perl DBI (sybase) rapidly delete large number of records
My guess is that you don't have a covering index for the id column of your table.
That's easy to fix: create an index on the table (preferably a unique index) that has 'id' as one of its keys.
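As a quick client-side illustration (Python's stdlib sqlite3 standing in for Sybase here; the table and index names are made up), an index whose leading key matches the column in your WHERE clause lets the server locate rows without a table scan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table rep_queues_archive (id integer, sample_date text)")

# A unique index with 'id' as the leading key lets lookups and deletes
# by id avoid a full table scan.
conn.execute("create unique index idx_rqa_id on rep_queues_archive (id)")

# Ask the optimizer how it would resolve an id lookup; the plan should
# name the index rather than a scan.
plan = conn.execute(
    "explain query plan select * from rep_queues_archive where id = 42"
).fetchall()
print(plan)
```

The same idea applies in ASE: check with `set showplan on` that your delete's WHERE clause is resolved via the index.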
The thing you do need to worry about when putting the entire delete in a single transaction is that you may fill up your transaction log. 20,000 rows isn't that big, and neither is a 200k-row table, but let's cover it anyway:
Say you have a query that easily fills up the transaction log of the Sybase ASE server:
delete from rep_queues_archive where sample_date < dateadd(wk, -1, getdate())
How would you break up the transaction so it doesn't fill up the log? There are several ways to do it; here is one of them:
declare @rows_affected int
declare @err int
declare @dbName varchar(50)

select @dbName = db_name()

set rowcount 1000
select @rows_affected = 1

while @rows_affected > 0
begin
    begin tran
    delete from rep_queues_archive
        where sample_date < dateadd(wk, -1, getdate())
    -- capture both in one statement: each statement resets @@rowcount/@@error
    select @rows_affected = @@rowcount, @err = @@error

    if @err = 0
    begin
        commit tran
        dump tran @dbName with truncate_only
    end
    else
        rollback tran
end
set rowcount 0
Of course, if you are performing incremental backups, you will want to change the dump tran @dbName with truncate_only line to read something like dump tran @dbName to "/sybdumps/mydb_tran.dmp". Modify to suit your naming standard.
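Since the original question is about Perl DBI, the same batching idea can also be driven from the client: delete a fixed-size chunk, commit, and repeat until nothing matches, so no single transaction grows unbounded. Here is a rough Python sketch using stdlib sqlite3 in place of DBI/Sybase (the table, column, and batch size are assumptions, and the rowid subselect replaces ASE's "set rowcount"):

```python
import sqlite3

BATCH_SIZE = 1000  # analogous to "set rowcount 1000"

conn = sqlite3.connect(":memory:")
conn.execute(
    "create table rep_queues_archive (id integer primary key, sample_date text)"
)
conn.executemany(
    "insert into rep_queues_archive (sample_date) values (?)",
    [("2007-01-01",)] * 2500,
)
conn.commit()

deleted = 0
while True:
    # Delete at most BATCH_SIZE matching rows per transaction; the rowid
    # subselect works on a stock SQLite build (no DELETE ... LIMIT needed).
    cur = conn.execute(
        "delete from rep_queues_archive where rowid in ("
        "  select rowid from rep_queues_archive"
        "  where sample_date < ? limit ?)",
        ("2007-06-01", BATCH_SIZE),
    )
    conn.commit()  # commit each batch so the log footprint stays small
    if cur.rowcount == 0:
        break
    deleted += cur.rowcount

print(deleted)  # 2500
```

The equivalent from Perl DBI would be a loop around $dbh->do with a per-batch commit; the important part is that each iteration is its own small transaction.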
HTH,
Jason L. Froebe
Help find a cure for breast cancer! Net proceeds benefit the Susan G. Komen Breast Cancer Foundation and the National Philanthropic Trust. Help by donating - I'm walking 60 miles in 3 days in August 2007. (The day I return from TechWave is the first day of the Walk).