DELETE from table: index strategy

Posted on 2014-02-06
Last Modified: 2016-02-10

I have a query that looks like this:

DELETE FROM dbo.fact_EmailActivity
      WHERE wk_EmailActivity IN (SELECT wk_EmailActivity FROM stg_fact_emailActivity_process)

There are several problems: the table has more than 250M rows, 10 columns, 9 nonclustered indexes, and one clustered index (the primary key).

Is it better for me to do the delete based on a join, or to use the subquery?

Also, should I drop all the indexes and recreate them?

One thing I didn't mention is that I execute this inside an SSIS package loop, so I delete 100,000 rows at a time, then get the next 100K.

It's taken over 23 hours, and it looks like it's processing about 1M rows per hour. There are a few other steps in the package, but I'm positive this is the biggest bottleneck. I don't know why the table was designed with so many indexes. However, it seems that even if I dropped the indexes, rebuilding them would take just as long...

Any help to cut this time even in half would be greatly appreciated.
Question by:basile

Assisted Solution by Aneesh Retnakaran (earned 250 points)
> Is it better for me to do a delete based on a join? or use the sub query?
Both methods will have the same performance. You need to make sure there is an index on dbo.fact_EmailActivity.wk_EmailActivity.
> I execute this inside an SSIS package loop, so I delete 100,000 rows at a time, then get the next 100K

Deleting in batches is the best option because it won't hold locks on the table for long. On a heavily used table, deleting millions of rows in a single statement will escalate to a table lock and applications may time out. So I suggest you delete them in batches.
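The batched delete can also be done directly in T-SQL rather than an SSIS loop. This is a minimal sketch using the table names from the question; the batch size of 100,000 matches what the asker is already doing:

```sql
-- Hypothetical sketch of a batched delete. Each DELETE TOP (...)
-- commits on its own, keeping lock duration and log growth small.
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (100000) f
    FROM dbo.fact_EmailActivity AS f
    WHERE f.wk_EmailActivity IN
          (SELECT s.wk_EmailActivity
           FROM dbo.stg_fact_emailActivity_process AS s);

    SET @rows = @@ROWCOUNT;  -- stop once a batch deletes nothing
END;
```

Without the index on wk_EmailActivity, each iteration still scans the fact table, which is consistent with the ~1M rows/hour the asker is seeing.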

If you have to delete the majority of records from that table, you can instead insert the rows you want to keep into a new table and drop the old one. You will need to plan this ahead so that the application won't insert anything into the original table while you do the copy.
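A sketch of that "copy the keepers" approach, assuming the staging table holds the keys to delete (the table name fact_EmailActivity_keep is illustrative):

```sql
-- Hypothetical sketch: copy the rows to keep into a new table,
-- then swap it in place of the original.
SELECT f.*
INTO   dbo.fact_EmailActivity_keep
FROM   dbo.fact_EmailActivity AS f
WHERE  f.wk_EmailActivity NOT IN
       (SELECT s.wk_EmailActivity
        FROM dbo.stg_fact_emailActivity_process AS s);

-- After verifying row counts, swap the tables:
DROP TABLE dbo.fact_EmailActivity;
EXEC sp_rename 'dbo.fact_EmailActivity_keep', 'fact_EmailActivity';
-- Recreate the clustered and nonclustered indexes on the new table here.
```

This writes only the surviving rows once, instead of logging millions of deletes against 10 indexes.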

Accepted Solution by ScottPletcher (earned 250 points)
Definitely create an index on dbo.fact_EmailActivity (wk_EmailActivity), but you must also look at the query plan and verify that it is actually being used for the DELETE.

Also, cluster the table stg_fact_emailActivity on wk_EmailActivity, just in case. It may not help much, but it definitely won't hurt.
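The two suggestions above as DDL; the index names are illustrative, and I'm assuming the staging table is the stg_fact_emailActivity_process from the original query:

```sql
-- Nonclustered index so the DELETE can seek the fact table by key.
CREATE NONCLUSTERED INDEX IX_fact_EmailActivity_wk
    ON dbo.fact_EmailActivity (wk_EmailActivity);

-- Clustered index on the staging table so the lookup side is ordered.
CREATE CLUSTERED INDEX CIX_stg_fact_emailActivity_wk
    ON dbo.stg_fact_emailActivity_process (wk_EmailActivity);
```

After creating these, re-check the estimated plan for the DELETE to confirm an index seek replaced the scan.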

>> I don't know why the table was designed with so many indices. <<

Probably the wrong clustering column. By default an id/identity column is usually used for clustering, and some experts push that idea very strongly. But that often results in vastly worse performance overall: massive extra I/O for standard table joins, and a large number of covering, nonclustered indexes that have to be built on the table.

Author Closing Comment

Thanks for the help, guys. I went a slightly different way.

I did the following for the first historical delete; after that it's a much smaller data set, so the package runs a lot smoother. Something that was taking more than 24 hours to run (at 1M rows per hour, it would have taken 100 hours to complete) now finishes in less than one hour.

1. Create a new table that is structurally identical to the original: fact_EmailActivity_New.
2. Create the clustered index.
3. Insert the data we want to keep (where fk_SentDate > 8036) from fact_EmailActivity into the new table.
4. Create the non-clustered indexes on the new table.
5. After verifying the data in the new table, rename the original table to fact_EmailActivity_old.
6. Rename the new table to fact_EmailActivity.
7. Select the data we want to archive (where fk_SentDate < 8036) into fact_EmailActivity_arc (could also create a recordset in SSIS with the desired rows).
8. Move the data from fact_EmailActivity_arc to EDWArchive (arc_Fact_EmailActivity).
9. Verify the data, then truncate fact_EmailActivity_OLD.
10. Drop fact_EmailActivity_OLD and fact_EmailActivity_arc.
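The rename steps (5 and 6 of the plan above) can be sketched with sp_rename, using the table names from the list:

```sql
-- Hypothetical sketch of the table swap. Wrapped in a transaction so
-- readers never observe a moment where the table name doesn't exist.
BEGIN TRANSACTION;
    EXEC sp_rename 'dbo.fact_EmailActivity',     'fact_EmailActivity_old';
    EXEC sp_rename 'dbo.fact_EmailActivity_New', 'fact_EmailActivity';
COMMIT TRANSACTION;
```

Note that sp_rename does not carry constraints or permissions over to a different name, so anything referencing the table by name (views, synonyms, foreign keys) should be re-verified after the swap.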
