Solved

MS SQL Server 2005 - In recovery

Posted on 2010-11-19
6
288 Views
Last Modified: 2012-05-10
Hi,
Running a large DELETE statement is putting SQL Server into recovery mode, after which the DB isn't available for a long time (hours).
My understanding is that running large queries clutters the DB buffer space, and when it reaches a certain point the server goes into recovery mode (thus cleaning the buffer) - please correct me if I'm wrong.
The issue is that we are running a website off the DB, so we can't afford the downtime.
What is an appropriate fix for this?
thanks.
0
Comment
Question by:SameerMirza
6 Comments
 
LVL 39

Expert Comment

by:lcohan
ID: 34175315
You must batch up your deletes; I wouldn't go beyond 10,000 rows per batch delete even if your table is partitioned.
Deletes are affected by:

clustered index/PK on the table
foreign keys - pretty big impact
I/O
recovery model of your DB - if possible, SIMPLE would be preferred, but if you run a 24/7 web site against it this is unlikely to be an option.

I need to do a similar thing to purge expired data, and for some tables (100 million plus clients) the batch is no larger than 20-50 records per delete, but it runs silently and pretty fast. Also, if possible, include hints in your delete - I use ROWLOCK.
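For illustration, a batched delete of the kind described above might look like the following sketch (the table name, column name, and cutoff date are hypothetical placeholders - adjust to your schema):

```sql
-- Hypothetical example: purge expired rows in small batches.
-- dbo.Orders and ExpiryDate are placeholder names.
DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000) FROM dbo.Orders WITH (ROWLOCK)
    WHERE ExpiryDate < '2010-01-01';

    -- @@ROWCOUNT is the number of rows the DELETE just removed;
    -- the loop exits when a batch deletes nothing.
    SET @rows = @@ROWCOUNT;
END
```

Each small batch holds locks briefly and writes a modest amount of log, so concurrent web traffic is far less likely to block than with one huge delete.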
0
 

Author Comment

by:SameerMirza
ID: 34175701
The problem is that my log backup disk is running out of space.
I have tens of millions of records to delete. How can you delete them in such small batches?
If that's the best way, could you please post a link on how to do it? :)
Also, more detail on ROWLOCK, please.
Thanks
0
 
LVL 39

Expert Comment

by:lcohan
ID: 34175896
This looks challenging indeed, but let me ask a question:

Is this a purge process you need to set up where the initial volumes are high, and once you catch up the delete volumes are smaller, or
is this something you need to do constantly on such large volumes?

For the first option I can provide a batch example including a CHECKPOINT and an update of statistics, so your log file may stay somewhat under control despite the volumes to delete; however, you will need to test that CAUTIOUSLY in your live environment for speed/impact on your web site UI due to locking/blocking. You could start the deletes with a small batch size and grow it in a SQL job.

For the second option, if you need to constantly delete such volumes, I suggest/recommend partitioning; then you could switch out unwanted large volumes faster with less impact - I use this method for archiving obsolete data by date range or inactive clients.
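As a rough sketch of the switch-out approach (partition number, object names, and boundary value are all hypothetical; the staging table must have an identical schema and live on the same filegroup as the source partition):

```sql
-- Hypothetical: move partition 1 of dbo.Orders into a staging table.
-- SWITCH is a metadata-only operation, so it is nearly instantaneous
-- and generates minimal transaction log, regardless of row count.
ALTER TABLE dbo.Orders
SWITCH PARTITION 1 TO dbo.Orders_Staging;

-- The old rows can then be discarded without logging each delete:
DROP TABLE dbo.Orders_Staging;

-- Optionally remove the now-empty range from the partition function:
ALTER PARTITION FUNCTION pf_OrdersByDate()
MERGE RANGE ('2009-01-01');
```

This is why partitioning suits a recurring high-volume purge: the expensive row-by-row delete is replaced by a metadata swap.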

0
 

Author Comment

by:SameerMirza
ID: 34284230
Disk space issue - the logs are filling up the disk.
0
 
LVL 39

Accepted Solution

by:
lcohan earned 500 total points
ID: 34295173
Is it your DB's t-log that fills where you do the deletes, or tempdb's t-log? I use the command below in all my batch deletes relying on temp tables, and you could include an explicit CHECKPOINT (and even an "UPDATE STATISTICS table_name") after a larger number of rows has been deleted - like 100,000 or 1,000,000 rows:

EXEC sp_executesql N'USE TEMPDB; CHECKPOINT;'
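Putting the pieces together, a purge loop that checkpoints periodically might look like this sketch (object names and thresholds are hypothetical; the CHECKPOINT only allows log reuse if the database uses the SIMPLE recovery model - under FULL recovery you need frequent log backups instead):

```sql
-- Hypothetical purge loop: delete in small batches and issue a
-- CHECKPOINT every ~100,000 rows so log space can be reused (SIMPLE
-- recovery model assumed). dbo.Orders / ExpiryDate are placeholders.
DECLARE @rows INT, @sinceCheckpoint INT;
SET @rows = 1;
SET @sinceCheckpoint = 0;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000) FROM dbo.Orders WITH (ROWLOCK)
    WHERE ExpiryDate < '2010-01-01';

    SET @rows = @@ROWCOUNT;
    SET @sinceCheckpoint = @sinceCheckpoint + @rows;

    IF @sinceCheckpoint >= 100000
    BEGIN
        CHECKPOINT;                    -- flush dirty pages; lets SIMPLE
                                       -- recovery truncate inactive log
        UPDATE STATISTICS dbo.Orders;  -- keep the optimizer current as
                                       -- the data distribution shifts
        SET @sinceCheckpoint = 0;
    END
END
```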
0
 

Author Closing Comment

by:SameerMirza
ID: 34411570
thanks
0
