Solved

SQL index and maintenance plan

Posted on 2014-04-15 · 588 Views · Last Modified: 2014-04-29
Hi Experts,
I know something about SQL but I am not a Guru.
So I have a question about indexing with SQL databases.
After every weekend our log file is very big. It grows up to 80GB.
The DB is nearly 100GB.

During the week the log backups work fine and the log file is no more than 1% full.
After the weekend the log file is 80GB again, but only 1% full.
I have seen we have a maintenance job running which recreates the indexes.

What can I do to avoid such big log files?
How do bigger companies handle this?
Question by:Eprs_Admin
11 Comments
 
LVL 142

Assisted Solution

by:Guy Hengel [angelIII / a3]
Guy Hengel [angelIII / a3] earned 100 total points
ID: 40001154
if you have the standard index maintenance plan, you should indeed disable it (or tune it as discussed here: http://www.mssqltips.com/sqlservertip/3100/reduce-time-for-sql-server-index-rebuilds-and-update-statistics/, e.g. by playing with the MAXDOP setting) ...

usually, the index rebuild task does this for ALL tables/indexes, which is often complete overkill, especially if you have large tables where rows are not often "updated", but only inserted or occasionally deleted.
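One common way to act on this advice (not from the original comment; the 5%/30% thresholds and the 1000-page cutoff are illustrative, roughly following Microsoft's guidance) is to query sys.dm_db_index_physical_stats and only touch indexes that are actually fragmented:

```sql
-- Sketch: find indexes worth maintaining instead of rebuilding everything.
-- Thresholds and the page_count cutoff are illustrative; adjust to taste.
SELECT  OBJECT_NAME(s.object_id)            AS table_name,
        i.name                              AS index_name,
        s.avg_fragmentation_in_percent,
        CASE WHEN s.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
             WHEN s.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
        END                                 AS suggested_action
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN    sys.indexes AS i
        ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE   s.avg_fragmentation_in_percent > 5
        AND s.page_count > 1000          -- ignore tiny indexes
        AND i.name IS NOT NULL;          -- skip heaps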
 
LVL 13

Expert Comment

by:sameer2010
ID: 40001218
Hi,

Is this maintenance run online or offline? If offline, I would suggest taking a backup, disabling logging, and then running the utilities/maintenance jobs. If online, I would suggest considering rebuild over reorg, or performing a log backup as soon as the log gets full so that the size stays under control.

Thanks,
Sam
 
LVL 48

Accepted Solution

by:
PortletPaul earned 100 total points
ID: 40001410
>>"How do bigger companies handle this?"

As stated by angelIII, rebuilding all indexes is probably overkill.

I have seen several of the experts participating here at EE recommend the following:

SQL Server Backup, Integrity Check, and Index and Statistics Maintenance
The SQL Server Maintenance Solution comprises scripts for running backups, integrity checks, and index and statistics maintenance on all editions of Microsoft SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Server 2014. The solution is based on stored procedures, the sqlcmd utility, and SQL Server Agent jobs. I designed the solution for the most mission-critical environments, and it is used in many organizations around the world. Numerous SQL Server community experts recommend the SQL Server Maintenance Solution, which has been a Gold winner in the 2013, 2012, 2011, and 2010 SQL Server Magazine Awards. The SQL Server Maintenance Solution is free.
Ola Hallengren
http://ola.hallengren.com/

This conference video is a useful introduction as Ola explains his approach and answers questions: recommended video

also refer to
http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/Q_28163719.html#a39265474


Alternatively, others have recommended the "Brent Ozar Unlimited" scripts for index maintenance (sp_BlitzIndex).
http://www.brentozar.com/blitzindex/
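For reference, a typical call to Ola Hallengren's IndexOptimize procedure looks something like this. The parameter values shown are illustrative, not from this thread; check the documentation on his site for the current parameter list:

```sql
-- Illustrative: rebuild/reorganize only what is fragmented, per Ola Hallengren's
-- documented thresholds. Values here are examples, not recommendations.
EXECUTE dbo.IndexOptimize
    @Databases           = 'USER_DATABASES',
    @FragmentationLow    = NULL,                     -- below 5%: do nothing
    @FragmentationMedium = 'INDEX_REORGANIZE',       -- 5-30%: reorganize
    @FragmentationHigh   = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationLevel1 = 5,
    @FragmentationLevel2 = 30,
    @UpdateStatistics    = 'ALL',
    @OnlyModifiedStatistics = 'Y';
```

Because only fragmented indexes are touched, the amount of transaction log generated is typically far smaller than with a blanket rebuild-everything maintenance plan.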
 
LVL 8

Assisted Solution

by:ProjectChampion
ProjectChampion earned 300 total points
ID: 40006162
It looks like your transaction log is backed up only once a week, at the weekend. Depending on the type of application that runs on the back of your DB and on the business requirements, you can increase the frequency of that job to make sure the log is backed up on a regular basis, which not only prevents the log file from growing too large, but also improves the chances of recovering your data in a disaster recovery situation.
Again, I emphasize that it depends on the individual case, but it's quite common for databases to be fully backed up on a daily or weekly basis while the transaction log is backed up every few hours or even every few minutes.

As I mentioned, in your case I'd be more worried about the potential data loss in a disaster recovery situation than about the size of the log file. For instance, consider what happens if a disk failure occurs late in the week... you'll lose almost a week's worth of transactions if you rely on a weekly transaction log backup!
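A minimal example of such a recurring log backup, which could be scheduled as a SQL Server Agent job (the database name MyDb and the backup path are placeholders):

```sql
-- Back up the transaction log to a time-stamped file.
-- 'MyDb' and the X:\Backups path are placeholders.
DECLARE @file nvarchar(260) =
    N'X:\Backups\MyDb_'
    + CONVERT(nvarchar(8), GETDATE(), 112)                           -- yyyymmdd
    + N'_' + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', '')  -- hhmmss
    + N'.trn';

BACKUP LOG MyDb TO DISK = @file WITH CHECKSUM;
```

Each successful log backup marks the inactive portion of the log as reusable, which is what keeps the file from growing between backups.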
 

Author Comment

by:Eprs_Admin
ID: 40027361
Our transaction log backup runs every 4 hours.
The file stays at 50GB but is empty, or about 1% used.
After the weekend and the recreation of the indexes it grows to 50GB or more.
What can I do about the indexes and their recreation?
 
LVL 8

Assisted Solution

by:ProjectChampion
ProjectChampion earned 300 total points
ID: 40028985
>>Our transaction log backup runs every 4 hours...
That sounds sensible.

>>The file stays at 50GB but is empty, or about 1% used...
That's not a problem per se, but if you're concerned about the unused space you can schedule a transaction log backup to run immediately after your weekly index rebuild task. That should truncate the log file and recover the unused space... but under certain circumstances truncation may be delayed, so you can run something like the following to try to recover the unused space:
DBCC SHRINKFILE (N'your_log_file' , 0)

So in short, modify your maintenance plan and add a TLog backup followed by a log shrink right after completion of your index rebuild.
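Put together, that final step of the maintenance plan could look like the sketch below. The names are placeholders, and shrinking to a modest target size (here 8192 MB) rather than to 0 avoids the file having to regrow immediately at the next rebuild. Note also the objections to scheduled shrinks raised later in this thread:

```sql
-- Placeholder names: MyDb, MyDb_log, and the backup path are illustrative.
BACKUP LOG MyDb TO DISK = N'X:\Backups\MyDb_post_rebuild.trn';

-- Shrink to a working size (in MB) rather than to 0,
-- so the file does not have to regrow before the next rebuild.
DBCC SHRINKFILE (N'MyDb_log', 8192);
```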
 

Author Comment

by:Eprs_Admin
ID: 40028998
...so it is not a problem to have a transaction log file which is 50GB and 1% used?
 
LVL 8

Assisted Solution

by:ProjectChampion
ProjectChampion earned 300 total points
ID: 40029008
Not at all. There are case studies involving significant unused space in "data" files (usually accompanied by severe physical fragmentation of the data across the disk) that may cause or contribute to performance degradation of both random and bulk reads....

However, due to the nature of read/write operations in a log file (which are basically serial), there are no grounds for the same performance issues as with data files.

So if you're not short on space on the storage, I'd leave the log as is, because as you know it'll require that ~50GB again for the next index rebuild anyway.

The only advantage of shrinking the unused space in this case is that your full backups will probably be smaller, especially if your backups are not compressed.
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 40029293
So in short, modify your maintenance plan and add a TLog backup followed by a log shrink right after completion of your index rebuild.
I would never recommend scheduling a DBCC SHRINKFILE(). All you are doing is causing massive Transaction Log fragmentation. But don't take my word for it: just do it for a short while and then check DBCC LOGINFO(); if you get back more than 50 entries, that is usually a sign of fragmentation.
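The VLF check described here can be as simple as running the (undocumented but widely used) command below in the database in question and counting the rows it returns; MyDb is a placeholder name:

```sql
USE MyDb;      -- placeholder database name
DBCC LOGINFO;  -- returns one row per virtual log file (VLF);
               -- far more than ~50 rows usually indicates log fragmentation
```

A high VLF count typically comes from the log repeatedly growing in small increments, which is exactly what a shrink-then-regrow cycle produces.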
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 40029298
After the weekend and the recreation of the indexes it grows to 50GB or more.
Why are you recreating the indexes? You should only be reindexing/reorganizing the ones that require it. You were given the solution to this earlier in this thread.

What can I do about the indexes and their recreation?
Don't recreate all your indexes.
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 40029304
The only advantage of shrinking the unused space in this case is that your full backups will probably be smaller, especially if your backups are not compressed.
If you are referring to shrinking the Transaction Log, then there is no correlation between doing this and the size of your full backups. In case I was not clear before: there is no reason to schedule a DBCC SHRINKFILE(), ever. If you are doing it, that is (hopefully) because you are unaware of the consequences and don't realize there are far better methods of keeping the Transaction Log size down.
