Solved

Database restore

Posted on 2016-09-26
8
77 Views
Last Modified: 2016-10-20
We have a centralized database (SQL Server 2000) where data gets consolidated from distributed databases (databases on remote branches - SQL Server 2008). Two of the remote branches have failed because their database servers crashed. The most recent backup is corrupt due to some issues, so we have restored a backup from a couple of days earlier, and to bring those databases up to the current state we are applying the data transfer files that those branches had sent to the centralized database. This process is taking a long time because the data transfer files are very large in number.

All the database servers meet the hardware requirements for normal operations, but applying these files to the databases is slow; at the current rate we estimate it will take around 36 hours. Our customer will not accept that much downtime for those branches, as they are among the most critical branches for their business operations. Is there any way we can expedite the reading and applying of the files for those databases?
0
Comment
Question by:A D
8 Comments
 
LVL 46

Assisted Solution

by:Vitor Montalvão
Vitor Montalvão earned 125 total points
ID: 41815547
It will help if you share with us the import process.
Are you using BCP, SSIS, Linked Servers, a 3rd party application, ...?
Are there any validation or transformation tasks during the process?
Do the destination tables have triggers enabled?
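For the trigger question, a quick check along these lines (just a sketch, assuming the destination databases are on SQL Server 2005 or later; on SQL Server 2000 you would query sysobjects with type = 'TR' instead) lists which tables have triggers and whether they are enabled:

```sql
-- Sketch only: enabled/disabled DML triggers on tables in the current database.
SELECT
    OBJECT_NAME(t.parent_id) AS table_name,
    t.name                   AS trigger_name,
    t.is_disabled
FROM sys.triggers AS t
WHERE t.parent_class_desc = 'OBJECT_OR_COLUMN'
ORDER BY table_name, trigger_name;
```

If heavy triggers show up on the destination tables, that is usually the first place to look for the slowdown.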
1
 

Author Comment

by:A D
ID: 41815555
We are using an in-house VB utility (developed around 10 years back) which runs a few validations, such as checking the transfer partner id in a table. Yes, the destination databases have triggers too. Not exactly sure what they do.
0
 
LVL 46

Assisted Solution

by:Vitor Montalvão
Vitor Montalvão earned 125 total points
ID: 41815563
It would be worth reviewing the VB code. I guess it isn't tuned for SQL Server 2008.
Also check what the triggers do. They may perform validations and run stored procedures that take time to execute.
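To see what they do, something like this (a sketch, SQL Server 2005+ syntax; the table name in the DISABLE/ENABLE part is a placeholder) pulls the trigger bodies for review. Disabling triggers for the duration of a controlled load is only safe if whatever they enforce can be checked or replayed afterwards:

```sql
-- Sketch: dump the definition of every DML trigger so you can review
-- what validations / stored procedure calls they perform.
SELECT
    OBJECT_NAME(t.parent_id)       AS table_name,
    t.name                         AS trigger_name,
    OBJECT_DEFINITION(t.object_id) AS trigger_body
FROM sys.triggers AS t
WHERE t.parent_class_desc = 'OBJECT_OR_COLUMN';

-- Only if the trigger logic can safely be skipped or replayed later:
-- disable around the import and re-enable afterwards.
-- dbo.SomeDestinationTable is a placeholder name.
DISABLE TRIGGER ALL ON dbo.SomeDestinationTable;
-- ... run the import ...
ENABLE TRIGGER ALL ON dbo.SomeDestinationTable;
```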
1
 
LVL 26

Accepted Solution

by:Zberteoc
Zberteoc earned 250 total points
ID: 41816109
Simply put, there is nothing you can do at this point. If you try to tweak the import application or the SQL code you may very well make mistakes and compromise the whole data set, and it may end up taking you even more than 36 hours. I would let the process finish.

Going further, I would recommend that you upgrade the centralized server; being a 2000 version, it could be a performance problem in itself. I know it's a long shot, but you can't wait for a disaster to happen before taking measures. There is also a need for some form of high availability configuration, or at least better maintenance plans, to avoid this kind of situation with a failure AND backup corruption.
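As a concrete example on the maintenance side (a sketch only, for the SQL Server 2008 branch servers, since SQL Server 2000 does not support WITH CHECKSUM; the database name and path are placeholders), taking backups with checksums and verifying the file afterwards makes it much less likely that a corrupt backup is only discovered at restore time:

```sql
-- Sketch: back up with page checksums, then verify the backup file.
BACKUP DATABASE YourBranchDb
TO DISK = N'D:\Backups\YourBranchDb.bak'
WITH CHECKSUM, INIT;

RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\YourBranchDb.bak'
WITH CHECKSUM;
```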
1
 
LVL 35

Assisted Solution

by:David Todd
David Todd earned 125 total points
ID: 41816872
Hi,

I guess the question is: can the data import be batched, or does the catch-up need to happen in one go? Can you put the catch-up off to the weekend, when you'll have 48 hours or more?
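If batching is an option, the idea is roughly the sketch below. It is only an illustration in T-SQL; the real work is done by your VB utility, and the staging and destination table names here are made up. The point is that a watermark lets the catch-up be paused and resumed instead of running as one 36-hour block:

```sql
-- Illustration only: apply pending transfer rows in chunks.
-- dbo.TransferStaging and dbo.BranchTable are hypothetical names.
DECLARE @lastId INT = 0,
        @nextId INT;

WHILE 1 = 1
BEGIN
    -- Upper bound of the next batch of roughly 5000 rows.
    SELECT @nextId = MAX(b.TransferId)
    FROM (SELECT TOP (5000) TransferId
          FROM dbo.TransferStaging
          WHERE TransferId > @lastId
          ORDER BY TransferId) AS b;

    IF @nextId IS NULL
        BREAK;  -- nothing left to apply

    BEGIN TRANSACTION;

    INSERT INTO dbo.BranchTable (TransferId, TransferData)
    SELECT TransferId, TransferData
    FROM dbo.TransferStaging
    WHERE TransferId > @lastId
      AND TransferId <= @nextId;

    COMMIT TRANSACTION;

    -- Advance the watermark so a later run resumes from here.
    SET @lastId = @nextId;
END
```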

I find that using Ola Hallengren's maintenance scripts and running the integrity job before the backup is a good way to find corruption before it ends up in the backup.
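The underlying idea, as a rough T-SQL sketch (Ola's scripts wrap this with logging, scheduling and per-database handling; the database name and backup path below are placeholders, and TRY/CATCH needs SQL Server 2005 or later):

```sql
-- Sketch: run the integrity check first; only take the backup if it succeeds.
BEGIN TRY
    DBCC CHECKDB (YourBranchDb) WITH NO_INFOMSGS;

    BACKUP DATABASE YourBranchDb
    TO DISK = N'D:\Backups\YourBranchDb_full.bak'
    WITH INIT;
END TRY
BEGIN CATCH
    -- Corruption (or any other error): skip the backup and surface the error.
    PRINT 'Integrity check or backup failed: ' + ERROR_MESSAGE();
END CATCH;
```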

HTH
  David

PS Time to upgrade off SQL 2000!
0
 
LVL 46

Expert Comment

by:Vitor Montalvão
ID: 41837933
AD, any feedback will be appreciated.
Cheers
0
 

Author Comment

by:A D
ID: 41852009
Hi,

We have proposed upgrading SQL Server and have also passed most of the other inputs above to management. They all involve cost and will therefore take time, so for now we are living with the available workarounds.

Thanks a lot for all the inputs.
0
 

Author Closing Comment

by:A D
ID: 41852019
Thanks for all the inputs. During the post-incident review we used snippets from the comments you provided and presented a case to our management. As most of the suggested options involve cost, they may take time, but management is certainly happy with the inputs and looked positive about the proposed changes. Hopefully the situation will improve in the near future.

Thanks a lot again for your help :)
0
