Solved

Log shipping; T-SQL; SQL 2000

Posted on 2007-04-04
659 Views
Last Modified: 2012-06-21
Can Log shipping be done through T-SQL instead of the wizard?  Any examples?
Question by:yanci1179
6 Comments
 
LVL 6

Assisted Solution

by:DocGyver
DocGyver earned 500 total points
ID: 18852030
The best approach I've seen to date is the one found here

http://www.sqlservercentral.com/columnists/ckempster/customlogshipping.asp

I have made modifications to the scripts he provides and can give a great deal more hands-on help implementing them.  Look it over and see if that seems to do what you want.  If so then I can give you some quick-start instructions.
 
LVL 9

Expert Comment

by:dan_neal
ID: 18852305
I've used the following article just recently to configure my servers.

http://www.sql-server-performance.com/sql_server_log_shipping.asp

The only error I found in the article pertaining to SQL 2000 is that you can't use both DBO_ONLY and single user in the restore scripts.  I took out the DBO_ONLY and it's working great.
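For illustration only (this is not the article's actual script, and the database name and paths are placeholders), a SQL 2000 log restore into read-only standby mode looks something like this:

-- Apply a transaction log backup, leaving the database read-only in standby mode
-- (no DBO_ONLY needed).  Database name and file paths are placeholders.
RESTORE LOG mydb
    FROM DISK = 'D:\Backups\mydb_trn.bak'
    WITH STANDBY = 'D:\Backups\mydb_standby.rdo'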
 

Author Comment

by:yanci1179
ID: 18858084
Hi DocGyver,
Can you give the quick-start instructions that you referred to?  Thanks for your help!!
 
LVL 6

Expert Comment

by:DocGyver
ID: 18859838
Quick is, of course, relative, so don't be overwhelmed by what needs to be done.
Not sure how much space is allowed in a single post, so I'll break it up into major chunks.

There are a few things that are not strictly necessary for log shipping to work but that you should consider.

#1  Is the location of the files (mdf, ldf, ...) the same on both boxes?  If not, the RESTORE will have to have MOVE options to make sure things end up in the right place.  The scripts handle this, but it is something to keep in mind.

Create a list of all files by going into Enterprise Manager, right-clicking on your database and choosing Properties.  Under the Data Files and Transaction Log tabs will be a File Name column with associated Locations.  You will need these if the Source and Destination machines are different, so make a note of each of them.
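If you prefer T-SQL over Enterprise Manager, the same information can be read with sp_helpfile (run it in the database you plan to ship; "mydb" is just the placeholder used below):

-- Lists the logical names and physical paths of the database's files
USE mydb
EXEC sp_helpfile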

#2 Are the DB users associated with logins based on SQL or Windows Authentication?
If the former, realize you will have to run sp_change_users_login to reassociate your users with the correct logins.  If they are Windows Auth then all is well.
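As a rough example (the user and login names here are placeholders), the reassociation on the destination looks like this:

-- Report database users whose SIDs no longer match a login on this server
EXEC sp_change_users_login 'Report'
-- Re-map one orphaned database user to the matching SQL login
EXEC sp_change_users_login 'Update_One', 'myuser', 'mylogin'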

#3 You will be storing backups, so you need to consider how much space you have and where that space is relative to both the Source and Destination machines.  The scripts support compressing the backup, which is handy if the two machines are separated by a slow pipe.
At a minimum you need enough space for two full backups plus space for all the transaction log backups that are done between the two.  A good rough estimate is 3x the total size of the database.  This can be reduced to enough space for 1 full backup if you already have backups for disaster recovery.

Now is a good time to suggest that you should go ahead and download the scripts from the link above if you have not already done so.  Once you have them you need to:

Unzip the download, and then unzip the zip file found inside it into the c:\scripts directory on both Source and Destination.
Then open up QA and run the server and destination scripts against the appropriate servers.  Make sure you are pointing to the master db at the time; the scripts should do a "use master" at the beginning, but they don't.  Wish I had a way of sharing my modified versions easily.

With everything set up, my next post will get you on your way to log shipping.
 
LVL 6

Accepted Solution

by:DocGyver
DocGyver earned 500 total points
ID: 18860106
Something else to know is that the procs execute xp_cmdshell, so the user that runs the jobs will need to be a sysadmin, and since the routines write to a UNC path the SQL Server service has to be running as a domain user (rather than the LocalSystem account).  That user will of course have to have read and write permissions on the share/folder(s) used by the procs.
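A quick way to double-check who already has sysadmin rights before setting up the jobs:

-- List current members of the sysadmin fixed server role
EXEC sp_helpsrvrolemember 'sysadmin'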

On to the next steps...

Step 1:
Create the database on the destination machine with the same name as the source.  It doesn't have to have anything in it but it does have to exist.

Below, wherever you see the string "mydb" you should update it for the appropriate database.  Sometimes it is used as part of a file name and other times as the actual name of the database to back up/restore.
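A minimal sketch of Step 1 (the logical names and file paths are placeholders; the first full restore will replace the files anyway):

-- Create an empty shell database on the destination with the same name as the source
CREATE DATABASE mydb
    ON (NAME = 'MyDB_Data', FILENAME = 'D:\Data\MyDB_Data.MDF')
    LOG ON (NAME = 'MyDB_Log', FILENAME = 'D:\Logs\MyDB_Log.LDF')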

Step 2:
Create a SQL Job to run the full backup on a reasonable schedule.  Remember that if you have to recover from scratch you will need to keep the full backup and all transaction logs until the next full backup is complete.

/*
SQL Step for job on source server to create a full backup

Keep in mind available storage when choosing how many days
of backup files to keep

parameters:
      backup_type,
      database_name,
      source for clean up command,
      how many days of backup files to keep,
      source of zip/compression command,
      compress or not (0=no, 1=yes),
      destination for backup file,
      destination 2 (for extra copy),
      email to send notices to,
      notify for non-error events
*/
exec DBBackup_sp 'full', 'mydb', 'c:\scripts', 4, 'c:\scripts', 0, '\\dbasebackup\Backups\', null, null, 'N'
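Since the original question was about doing this in T-SQL instead of the wizard, here is a rough sketch of creating the job itself through the msdb job procs rather than through Enterprise Manager.  The job name, schedule, and server name are examples, not required values:

-- Create a SQL Agent job that runs the full-backup step nightly at 01:00
EXEC msdb.dbo.sp_add_job @job_name = 'LogShip - mydb full backup'
EXEC msdb.dbo.sp_add_jobstep @job_name = 'LogShip - mydb full backup',
    @step_name = 'Full backup', @subsystem = 'TSQL', @database_name = 'master',
    @command = 'exec DBBackup_sp ''full'', ''mydb'', ''c:\scripts'', 4, ''c:\scripts'', 0, ''\\dbasebackup\Backups\'', null, null, ''N'''
EXEC msdb.dbo.sp_add_jobschedule @job_name = 'LogShip - mydb full backup',
    @name = 'Nightly', @freq_type = 4, @freq_interval = 1, @active_start_time = 10000
EXEC msdb.dbo.sp_add_jobserver @job_name = 'LogShip - mydb full backup', @server_name = '(local)'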



Step 3:
Create a job to create your transaction log backups.  (You are running in Full recovery mode, right?  A quick check is shown after the job step below.)
You should schedule this fairly frequently to keep the logs small and to reduce latency.
--
--SQL Step for job on source server to create transaction log backups
exec DBBackup_sp 'log', 'mydb', 'c:\scripts', 8, 'c:\scripts', 0, '\\dbasebackup\Backups\', null, null, 'N'
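To verify (and if needed set) the recovery model on the source, something like this works on SQL 2000; 'mydb' is the same placeholder as above:

-- Check the current recovery model of the source database
SELECT DATABASEPROPERTYEX('mydb', 'Recovery')
-- Switch to full recovery if it is not already there
ALTER DATABASE mydb SET RECOVERY FULL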


Step 4:
Initialize the destination machine with its first restore of the full backup done in Step 2
/*
Once one full backup exists for the target machine to read, the following is run to initialize the database on the target machine.  The database will have to exist for this to work.
Be sure to update the MOVE command file names to point to data files belonging to the target database 'mydb'.
To find the logical names, use RESTORE FILELISTONLY.
If the source and destination files are in the exact same locations, the MOVE options below can be replaced with a single occurrence of "STATS=5".
*/

exec usp_LogShipping_Init
    'mydb'
    ,'d:\FileShare\'
    ,'d:\FileShare\mydb_standby.rdo'
    ,'mydb_'
    ,'.bak*'
    ,'_full.bak'
    ,'_dif.bak'
    ,'_trn.bak'
    ,'
    MOVE ''MyDB_Data'' TO ''D:\Data\MyDB_Data.MDF'',
    MOVE ''MyDB_Log'' TO ''D:\Logs\MyDB_Log.LDF''
    '
    ,'c:\scripts'
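If you are unsure what logical names to put in the MOVE clauses, they can be read straight from the full backup file (the file name here is a placeholder; the scripts generate their own names):

-- List the logical and physical file names contained in a backup file
RESTORE FILELISTONLY FROM DISK = 'd:\FileShare\mydb_full.bak'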


Step 5:
Create a SQL job on the destination to run at the same frequency as the log backup job on the Source machine.
/*
SQL Step for job on destination/target server to load backups and apply transaction logs
Be sure to update MOVE command file names to point to data files belonging to the target database
*/

exec usp_LogShipping_Continue
    'mydb'
    ,'d:\FileShare\'
    ,'d:\FileShare\mydb_standby.rdo'
    ,'mydb_'
    ,'.bak*'
    ,'_full.bak'
    ,'_dif.bak'
    ,'_trn.bak'
    ,'
    MOVE ''MyDB_Data'' TO ''D:\Data\MyDB_Data.MDF'',
    MOVE ''MyDB_Log'' TO ''D:\Logs\MyDB_Log.LDF''
    '
    ,'c:\scripts'
--end of target machine job step


That should do it.  I've set up many a machine with this.

Doc..
 

Author Comment

by:yanci1179
ID: 18875708
Wow, you really did give some detail to this one.  Thanks!!