Solved

log shipping; T-SQL--SQL 2000

Posted on 2007-04-04
668 Views
Last Modified: 2012-06-21
Can Log shipping be done through T-SQL instead of the wizard?  Any examples?
Question by:yanci1179
6 Comments
 
LVL 6

Assisted Solution

by:DocGyver
DocGyver earned 500 total points
ID: 18852030
The best approach I've seen to date is the one found here

http://www.sqlservercentral.com/columnists/ckempster/customlogshipping.asp

I have made modifications to the scripts he provides and can give a great deal more hands-on help implementing them.  Look it over and see if it seems to do what you want.  If so, I can give you some quick-start instructions.
 
LVL 9

Expert Comment

by:dan_neal
ID: 18852305
I've used the following article just recently to configure my servers.

http://www.sql-server-performance.com/sql_server_log_shipping.asp

The only error I found in the article pertaining to SQL 2000 is that you can't use both DBO_ONLY and single-user mode in the restore scripts.  I took out DBO_ONLY and it is working great.
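For reference, a restore step along the lines the article describes might look like this on SQL 2000. This is only a sketch; the paths and file names are placeholders, and per the fix above DBO_ONLY is simply left out rather than combined with single-user mode:

```sql
-- Hypothetical restore step (paths and file names are placeholders).
-- WITH STANDBY keeps the database readable between log restores;
-- do not combine DBO_ONLY with single-user mode in the same script.
RESTORE DATABASE mydb
    FROM DISK = 'd:\FileShare\mydb_full.bak'
    WITH STANDBY = 'd:\FileShare\mydb_standby.rdo', REPLACE

RESTORE LOG mydb
    FROM DISK = 'd:\FileShare\mydb_trn.bak'
    WITH STANDBY = 'd:\FileShare\mydb_standby.rdo'
```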
 

Author Comment

by:yanci1179
ID: 18858084
Hi DocGyver,
can you give the quick-start instructions that you referred to?  thanks for your help!!

 
LVL 6

Expert Comment

by:DocGyver
ID: 18859838
Quick is of course relative, so don't be overwhelmed by what needs to be done.
I'm not sure how much space is allowed in a single post, so I'll break it up into major chunks.

There are a few things that are not strictly necessary for log shipping to work but that you should consider.

#1  Is the location of the files (mdf, ldf, ...) the same on both boxes?  If not, the RESTORE will need MOVE options to make sure everything ends up in the right place.  The scripts handle this, but it is something to keep in mind.

Create a list of all files by going into Enterprise Manager, right-clicking your database, and choosing Properties.  The Data Files and Transaction Log tabs show a File Name column with associated Locations.  You will need these if the locations on the Source and Destination machines differ, so make a note of each.
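As a sketch of point #1, a restore that relocates files might look like this. The logical and physical names here are placeholders; substitute the ones you noted in Enterprise Manager:

```sql
-- Sketch only: relocate data/log files during restore when the
-- destination's paths differ from the source's.
RESTORE DATABASE mydb
    FROM DISK = '\\dbasebackup\Backups\mydb_full.bak'
    WITH MOVE 'MyDB_Data' TO 'D:\Data\MyDB_Data.MDF',
         MOVE 'MyDB_Log'  TO 'D:\Logs\MyDB_Log.LDF',
         NORECOVERY
```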

#2 Are the DB users associated with logins based on SQL Authentication or Windows Authentication?
If the former, realize you will have to run sp_change_users_login to reassociate your users with the correct logins.  If they are Windows Auth then all is well.
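For point #2, the reassociation on the destination side looks roughly like this ('myuser' is a placeholder login/user name):

```sql
-- List orphaned users in the restored database
USE mydb
EXEC sp_change_users_login 'Report'

-- Re-link a database user to the matching SQL login
EXEC sp_change_users_login 'Update_One', 'myuser', 'myuser'
```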

#3 You will be storing backups, so you need to consider how much space you have and where that space is relative to both the Source and Destination machines.  The scripts support compressing the backup, which is handy if the two machines are separated by a slow pipe.
At a minimum you need enough space for two full backups plus all the transaction log backups taken between the two.  A good rough estimate is 3x the total size of the database.  This can be reduced to enough space for one full backup if you already have backups for disaster recovery.

Now is a good time to suggest that you should go ahead and download the scripts from the link above if you have not already done so.  Once you have them you need to:

Unzip the download, then unzip the zip file found inside it into the c:\scripts directory on both Source and Destination.
Then open Query Analyzer and run the server and destination scripts against the appropriate servers.  Make sure you are pointed at the master db at the time; the scripts should do a "use master" at the beginning but don't.  I wish I had an easy way of sharing my modified versions.

With everything setup my next post will get you on your way to logshipping.
 
LVL 6

Accepted Solution

by:DocGyver
DocGyver earned 500 total points
ID: 18860106
Something else to know: the procs execute xp_cmdshell, so the user that runs the jobs will need to be a sysadmin, and since the routines write to a UNC path the SQL Server service has to run as a user account (rather than the LocalSystem account).  That user will of course need read and write permissions on the share/folder(s) used by the procs.
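A quick way to sanity-check both requirements at once is to list the share from within SQL Server (the UNC path is the one used later in the job steps):

```sql
-- If this lists the share contents, the service account can reach the
-- UNC path and the caller has rights to run xp_cmdshell (sysadmin).
EXEC master..xp_cmdshell 'dir \\dbasebackup\Backups\'
```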

On to the next steps...

Step 1:
Create the database on the destination machine with the same name as the source.  It doesn't have to have anything in it but it does have to exist.

Below, wherever you see the string "mydb", substitute the appropriate database name.  Sometimes it is used as part of a file name and other times as the actual name of the database to back up/restore.
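Step 1 can be as simple as the following; the database is just a placeholder, since the full restore in Step 4 overwrites it (the MOVE options there put the files where they belong):

```sql
-- Minimal placeholder database on the destination server.
-- Contents don't matter; the log-shipping restore replaces it.
CREATE DATABASE mydb
```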

Step 2:
Create a SQL Job to run the full backup on a reasonable schedule.  Remember that if you have to recover from scratch you will need to keep the full backup and all transaction logs until the next full backup is complete.

/*
SQL Step for job on source server to create a full backup

Keep in mind available storage when choosing how many days
of backup files to keep

parameters:
      backup_type,
      database_name,
      source for clean up command,
      how many days of backup files to keep,
      source of zip/compression command,
      compress or not (0=no, 1=yes),
      destination for backup file,
      destination 2 (for extra copy),
      email to send notices to,
      notify for non-error events
*/
DBBackup_sp 'full', 'mydb', 'c:\scripts', 4, 'c:\scripts', 0, '\\dbasebackup\Backups\',null, null, 'N'



Step 3:
Create a job to take your transaction log backups.  (You are running in Full Recovery mode, right?)
You should schedule this fairly frequently to keep the logs small and to reduce latency.
--
--SQL Step for job on source server to create transaction log backups
DBBackup_sp 'log', 'mydb', 'c:\scripts', 8, 'c:\scripts', 0, '\\dbasebackup\Backups\',null, null, 'N'
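If you are not sure about the recovery model, on SQL 2000 you can check and set it like so:

```sql
-- Check the current recovery model ('FULL' is what log shipping needs)
SELECT DATABASEPROPERTYEX('mydb', 'Recovery')

-- Switch to full recovery if necessary
ALTER DATABASE mydb SET RECOVERY FULL
```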


Step 4:
Initialize the destination machine with its first restore of the full backup done in Step 2
/*
Once one full backup exists for the target machine to read, run the following to initialize the database on the target machine.  The database has to exist for this to work.
Be sure to update the MOVE command file names to point to data files belonging to the target database 'mydb'.
To find the logical names, use RESTORE FILELISTONLY.
If the files on the source and destination live in exactly the same locations, the MOVE options below can be replaced with a single occurrence of "STATS=5".
*/

exec usp_LogShipping_Init
    'mydb'
    ,'d:\FileShare\'
    ,'d:\FileShare\mydb_standby.rdo'
    ,'mydb_'
    ,'.bak*'
    ,'_full.bak'
    ,'_dif.bak'
    ,'_trn.bak'
    ,'
    MOVE ''MyDB_Data'' TO ''D:\Data\MyDB_Data.MDF'',
    MOVE ''MyDB_Log'' TO ''D:\Logs\MyDB_Log.LDF''
    '
    ,'c:\scripts'
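To find the logical file names used in the MOVE clauses above, run RESTORE FILELISTONLY against the full backup (the path is a placeholder):

```sql
-- Lists the logical and physical file names stored in the backup
RESTORE FILELISTONLY
    FROM DISK = 'd:\FileShare\mydb_full.bak'
```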


Step 5:
Create a SQL job on the destination to run at the same frequency as the log backup job on the Source machine.
/*
SQL Step for job on destination/target server to load backups and apply transaction logs
Be sure to update MOVE command file names to point to data files belonging to the target database
*/

exec usp_LogShipping_Continue
    'mydb'
    ,'d:\FileShare\'
    ,'d:\FileShare\mydb_standby.rdo'
    ,'mydb_'
    ,'.bak*'
    ,'_full.bak'
    ,'_dif.bak'
    ,'_trn.bak'
    ,'
    MOVE ''MyDB_Data'' TO ''D:\Data\MyDB_Data.MDF'',
    MOVE ''MyDB_Log'' TO ''D:\Logs\MyDB_Log.LDF''
    '
    ,'c:\scripts'
--end of target machine job step


That should do it.  I've set up many a machine with this.

Doc..
 

Author Comment

by:yanci1179
ID: 18875708
wow you really did give some detail to this one.  thanks!!
