Solved

Choosing The Best Transactional Replication Method

Posted on 2004-08-01
350 Views
Last Modified: 2008-02-01
Question:

I have a GoldMine database in MSSQL.  I want to snapshot just the CONTACT1, CONTACT2 and CONTHIST tables.

These tables will get very large, the CONTACT1 table eventually having over 1,000,000 records.  I will be using another application to access just these three tables for analysis, and I do not want to lock up the live database.  Because these tables are getting quite large, I was thinking of transactional replication; however, the GoldMine tables do not have a primary key, and I do not know how GoldMine would react if I were to add one.

Therefore I was going to use SNAPSHOT publication, but I am concerned about resource load.

We have a beefy server, 4 GB of RAM and four processors; the SNAPSHOT and the HOST database would be on the same server.

So, should I be worried about running SNAPSHOT?

Most of our resource problems come from record locking when the users run their filters and queries from GoldMine.  The server locks up because all the clients are fighting for the same information, and the CPUs on each side (client and server) max out because (I think) the waiting clients keep slamming the server until the client whose queries (extreme filters) are locking the tables finishes.

So, how long would such a SNAPSHOT take, what is a good time interval, and/or am I completely off base and smoking crack?

RAT

Thanks!
Question by:ratzephyr
6 Comments
 
LVL 9

Accepted Solution

by:
crescendo earned 63 total points
ID: 11689991
You would be better off with log shipping.  It sends the transaction logs over, so the amount of data is small compared to a snapshot, and I don't think it insists on a primary key.  You can set the interval to suit your demands.  How up-to-date does the data need to be?  How often is it updated?  That will affect the size of the data to be shipped.  You would need to experiment a bit: start off at every 10 minutes or so, then go up or down and see how that affects performance.
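
In case it helps picture it, a hand-rolled round of log shipping boils down to something like the T-SQL below (a minimal sketch: the database name GoldMine, the reporting copy GoldMine_Report, and the file paths are assumptions, and the reporting copy must first be seeded from a full backup restored WITH STANDBY):

    -- On the live server: back up the transaction log to a file
    BACKUP LOG GoldMine
        TO DISK = 'D:\LogShip\GoldMine_log.trn'

    -- On the reporting copy: apply that log
    -- WITH STANDBY leaves the database readable between restores
    RESTORE LOG GoldMine_Report
        FROM DISK = 'D:\LogShip\GoldMine_log.trn'
        WITH STANDBY = 'D:\LogShip\undo_GoldMine.dat'

Scheduling the pair as SQL Server Agent jobs at whatever interval you settle on gives the same basic effect as the built-in log shipping feature.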
 
LVL 34

Expert Comment

by:arbert
ID: 11690868
I agree... The only thing is that with log shipping you will need the entire database on the second server, not just the three tables you're interested in.
 
LVL 13

Assisted Solution

by:danblake
danblake earned 62 total points
ID: 11693354
So, should I be worried about running SNAPSHOT?
--> The snapshot will lock N records at a time (you can change the N-record lock/time); it is not anything to be too worried about unless your hardware isn't up to scratch (i.e. your disk queue lengths are way above baseline, which is what it sounds like given the locking problems you are experiencing).  It's very similar to bcp.

So, how long would such a SNAPSHOT take, and what is a good time?
--> The snapshot duration will depend on your hardware and on whether you put the output data on the same drives as the other components, but about 1,000,000 records (which is small) should take no longer than an hour.  The transfer across the network would take a lot longer.  If you are distributing this data, the snapshot replication component behaves the same as a bcp.

If this were me: how many hours do you have between 6 pm and 9 am?  That is when I would run the snapshot.  Get a performance baseline to see how it affects your site, then reschedule for daytime hours once you know your snapshot output works; you can then do any tweaking as necessary.
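
The snapshot agent runs as a SQL Server Agent job, so one way to pin it to the quiet hours is to give that job a nightly schedule, something like the sketch below (the job name is an assumption; check SQL Server Agent -> Jobs for the real one after you create the publication):

    -- Hypothetical sketch: schedule the snapshot agent job for 3:00 am daily
    EXEC msdb.dbo.sp_add_jobschedule
        @job_name = 'SERVER-GoldMine-ContactPub-1',  -- assumed agent job name
        @name = 'Nightly snapshot',
        @freq_type = 4,              -- daily
        @freq_interval = 1,          -- every day
        @active_start_time = 030000  -- 3:00 am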

I can understand that you need this data; a bcp out-and-in or a regular database transfer (DTS) would also do the trick.  Like all export processes, these normally take locks, and will also lock the target while the data goes into your new database.
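
For the bcp route, the command line per table is short; a minimal sketch looks like this (the server name LIVESRV and reporting database name Reporting are assumptions; -n is native format, -T is a trusted connection; repeat for CONTACT2 and CONTHIST):

    rem Export from the live database, then load into the reporting copy
    bcp GoldMine.dbo.CONTACT1 out C:\xfer\CONTACT1.dat -n -S LIVESRV -T
    bcp Reporting.dbo.CONTACT1 in C:\xfer\CONTACT1.dat -n -S LIVESRV -T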
 
LVL 4

Expert Comment

by:tbojoe
ID: 14488367
I know this question is quite old, but GoldMine does have a primary key, which is ACCOUNTNO.  It is a little difficult to work with, though, because it uses special characters in its creation.
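
If ACCOUNTNO really is unique, declaring it as a primary key (which transactional replication needs) is a one-liner, though whether GoldMine tolerates the constraint is exactly the open question, so try it on a test copy first:

    -- Untested sketch: run only against a test copy of the database.
    -- Fails if ACCOUNTNO is nullable or contains duplicate values.
    ALTER TABLE dbo.CONTACT1
        ADD CONSTRAINT PK_CONTACT1 PRIMARY KEY (ACCOUNTNO)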