Solved

How to overcome table locking after applying a trigger to a table

Posted on 2004-08-02
Last Modified: 2008-02-26
Hi,
I need some advice on how to overcome a problem caused by a trigger.

I have a stored procedure that downloads data from DB2 into table A record by record (in a loop). I created INSERT/DELETE/UPDATE triggers on table A so that any transaction against table A is applied to table B as well, because I need a real-time mirror of table A in table B.

But now table A is locked during the download (this didn't happen before I applied the triggers), so another application cannot access table A to retrieve or insert data.

How can I overcome this?
Question by:mantech
11 Comments
 
LVL 9

Expert Comment

by:miron
ID: 11701551
Did you try removing the triggers and setting up transactional replication from table A to table B?
 

Author Comment

by:mantech
ID: 11702720
miron,
How do I set up transactional replication?
 
LVL 9

Expert Comment

by:miron
ID: 11712211
Try using Enterprise Manager. Here are links to the MSDN articles. If you follow the steps described in the articles, you will find that the wizards for setting up replication are mostly self-explanatory.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/howtosql/ht_replctem_85rd.asp?frame=true
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/howtosql/ht_replctem_1fop.asp?frame=true
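
If you prefer script to the wizards, here is a rough T-SQL sketch of the publication side (the publication, database, and table names are placeholders, and a real setup also needs the Distributor and the replication agents configured, which the wizards take care of):

-- enable the database for publishing (run once; placeholder db name)
EXEC sp_replicationdboption @dbname = 'mydb', @optname = 'publish', @value = 'true'

-- create a continuous (transactional) publication
EXEC sp_addpublication @publication = 'tableA_pub',
      @repl_freq = 'continuous', @status = 'active'

-- publish tableA as an article, delivered into tableB
EXEC sp_addarticle @publication = 'tableA_pub', @article = 'tableA',
      @source_object = 'tableA', @destination_table = 'tableB'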

-- cheers
 

Author Comment

by:mantech
ID: 11712235
Miron,
I browsed Enterprise Manager yesterday, but it looked to me as if replication replicates the whole database. FYI, tables A and B are located in the same database.

Are you sure that replication is the right method here?
 
LVL 9

Expert Comment

by:miron
ID: 11712358
Absolutely. The database is mentioned either for example purposes or as the "publication" container. The tables themselves are the "article(s)", and those are selected one at a time. So you will need to enable the same database for both publication and subscription, and create an article that is published within that same database.
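
Continuing the sketch above, the subscription back into the same database would look roughly like this (server and database names are placeholders):

-- subscribe the publishing database to its own publication
EXEC sp_addsubscription @publication = 'tableA_pub',
      @subscriber = 'MYSERVER',          -- the local server name
      @destination_db = 'mydb',          -- same database as the publisher
      @subscription_type = 'push', @sync_type = 'none'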

-- cheers
 
LVL 34

Expert Comment

by:arbert
ID: 11728224
I don't understand the entire original question.  Do you still have a link to DB2, or did you import the data into SQL Server and you're working with it there?

I'm not positive that replication is the answer--especially transactional--you'll possibly run into the same problems that you did with your trigger.

Post your trigger code as well....
 

Author Comment

by:mantech
ID: 11732462
Arbert,
I import the data from DB2 into SQL Server using DTS, into temp tables. Then, from the temp tables, I insert the data into table A (actually there are a few more tables involved) after massaging the data.

My intention is to replicate the data to table B because table B will act as an archive table. I will keep only 2 weeks of data in table A, but table B will keep up to 6 months of data. Most users only need data from the last 2 weeks, but in certain cases they need to retrieve up to 6 months of data.

The idea behind this architecture is to let the majority of users access the data quickly (because table A will only hold 2 weeks of data).

Both tables should be current up to the same latest data, because the application (which users use to access the data) points straight to table B if the selected date is more than 2 weeks in the past.
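
For the retention side, the purge I have in mind would run on a schedule, roughly like this (tx_date is a placeholder for the real date column; note that the DELETE trigger below would otherwise cascade the purge into the archive, so it is disabled around that step):

-- trim table A to 2 weeks without emptying the archive
ALTER TABLE tableA DISABLE TRIGGER DeleteA
DELETE FROM tableA WHERE tx_date < DATEADD(week, -2, GETDATE())
ALTER TABLE tableA ENABLE TRIGGER DeleteA

-- trim the archive to 6 months
DELETE FROM tableB WHERE tx_date < DATEADD(month, -6, GETDATE())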

BTW, I attach the trigger code:

CREATE TRIGGER InsertA ON tableA
FOR INSERT
AS
begin
      -- mirror every new row into the archive table
      -- (assumes tableA and tableB have identical column lists)
      insert into tableB
      select * from inserted
end
GO

CREATE TRIGGER DeleteA ON tableA
FOR DELETE
AS
begin
      -- remove the mirrored rows, matched on the key column a
      delete from tableB where a in (
      select a from deleted)
end
GO

CREATE TRIGGER UpdateA ON tableA
FOR UPDATE
AS
begin
      -- an update is mirrored as a delete followed by a re-insert
      delete from tableB where a in (
      select a from deleted)

      insert into tableB
      select * from inserted
end
GO



 
LVL 34

Expert Comment

by:arbert
ID: 11732702
How much data are you talking about?  Sounds like a lot of effort when SQL Server should be able to handle your volume of data in one table.  I would look at partitioning and let SQL Server handle everything on its own.  Here is an overview of partitioning:

http://www.sqlteam.com/Item.ASP?ItemID=684


What problems are you seeing that make you keep only 2 weeks "online"?

Brett
 

Author Comment

by:mantech
ID: 11733355
Yes, it involves an average of 62,220 records per table for 2 weeks of data, and the number of tables involved is around 100. These tables store my production-line transaction details.

Partitioning seems like a good idea to speed up queries against the table, but I don't see how it will help me replicate the data to another table. FYI, each of the tables mentioned above has its own "archive" table, so every transaction against these tables should affect the "archive" tables too.

Besides, the partitioning method would require me to modify all the download scripts in the SPs. For example: currently the statement just inserts the records into table A, but with partitioning I would need to add a few more insert statements for tables A1, A2, A3 and so on.

Any more ideas?
 
LVL 9

Accepted Solution

by:
miron earned 200 total points
ID: 11737285
Well, replication seems like the most natural solution to me. Make sure it is non-updatable and one-way, and the published table should be the one carrying the last two weeks of data.

To reduce the locking, you need a clustered index on an identity column (ordered ascending) so that the latest records are always appended at the end of the table while index seeks are still supported.
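
A minimal sketch of that layout (the column and index names are placeholders):

-- an ascending identity keeps inserts at the end of the table,
-- so readers of older pages are less likely to be blocked
CREATE TABLE tableA (
      id int IDENTITY(1,1) NOT NULL,
      a int NOT NULL,                  -- business key used by the triggers
      payload varchar(100) NULL        -- placeholder for the real columns
)
GO
CREATE UNIQUE CLUSTERED INDEX CIX_tableA_id ON tableA (id ASC)
GO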

-- cheers.
 
LVL 34

Expert Comment

by:arbert
ID: 11739488
"query to the table but I don't see that it will help me to replicate the data to another table."

I guess I don't understand what you're doing then--above you said you were creating the second DB for users to query against.  With partitioning, you wouldn't need to create the second table/db.

"Besides, the partition method will require me to modify all the download script in SP. For example: if currently the statement is just to insert the records to table A, but when partition method is used, I need to add few more insert statement to the table A1, A2, A3 and so on."

Not a big change.  You create a view on top of the tables and insert into the view--the check constraint on the tables will handle the inserts into the appropriate tables.
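
Here is a rough sketch with two hypothetical weekly member tables (all names, the tx_date column, and the date ranges are placeholders; note that SQL Server 2000 requires the partitioning column to be part of each member table's primary key for the view to be updatable):

CREATE TABLE tableA_wk1 (
      a int NOT NULL,
      tx_date datetime NOT NULL CHECK (tx_date >= '20040719' AND tx_date < '20040726'),
      payload varchar(100) NULL,
      PRIMARY KEY (a, tx_date)
)
GO
CREATE TABLE tableA_wk2 (
      a int NOT NULL,
      tx_date datetime NOT NULL CHECK (tx_date >= '20040726' AND tx_date < '20040802'),
      payload varchar(100) NULL,
      PRIMARY KEY (a, tx_date)
)
GO
-- the mutually exclusive CHECK constraints route each insert
-- through the view into the correct member table
CREATE VIEW tableA_v AS
SELECT a, tx_date, payload FROM tableA_wk1
UNION ALL
SELECT a, tx_date, payload FROM tableA_wk2
GO
INSERT INTO tableA_v (a, tx_date, payload) VALUES (1, '20040727', 'widget')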

"well, the replication seems like the most natural solution to me. Make sure it is non - updateabe, one way, and the "replicated" table should be the one carrying the last two weeks of data."

If the data has to be realtime in the second database/table, you're adding just as much overhead as the people querying the original set of data--I think the benefit would be negated by the additional overhead of the replication.  I also would never "waste" a clustered index on an identity column (unless it's a heavily used primary/foreign key)--there are other ways to reduce contention based on natural keys....