Solved

How to overcome table locking after applying a trigger to a table

Posted on 2004-08-02
377 Views
Last Modified: 2008-02-26
Hi,
I need some advice on how to overcome a problem after applying a trigger.

I have a stored procedure that downloads data from DB2 into table A record by record (in a loop). I created insert/update/delete triggers on table A so that any transaction on table A is reflected in table B, because I need a real-time mirror between tables A and B.

But now table A is locked during the download (this didn't happen before I applied the trigger), which prevents another application from accessing table A (to retrieve or insert data).

How can I overcome this?
Question by:mantech
11 Comments
 
LVL 9

Expert Comment

by:miron
ID: 11701551
Did you try removing the trigger and setting up transactional replication from table A to table B?
 

Author Comment

by:mantech
ID: 11702720
miron,
How do I set up transactional replication?
 
LVL 9

Expert Comment

by:miron
ID: 11712211
Try using Enterprise Manager. Here are links to the MSDN articles. If you follow the steps described in the articles, you will find that most of the replication setup wizards are self-explanatory.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/howtosql/ht_replctem_85rd.asp?frame=true
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/howtosql/ht_replctem_1fop.asp?frame=true

-- cheers

 

Author Comment

by:mantech
ID: 11712235
Miron,
Miron,
I browsed Enterprise Manager yesterday, but it looks like replication replicates the whole database. FYI, tables A and B are located in the same database.

Are you sure that replication is the right method?
 
LVL 9

Expert Comment

by:miron
ID: 11712358
Absolutely. The database mentioned is either for example purposes or acts as the "publication". The tables are "article(s)", and those are selected one at a time. So you will need to enable the same database for both publication and subscription, and create an article that is published in that same database.
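For reference, the wizard steps roughly correspond to these replication system stored procedures. This is only a sketch; the publication, database, and table names are placeholders, and the exact parameters depend on your distributor setup:

```sql
-- Sketch only: enable the current database for publishing, publish
-- tableA as a single article, and subscribe the same database so the
-- replicated data lands in tableB.
EXEC sp_replicationdboption @dbname = 'MyDB', @optname = 'publish', @value = 'true'

EXEC sp_addpublication @publication = 'TableA_pub',
     @repl_freq = 'continuous', @status = 'active'

EXEC sp_addarticle @publication = 'TableA_pub', @article = 'tableA',
     @source_object = 'tableA', @destination_table = 'tableB'

EXEC sp_addsubscription @publication = 'TableA_pub', @article = 'all',
     @subscriber = @@SERVERNAME, @destination_db = 'MyDB',
     @subscription_type = 'push'
```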

-- cheers
 
LVL 34

Expert Comment

by:arbert
ID: 11728224
I don't understand the entire original question.  Do you still have a link to DB2, or did you import the data into SQL Server and you're using it there?

I'm not positive that replication is the answer, especially transactional replication; you'll possibly run into the same problems that you did with your trigger.

Post your trigger code as well....
 

Author Comment

by:mantech
ID: 11732462
Arbert,
Arbert,
I import the data into SQL Server using DTS into temp tables. Then, from the temp tables, I insert the data into table A (actually a few more tables are involved) after massaging the data.

My intention is to replicate the data to table B because I want to make table B an archive table. I will maintain table A with only 2 weeks of data retention, but table B will keep up to 6 months of data. This is because most users want to access data from within the last 2 weeks, but in certain cases they need to retrieve up to 6 months of data.

The idea behind this architecture is to let the majority of users access the data fast (because table A will only contain 2 weeks of data).

Both tables should carry the same latest data, because the application (which users use to access the data) will point straight to table B if the selected date is more than 2 weeks back.

BTW, I attach the code:

CREATE TRIGGER InsertA ON tableA
FOR INSERT
AS
begin
      SET NOCOUNT ON
      -- mirror newly inserted rows into the archive table
      insert into tableB
      select * from inserted
end
GO

CREATE TRIGGER DeleteA ON tableA
FOR DELETE
AS
begin
      SET NOCOUNT ON
      -- remove the matching rows from the archive table
      delete from tableB where a in (
      select a from deleted)
end
GO

CREATE TRIGGER UpdateA ON tableA
FOR UPDATE
AS
begin
      SET NOCOUNT ON
      -- apply the update as a delete of the old rows
      -- followed by a re-insert of the new ones
      delete from tableB where a in (
      select a from deleted)

      insert into tableB
      select * from inserted
end
GO
 
LVL 34

Expert Comment

by:arbert
ID: 11732702
How much data are you talking about?  It sounds like a lot of effort when SQL Server should be able to handle your volume of data in one table.  I would look at partitioning and let SQL Server handle everything on its own.  Here is an overview of partitioning:

http://www.sqlteam.com/Item.ASP?ItemID=684


What problems are you seeing that make you keep only 2 weeks "online"?

Brett
 

Author Comment

by:mantech
ID: 11733355
Yes, it involves an average of 62,220 records per table for 2 weeks of data. Around 100 tables are involved. These tables store my production line transaction details.

Partitioning seems to be a good idea to speed up queries against the table, but I don't see how it will help me replicate the data to another table. FYI, each of the tables mentioned above has its own "archive" table, so every transaction that happens in these tables should affect the "archive" tables too.

Besides, the partitioning method would require me to modify all the download scripts in the SP. For example: currently the statement just inserts the records into table A, but with partitioning I would need to add a few more insert statements for tables A1, A2, A3 and so on.

Any more ideas?
 
LVL 9

Accepted Solution

by:
miron earned 200 total points
ID: 11737285
Well, replication seems like the most natural solution to me. Make sure it is non-updatable and one-way, and the "replicated" table should be the one carrying the last two weeks of data.

To reduce the locking, you need a clustered index on an identity column (ordered ascending) so that the latest records are appended at the end of the table while index seeks are still supported.
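A minimal sketch of that table layout (the table and column names here are examples only, not your actual schema):

```sql
-- Example only: an ascending IDENTITY column with a clustered primary key,
-- so new rows are always appended at the end of the table instead of
-- causing page splits in the middle.
CREATE TABLE tableA_example (
    id      int IDENTITY(1,1) NOT NULL,
    a       int          NOT NULL,   -- business key used by the triggers
    payload varchar(100) NULL,
    CONSTRAINT PK_tableA_example PRIMARY KEY CLUSTERED (id)
)
```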

-- cheers.
 
LVL 34

Expert Comment

by:arbert
ID: 11739488
"query to the table but I don't see that it will help me to replicate the data to another table."

I guess I don't understand what you're doing then. Above you said you were creating the second table for users to query against. With partitioning, you wouldn't need to create the second table/db.

"Besides, the partition method will require me to modify all the download script in SP. For example: if currently the statement is just to insert the records to table A, but when partition method is used, I need to add few more insert statement to the table A1, A2, A3 and so on."

Not a big change.  You create a view on top of the tables and insert into the view; the check constraints on the tables will route the inserts into the appropriate tables.
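A minimal sketch of such a partitioned view (all table names, column names, and date ranges here are made up for illustration):

```sql
-- Two member tables, each CHECK-constrained to one month; the partitioning
-- column must be part of the primary key for the view to be updatable.
CREATE TABLE SalesJan (
    SaleID   int      NOT NULL,
    SaleDate datetime NOT NULL
        CHECK (SaleDate >= '20040101' AND SaleDate < '20040201'),
    Amount   money    NOT NULL,
    CONSTRAINT PK_SalesJan PRIMARY KEY (SaleID, SaleDate)
)

CREATE TABLE SalesFeb (
    SaleID   int      NOT NULL,
    SaleDate datetime NOT NULL
        CHECK (SaleDate >= '20040201' AND SaleDate < '20040301'),
    Amount   money    NOT NULL,
    CONSTRAINT PK_SalesFeb PRIMARY KEY (SaleID, SaleDate)
)
GO

-- The view unions the members; an insert into the view is routed to the
-- member table whose CHECK constraint matches the row's SaleDate.
CREATE VIEW Sales AS
SELECT SaleID, SaleDate, Amount FROM SalesJan
UNION ALL
SELECT SaleID, SaleDate, Amount FROM SalesFeb
GO

INSERT INTO Sales (SaleID, SaleDate, Amount)
VALUES (1, '20040215', 10.00)   -- lands in SalesFeb
```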

"well, the replication seems like the most natural solution to me. Make sure it is non - updateabe, one way, and the "replicated" table should be the one carrying the last two weeks of data."

If the data has to be real-time in the second database/table, you're adding just as much overhead as people querying off the original set of data; I think the good would be negated by the additional overhead of the replication.  I also would never "waste" a clustered index on an identity column (unless it's a heavily used primary/foreign key); there are other ways to reduce contention based on natural keys....
