Handling table-level locks for multiple simultaneous inserts (2012)

The code below is in a stored procedure that is called by different users. When #tempTABLE is huge (more than 160k rows) and multiple users run the stored procedure to upload at the same time, we see deadlocks on ProcessControl that cause errors.

Could you suggest best practices that can be applied to handle this gracefully?

    BEGIN TRAN InsertBlockFromTemp
        INSERT INTO [dbo].ProcessControl WITH (ROWLOCK) ( 9 COLUMNS )
        SELECT 9 columns FROM #tempTABLE ORDER BY IDENTITYColumnPK_Column
    COMMIT TRAN InsertBlockFromTemp

Najam Uddin commented:
Are you concerned about dirty reads? Try using the NOLOCK hint or setting the ISOLATION LEVEL to READ UNCOMMITTED.
lcohan (Database Analyst) commented:
Also, do not use explicit transactions in SQL, particularly on a simple insert like this, because if your SELECT takes time to complete, or the inserted record set is huge and takes a while to finish (for instance due to PK/FK constraints, triggers, indexes, etc.), your destination table is locked and nothing can use it until the COMMIT.
In simple words, drop the explicit transaction (BEGIN TRAN...COMMIT TRAN), assuming you are using the default READ COMMITTED isolation level.
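For illustration, here is a minimal sketch of what lcohan describes: a single INSERT ... SELECT is already an atomic statement on its own, so the BEGIN TRAN / COMMIT TRAN wrapper can simply be dropped. The column names are placeholders, since the original post only says "9 columns".

    -- Sketch only: Col1..Col9 are hypothetical names standing in for the real 9 columns.
    -- A single INSERT ... SELECT is its own atomic unit of work, so no explicit transaction is needed.
    INSERT INTO [dbo].ProcessControl (Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8, Col9)
    SELECT Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8, Col9
    FROM #tempTABLE
    ORDER BY IDENTITYColumnPK_Column;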
25112 (Author) commented:
najam: it is the issue lcohan brought up: the destination table is locked. Can't NOLOCK help in this regard?

lcohan: what is the alternative to using explicit transactions (to avoid this issue)?
25112 (Author) commented:
In this regard, am I understanding right that:

1) READ UNCOMMITTED is better than READ COMMITTED? And these are stored-procedure-level isolation levels that do not affect anything outside the procedure, right?

2) Are there any other ISOLATION LEVELs to consider?
Najam Uddin commented:
You will have to reset from READ UNCOMMITTED back to READ COMMITTED after you are done.
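A minimal sketch of that set-and-reset pattern, assuming it sits at the top and bottom of the stored procedure (the procedure name is hypothetical and the body is elided):

    CREATE PROCEDURE dbo.usp_InsertBlockFromTemp  -- hypothetical name
    AS
    BEGIN
        -- Lower the isolation level for this session's work.
        SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

        -- ... the INSERT ... SELECT FROM #tempTABLE goes here ...

        -- Reset to the default before returning, as suggested above.
        SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
    END;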
25112 (Author) commented:
>> You will have to reset from READ UNCOMMITTED back to READ COMMITTED after you are done.
This will not help with the underlying issue, though: after about 5,000 rows, doesn't the row-level lock escalate to a table-level lock?
http://social.technet.microsoft.com/wiki/contents/articles/19870.sql-server-understanding-lock-escalation.aspx
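As a side note not raised in the thread itself: SQL Server 2008 exposes a per-table setting for the escalation behaviour that the linked article describes. A hedged sketch of inspecting and changing it:

    -- Sketch only: DISABLE prevents most escalations to a table lock, but every
    -- row lock still consumes lock memory, so test before relying on it.
    ALTER TABLE dbo.ProcessControl SET (LOCK_ESCALATION = DISABLE);

    -- Check the current setting for the table.
    SELECT name, lock_escalation_desc
    FROM sys.tables
    WHERE name = 'ProcessControl';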
lcohan (Database Analyst) commented:
The alternative is to NOT use explicit transactions, period. I use default SQL isolation levels without losing any data because of that, and without (or with very little) performance impact, and have done so since their inception (on about 1000+ instances, with DB sizes varying from a few GB to a few TB per DB).
Only once in my career, on a newly released SQL 2000 RTM, did we have an issue: SQL had to generate invoice numbers that MUST be unique, and we had to use an explicit transaction to read the last committed invoice number and then generate the next in the series. But that was it.

As far as table hints like NOLOCK go, they DO help, but they are NOT guaranteed, and that depends on multiple factors. Again, well-written SQL code objects do not require explicit transactions the way many other DB engines do.
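For illustration only, this is roughly what the NOLOCK hint under discussion looks like on a reader of the destination table; the column list and filter are placeholders. The reader is not blocked by the uploader's locks, but it may see uncommitted rows, which is why the result is not guaranteed.

    -- Sketch: a reporting query that does not wait on the uploader's locks.
    SELECT Col1, Col2, Col9              -- placeholder column names
    FROM dbo.ProcessControl WITH (NOLOCK)
    WHERE Col1 = 'some value';           -- placeholder filter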
25112 (Author) commented:
>> I use default SQL isolation levels without losing any data
Can you explain this, please?
Do you start a stored procedure by declaring a specific preferred isolation level,
and then revert back to READ COMMITTED before the stored procedure ends?

Thank you.
lcohan (Database Analyst) commented:
READ COMMITTED is the default isolation level for the Microsoft SQL Server Database Engine.

"READ COMMITTED: A query in the current transaction cannot read data modified by another transaction that has not yet committed, thus preventing dirty reads. However, data can still be modified by other transactions between issuing statements within the current transaction, so nonrepeatable reads and phantom reads are still possible. The isolation level uses shared locking or row versioning to prevent dirty reads, depending on whether the READ_COMMITTED_SNAPSHOT database option is enabled. Read Committed is the default isolation level for all SQL Server databases."

So basically, all my SQL code runs under that level, with very few exceptions as mentioned, but never with explicit transactions in MSSQL Server databases. For a lot more details and explanations, please see:

https://www.simple-talk.com/sql/t-sql-programming/questions-about-t-sql-transaction-isolation-levels-you-were-too-shy-to-ask/
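The quoted passage mentions the READ_COMMITTED_SNAPSHOT database option, under which READ COMMITTED readers use row versioning instead of shared locks. As a hedged sketch (the database name is a placeholder, and the thread does not explicitly recommend this), it can be checked and enabled like this:

    -- Check whether the option is already on.
    SELECT name, is_read_committed_snapshot_on
    FROM sys.databases
    WHERE name = 'YourDatabase';         -- placeholder database name

    -- Enable it; the switch needs exclusive access to the database,
    -- hence WITH ROLLBACK IMMEDIATE.
    ALTER DATABASE [YourDatabase] SET READ_COMMITTED_SNAPSHOT ON
        WITH ROLLBACK IMMEDIATE;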
Microsoft SQL Server 2008
