Bulk insert failed, cannot obtain lock resources

Posted on 2011-03-02
Medium Priority
Last Modified: 2012-05-11
Many times when I execute our SSIS package job to bulk insert data, it returns the errors below.
I have also dropped the clean buffers and procedure cache and set the bulk insert batch size to 1,000 instead of 10,000, but I still get the same errors. The job is actually triggered from one SQL Server, but the target database is on a different SQL Server. Is this error coming from the server where I installed and deployed the package, or from the actual database server?
How do we fix this kind of issue?

Date            3/3/2011 4:39:07 AM
Log            Job History (Import)

Step ID            2
Server            SERVERB
Job Name            Import
Step Name            MportNOw
Duration            00:00:34
Sql Severity            0
Sql Message ID            0
Operator Emailed            
Operator Net sent            
Operator Paged            
Retries Attempted            0

Executed as user: SA. Microsoft (R) SQL Server Execute Package Utility  
Version 9.00.4035.00 for 32-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    
Started:  4:39:07 AM  Error: 2011-03-03 04:39:38.78     Code: 0xC002F304    
Source: DBIMport Bulk Insert Task     Description: An error occurred with the following error message:
"The instance of the SQL Server Database Engine cannot obtain a LOCK resource at this time. Rerun your statement when there are fewer active users.
Ask the database administrator to check the lock and memory configuration for this instance, or to check for long-running transactions.".  
End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  4:39:07 AM  Finished: 4:39:41 AM  Elapsed:  34.203 seconds.  
The package execution failed.  The step failed.
Question by:motioneye
LVL 17

Assisted Solution

dbaSQL earned 600 total points
ID: 35024078
>>> actually this job is triggered from another SQL Server but the actual db is on a different SQL Server; is this error from the server where I installed and deployed the package, or from the actual db server?

It's hard to say where it is actually coming from, but I am guessing it is the actual db server, where the data is being loaded.  Potentially there is another concurrent activity that is tripping the bulk insert.

There are many things you can do to avoid locking.  See here:
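One common way to keep a bulk load from exhausting lock memory (a sketch only; the table and file names below are hypothetical) is to take a single table-level lock with TABLOCK instead of thousands of row locks, and to commit in batches so locks are released as the load progresses:

```sql
BULK INSERT dbo.TargetTable           -- hypothetical target table
FROM 'C:\data\import.dat'             -- hypothetical source file
WITH (
    TABLOCK,           -- one table lock instead of many row/page locks
    BATCHSIZE = 1000   -- commit every 1,000 rows, releasing locks per batch
);
```

In an SSIS Bulk Insert Task the same effect comes from enabling the Table lock option and setting BatchSize on the task.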

And you could run this to try and detect what is causing the conflict:

-- Detect blocking
SELECT blocked_query.session_id  AS blocked_session_id,
       blocking_query.session_id AS blocking_session_id,
       sql_text.text             AS blocked_text,
       sql_btext.text            AS blocking_text,
       waits.wait_type           AS blocking_resource
FROM sys.dm_exec_requests AS blocked_query
INNER JOIN sys.dm_exec_requests AS blocking_query
    ON blocked_query.blocking_session_id = blocking_query.session_id
CROSS APPLY (
    SELECT * FROM sys.dm_exec_sql_text(blocking_query.sql_handle)
) sql_btext
CROSS APPLY (
    SELECT * FROM sys.dm_exec_sql_text(blocked_query.sql_handle)
) sql_text
INNER JOIN sys.dm_os_waiting_tasks AS waits
    ON waits.session_id = blocking_query.session_id

Author Comment

ID: 35024292
There is no blocking during the bulk insert, as no one is accessing this db. I have also set READ_COMMITTED_SNAPSHOT ON, but the job is still failing with the same error message.
LVL 17

Expert Comment

ID: 35024311
LVL 15

Accepted Solution

Aaron Shilo earned 1400 total points
ID: 35024704
Each lock that SQL Server takes out uses a small amount of memory (I cannot remember exactly how much; something like a few KB). The upper limit on the number of locks can be configured to a fixed value, but the default, and usual, setting lets SQL Server dynamically size the limit based on available memory. Check how yours is set: it is listed at the start of the error log, or use server properties or sp_configure.
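As a sketch of that check ('locks' is an advanced option, so it only shows after enabling 'show advanced options'; a run_value of 0 means dynamic lock memory, the default):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Show the current "locks" setting
EXEC sp_configure 'locks';   -- run_value 0 = dynamic (default)

-- To revert a fixed limit back to dynamic:
-- EXEC sp_configure 'locks', 0;
-- RECONFIGURE;
```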

From MSDN:

If you suspect that SQL Server cannot allocate sufficient memory, try the following:

If applications besides SQL Server are consuming resources, try stopping them, or consider running them on a separate server. This will release memory used by other processes for SQL Server.

If you have configured max server memory, increase the max server memory setting.
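For instance, via sp_configure (the 8192 MB value is only an example; size it to your server's RAM):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;  -- example value
RECONFIGURE;
```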

If you suspect that the lock manager has used the maximum amount of available memory, identify the transaction that is holding the most locks and terminate it. The following script identifies the transaction with the most locks:

SELECT request_session_id, COUNT(*) AS num_locks
FROM sys.dm_tran_locks
GROUP BY request_session_id
ORDER BY COUNT(*) DESC

Take the session id with the highest lock count and terminate it using the KILL command.
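For example (session 57 is hypothetical; substitute the top request_session_id from the query above):

```sql
-- Terminate the session holding the most locks
KILL 57;   -- rolls back that session's open transaction and releases its locks
```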


Question has a verified solution.