

Bulk insert failed, cannot obtain lock resources

Posted on 2011-03-02
Medium Priority
Last Modified: 2012-05-11
Many times when I execute our SSIS package job to bulk insert data, it returns the errors below.
I have also dropped the clean buffers and the procedure cache, and set the bulk insert batch size to 1,000 instead of 10,000 rows per batch, but I still get the same errors. This job is actually triggered from one SQL Server, but the actual db is on another SQL Server. Is this error from the server where I installed and deployed the package, or from the actual db server?
How do we fix this kind of issue?
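For reference, the batch-size change described above can also be expressed directly in T-SQL. A minimal sketch, where dbo.ImportTarget and the file path are placeholders for illustration:

```sql
-- Smaller batches commit (and release their locks) more often,
-- so fewer locks are held at any one time.
-- dbo.ImportTarget and the path below are placeholder names.
BULK INSERT dbo.ImportTarget
FROM 'C:\data\import.dat'
WITH (
    BATCHSIZE = 1000   -- rows committed per transaction
);
```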

Date            3/3/2011 4:39:07 AM
Log            Job History (Import)

Step ID            2
Server            SERVERB
Job Name            Import
Step Name            MportNOw
Duration            00:00:34
Sql Severity            0
Sql Message ID            0
Operator Emailed            
Operator Net sent            
Operator Paged            
Retries Attempted            0

Executed as user: SA. Microsoft (R) SQL Server Execute Package Utility  
Version 9.00.4035.00 for 32-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    
Started:  4:39:07 AM  Error: 2011-03-03 04:39:38.78     Code: 0xC002F304    
Source: DBIMport Bulk Insert Task     Description: An error occurred with the following error message:
"The instance of the SQL Server Database Engine cannot obtain a LOCK resource at this time. Rerun your statement when there are fewer active users.
Ask the database administrator to check the lock and memory configuration for this instance, or to check for long-running transactions.".  
End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  4:39:07 AM  Finished: 4:39:41 AM  Elapsed:  34.203 seconds.  
The package execution failed.  The step failed.
Question by: motioneye
LVL 17

Assisted Solution

dbaSQL earned 600 total points
ID: 35024078
>>> this job is actually triggered from one SQL Server, but the actual db is on another SQL Server. Is this error from the server where I installed and deployed the package, or from the actual db server?

It's hard to say where it is actually coming from, but I am guessing it is the actual db server, where the data is being loaded.  Potentially some other concurrent activity is tripping the bulk insert.

There are many things you can do to avoid locking.  See here:
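One common way to keep a bulk load from exhausting lock resources is to take a single table-level lock instead of thousands of row or page locks. A sketch, with a placeholder table name and file path:

```sql
-- TABLOCK makes the bulk load hold one table lock for the duration,
-- so the lock manager does not track a lock per row or page.
-- dbo.ImportTarget and the path are placeholders for illustration.
BULK INSERT dbo.ImportTarget
FROM 'C:\data\import.dat'
WITH (TABLOCK);
```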

And you could run this to try and detect what is causing the conflict:

-- Detect blocking
SELECT blocked_query.session_id  AS blocked_session_id,
       blocking_query.session_id AS blocking_session_id,
       sql_text.text             AS blocked_text,
       sql_btext.text            AS blocking_text,
       waits.wait_type           AS blocking_resource
FROM sys.dm_exec_requests AS blocked_query
INNER JOIN sys.dm_exec_requests AS blocking_query
    ON blocked_query.blocking_session_id = blocking_query.session_id
CROSS APPLY (SELECT *
             FROM sys.dm_exec_sql_text(blocking_query.sql_handle)
            ) sql_btext
CROSS APPLY (SELECT *
             FROM sys.dm_exec_sql_text(blocked_query.sql_handle)
            ) sql_text
INNER JOIN sys.dm_os_waiting_tasks AS waits
    ON waits.session_id = blocking_query.session_id

Author Comment

ID: 35024292
There is no blocking during the bulk insert, as no one else is accessing this db. I also set READ_COMMITTED_SNAPSHOT ON, but the job is still failing with the same error message.
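For reference, the setting the author describes is enabled per database, and it only changes how readers behave (which is consistent with it not helping a write-heavy bulk load). A sketch, where MyImportDb is a placeholder database name:

```sql
-- READ_COMMITTED_SNAPSHOT makes readers use row versions instead of
-- shared locks; it does not reduce the locks a bulk insert takes.
-- MyImportDb is a placeholder name.
ALTER DATABASE MyImportDb SET READ_COMMITTED_SNAPSHOT ON;

-- Verify the setting:
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'MyImportDb';
```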
LVL 15

Accepted Solution

Aaron Shilo earned 1400 total points
ID: 35024704
Each lock that SQL Server takes out uses up a small amount of memory (I cannot remember exactly how much - something like a few KB). The upper limit on the number of locks can be configured to a set value, but the default, and usual, setting is to let SQL Server configure the upper limit dynamically depending on the memory available. Check to see how yours is set (it is listed at the start of the errorlog, or use Server Properties or sp_configure).
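The sp_configure check mentioned above can be sketched as follows ('locks' is an advanced option, so it has to be exposed first):

```sql
-- 0 (the default) means SQL Server sizes lock memory dynamically
-- based on available memory; any other value is a fixed upper limit.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'locks';
```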

From MSDN:

If you suspect that SQL Server cannot allocate sufficient memory, try the following:

If applications besides SQL Server are consuming resources, try stopping these applications or consider running them on a separate server. This will release memory from other processes for SQL Server.

If you have configured max server memory, increase the max server memory setting.
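Raising max server memory is also done through sp_configure. A sketch, where 8192 MB is a placeholder value to be sized against the server's actual RAM:

```sql
-- 8192 is a placeholder; set this to a value appropriate for
-- the physical memory on your server.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```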

If you suspect that the lock manager has used the maximum amount of available memory, identify the transaction that is holding the most locks and terminate it. The following script will identify the transaction with the most locks:

SELECT request_session_id, COUNT(*) AS num_locks
FROM sys.dm_tran_locks
GROUP BY request_session_id
ORDER BY COUNT(*) DESC
Take the highest session id, and terminate it using the KILL command.
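Putting that last step together - a sketch, where 53 stands in for whatever session id the sys.dm_tran_locks query reports at the top:

```sql
-- 53 is a placeholder session id; substitute the top
-- request_session_id returned by the query above.
KILL 53;
```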



Question has a verified solution.

