Bulk insert failed, cannot obtain lock resources

Many times when I execute our SSIS package job to bulk insert data, it returns the errors below.
I have also dropped the clean buffers and procedure cache, and reduced the bulk insert batch size from 10,000 to 1,000 rows per batch, but I still get the same error. The job is triggered from one SQL Server, but the actual database is on a different SQL Server. Is this error coming from the server where I installed and deployed the package, or from the actual database server?
How do we fix this kind of issue?
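For reference, the SSIS Bulk Insert Task is roughly equivalent to a T-SQL BULK INSERT. A minimal sketch of the reduced-batch-size setup described above (the table and file names are placeholders, not taken from the original job; TABLOCK is a real BULK INSERT option that replaces many row/page locks with a single table lock, which directly reduces lock-memory pressure):

SQL:
-- Hypothetical equivalent of the SSIS Bulk Insert Task.
-- Table and file path are placeholders.
BULK INSERT dbo.ImportTarget
FROM '\\fileserver\exports\data.txt'
WITH (
    BATCHSIZE = 1000,   -- commit every 1,000 rows instead of 10,000
    TABLOCK             -- one table lock instead of many row/page locks
);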

Date            3/3/2011 4:39:07 AM
Log            Job History (Import)

Step ID            2
Server            SERVERB
Job Name            Import
Step Name            MportNOw
Duration            00:00:34
Sql Severity            0
Sql Message ID            0
Operator Emailed            
Operator Net sent            
Operator Paged            
Retries Attempted            0

Executed as user: SA. Microsoft (R) SQL Server Execute Package Utility  
Version 9.00.4035.00 for 32-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    
Started:  4:39:07 AM  Error: 2011-03-03 04:39:38.78     Code: 0xC002F304    
Source: DBIMport Bulk Insert Task     Description: An error occurred with the following error message:
"The instance of the SQL Server Database Engine cannot obtain a LOCK resource at this time. Rerun your statement when there are fewer active users.
Ask the database administrator to check the lock and memory configuration for this instance, or to check for long-running transactions.".  
End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  4:39:07 AM  Finished: 4:39:41 AM  Elapsed:  34.203 seconds.  
The package execution failed.  The step failed.
Aaron Shilo (Chief Database Architect) commented:
Each lock that SQL Server takes out uses a small amount of memory (I don't remember exactly how much; on the order of 100 bytes per lock). The upper limit on the number of locks can be configured to a fixed value, but the default, and usual, setting is to let SQL Server configure the upper limit dynamically depending on the memory available. Check how yours is set (it is listed at the start of the errorlog, or use server properties or sp_configure).
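The lock configuration mentioned above can be checked with sp_configure, for example:

SQL:
-- Show the configured lock limit (run_value of 0 means SQL Server
-- manages the number of locks dynamically, the default).
EXEC sp_configure 'locks';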

From MSDN:

If you suspect that SQL Server cannot allocate sufficient memory, try the following:

If applications besides SQL Server are consuming resources, try stopping these applications or consider running them on a separate server. This will release memory from other processes for SQL Server.

If you have configured max server memory, increase the max server memory setting.
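A sketch of raising that setting via sp_configure (the 8192 MB figure is only an example value, not a recommendation for this server):

SQL:
-- 'max server memory (MB)' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Example value only; size this to the server's actual RAM.
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;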

If you suspect that the lock manager has used the maximum amount of available memory, identify the transaction that is holding the most locks and terminate it. The following script identifies the session with the most locks:

SELECT request_session_id, COUNT(*) AS num_locks
FROM sys.dm_tran_locks
GROUP BY request_session_id
ORDER BY COUNT(*) DESC;
Take the session id with the highest lock count and terminate it using the KILL command.
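For example, if the query above reported that session 53 held the most locks (53 here is purely an example session id; use the one the query actually returns):

SQL:
-- Terminate the session holding the most locks; any open
-- transaction on that session will be rolled back.
KILL 53;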

dbaSQL commented:
>>> actually this job is triggered from one SQL Server but the actual database is on another SQL Server; is this error from the server where I installed and deployed the package, or from the actual db server?

It's hard to say where it is actually coming from, but I am guessing it is the actual database server, where the data is being loaded. Potentially there is other concurrent activity that is tripping up the bulk insert.

There are many things you can do to avoid locking.  See here:

And you could run this to try and detect what is causing the conflict:

-- Detect blocking
SELECT blocked_query.session_id  AS blocked_session_id,
       blocking_query.session_id AS blocking_session_id,
       sql_text.text             AS blocked_text,
       sql_btext.text            AS blocking_text,
       waits.wait_type           AS blocking_resource
FROM sys.dm_exec_requests AS blocked_query
INNER JOIN sys.dm_exec_requests AS blocking_query
    ON blocked_query.blocking_session_id = blocking_query.session_id
CROSS APPLY (SELECT * FROM sys.dm_exec_sql_text(blocking_query.sql_handle)) sql_btext
CROSS APPLY (SELECT * FROM sys.dm_exec_sql_text(blocked_query.sql_handle)) sql_text
INNER JOIN sys.dm_os_waiting_tasks AS waits
    ON waits.session_id = blocking_query.session_id;

motioneye (Author) commented:
There is no blocking during the bulk insert, as no one else accesses this database. I also set READ_COMMITTED_SNAPSHOT to ON, but the job still fails with the same error message.
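Worth noting: READ_COMMITTED_SNAPSHOT changes how readers behave and does not reduce the exclusive locks (and lock memory) a bulk insert itself acquires, which may be why it made no difference here. The setting can be verified per database like this ('YourImportDb' is a placeholder name, not from the original job):

SQL:
-- Check whether read-committed snapshot is enabled for the import database.
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'YourImportDb';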
Question has a verified solution.
