SSIS memory failure

Hi,

I have an SSIS package that runs fine in a LIVE environment (a well-specified SQL Server cluster). I need to run the same package in the UAT environment, but I'm thwarted by a memory error when it gets to a certain point in a series of data flows and SQL updates:
"There is insufficient memory available in the buffer pool".

The UAT environment is a single server and is poorly specified, particularly with respect to RAM, which is just 2GB. I suspect the issue would simply disappear if someone beefed the RAM up to at least 4GB, but does anyone have any tricks or workarounds to give the SSIS package enough resources to run to completion? I have commented on the low RAM in this server before, but nothing has happened, so I could be stuck for some time.

I tried restarting the SQL Server database engine, and Task Manager shows about 1GB of RAM available, but the package still hits the memory error. The SQL Server instance is configured for dynamic memory management, with no upper limit on the amount of RAM it can grab.
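One thing I'm considering, sketched below, is capping the engine's memory so the SSIS runtime (which runs in its own process, outside the buffer pool) has some headroom. I don't know whether this will help, since the error is reported by the engine itself, but it would at least stop the engine and the package fighting over the same 2GB. The 1200 MB figure is purely illustrative for a 2GB box, not a tested value:

-- Cap the database engine's memory so sqlservr.exe leaves RAM
-- for the SSIS process. 1200 MB is an illustrative figure only.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1200;
RECONFIGURE;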

Thanks in advance.
irb56 asked:
geek_vj commented:
>>"There is insufficient memory available in the buffer pool"
This is a clear message that you are running out of memory: the buffer pool cannot hold enough data to service the SSIS package's requests. There are two ways to overcome this situation:

1. Revisit the SSIS package and change it so that it retrieves a smaller amount of data at a time instead of pulling everything in a single shot (see the sketch below).
2. Upgrade the server's RAM.
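A hedged sketch of option 1, with placeholder object names (dbo.SourceTable, Id): drive the OLE DB Source from a key-range query and execute the data flow once per range, for example from a For Loop that advances the starting key via a package variable:

-- Pull one slice of the source table per data flow execution
-- instead of the whole table in a single shot.
-- dbo.SourceTable, Id and the batch size are placeholders.
DECLARE @BatchStart int = 0;
DECLARE @BatchSize  int = 50000;

SELECT *
FROM dbo.SourceTable
WHERE Id >  @BatchStart
  AND Id <= @BatchStart + @BatchSize
ORDER BY Id;

Each pass then needs far fewer data flow buffers, at the cost of more round trips to the source.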
 
irb56 (author) commented:
Thanks for your help. The SSIS package fails on the same data flow task each time, and when I disable that task the package runs all the way to completion. The problem data flow task has three lookup transforms between the OLE DB Source and the OLE DB Destination. I am experimenting with partial cache settings on the lookup transforms, but the default partial cache size (25MB) has a massive performance impact: I gave up after about half an hour of extremely slow progress. I'm now trying a partial cache of 250MB, which might hit the memory error again.
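In case it helps anyone else, here is the kind of trimmed lookup query I'm also experimenting with (the table and column names below are placeholders for my real ones). Pointing each lookup at a narrow query instead of the whole reference table shrinks the cache regardless of cache mode:

-- Cache only the join key and the single column the data flow
-- actually needs, rather than every column of the reference table.
-- dbo.DimCustomer, CustomerKey, CustomerName are placeholder names.
SELECT CustomerKey, CustomerName
FROM dbo.DimCustomer;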

I agree that the best solution is simply to have the server RAM upgraded.

Many thanks.  :-)
 
geek_vj commented:
You are welcome irb56!