arthrex asked on

A buffer failed while allocating 10485104 bytes; The system reports 98 percent memory load

Dear Experts,

I'm running an SSIS package with two data flow tasks.
One task fails from time to time with this error message:
Description: A buffer failed while allocating 10485104 bytes.  
End Error  Error: 2010-03-06 03:47:55.76     Code: 0xC0047011    
 Source: Aggregate EH, SKO, BE 2006 DTS.Pipeline    
 Description: The system reports 98 percent memory load.
 There are 4294660096 bytes of physical memory with 78651392 bytes free.
 There are 2147352576 bytes of virtual memory with 101441536 bytes free.
 The paging file has 6268792832 bytes with 2157301760 bytes free.  
 End Error  Error: 2010-03-06 03:47:55.76     Code: 0xC0047022    
 Source: Aggregate EH, SKO, BE 2006 DTS.Pipeline    
 Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  
 The ProcessInput method on component "Union All" (510) failed with
 error code 0x8007000E. Th...  The package execution fa...  The step failed.

The task aggregates a huge amount of data, then unions it with two very small tables (the "Union All" component).
That's all.
I have read a lot about DefaultBufferSize and DefaultBufferMaxRows, but in the end I couldn't work out how
to solve my error by changing these settings.
I increased DefaultBufferSize to 20970208, but the error showed up again.
Is decreasing it the solution?


It is a 32-bit Windows Server 2003 machine with 4 GB RAM, running SQL Server 2005.

I've also logged "BufferSizeTuning":
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:42 AM,3/9/2010 3:46:42 AM,0,0x,Rows in buffer type 0 would cause a buffer size greater than the configured maximum. There will be only 3343 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:42 AM,3/9/2010 3:46:42 AM,0,0x,Rows in buffer type 4 would cause a buffer size greater than the configured maximum. There will be only 3995 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:42 AM,3/9/2010 3:46:42 AM,0,0x,Rows in buffer type 10 would cause a buffer size greater than the configured maximum. There will be only 9165 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:42 AM,3/9/2010 3:46:42 AM,0,0x,Rows in buffer type 18 would cause a buffer size greater than the configured maximum. There will be only 7181 rows in buffers of this type.
PackageStart,COGNOSKFD,ARTHREX\user,KfdData2Staging_SalesFactsAggr,{38531CB0-1F49-4A6C-AE49-C2DB37F5C630},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:42 AM,3/9/2010 3:46:42 AM,0,0x,Beginning of package execution.

BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 0 would cause a buffer size greater than the configured maximum. There will be only 3343 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 4 would cause a buffer size greater than the configured maximum. There will be only 3995 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 10 would cause a buffer size greater than the configured maximum. There will be only 9165 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 18 would cause a buffer size greater than the configured maximum. There will be only 7181 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 0 would cause a buffer size greater than the configured maximum. There will be only 3343 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 4 would cause a buffer size greater than the configured maximum. There will be only 3995 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 10 would cause a buffer size greater than the configured maximum. There will be only 9165 rows in buffers of this type.
BufferSizeTuning,COGNOSKFD,ARTHREX\user,Aggregate EH, SKO, BE 2006,{FE01230D-545D-488A-A08F-2E01FC6BC83A},{8480C0BD-5F15-48FF-A762-0027DFC0F3C7},3/9/2010 3:46:43 AM,3/9/2010 3:46:43 AM,0,0x,Rows in buffer type 18 would cause a buffer size greater than the configured maximum. There will be only 7181 rows in buffers of this type.

Thanks for your help!!
Steve Hogg

On your data flow task, check the properties. BufferTempStoragePath: assign a path where the server can use file storage to help with its buffering needs.
HoggZilla
rrjegan17
>> The task aggregates a huge amount of data, then unions it with two very small tables

Is there any alternative logic that could be used for this part of your query?
This aggregation would definitely impact your system memory..
arthrex (Asker)

Thanks for your answers.
@HoggZilla: I read about BufferTempStoragePath. Is my understanding correct:
as long as the property is not set, the job writes to the temp folder of the user under whom the job is running?
This is normally under "Documents and Settings"...
So if I now set the path to a folder on C: (C: has more than 100 GB free), this would amount to the same thing, correct?
Because the "Documents and Settings" folder is on C: too.
So the error could occur again, am I right?

@rrjegan17: I could aggregate in two steps and split the aggregation into two data flow parts.
Do you think that would help? I would like to avoid this anyway, though.

Another question:
I'm not sure where the problem generally comes from. The server has 4 GB RAM. Is that not enough? Is it because of 32-bit, where one process can only get 2 GB? Or is it the DefaultBufferSize setting that causes the trouble?


ASKER CERTIFIED SOLUTION
Raja Jegan R
arthrex (Asker)

Thanks rrjegan17.

Am I able to enable AWE on SQL Server 2005 Standard Edition? I just read about it on the Internet, and it seems to me as if this is only possible in Enterprise Edition.

>> "whether you have enabled 3GB and PAE switches"
Is it just this?
sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'awe enabled', 1
RECONFIGURE
GO

Do I need to change these too?
sp_configure 'min server memory', 1024   -- value in MB
RECONFIGURE
GO
sp_configure 'max server memory', 6144   -- value in MB
RECONFIGURE
GO



>> " kindly check whether you have enabled 3GB and PAE switches in your boot.ini file "
I just need to add that > "/PAE" at the end of the boot.ini file correct?

Anything else?
rrjegan17
Oops.. you have only 4 GB of RAM, and hence the /PAE switch is not required..
Just add /3GB at the end of the line in your boot.ini file, which should suffice.
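
For reference, a rough sketch of what the modified entry in boot.ini might look like (the ARC path and existing switches on your machine will differ, so treat this only as an illustration):

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003, Standard" /fastdetect /3GB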

>> sp_configure 'max server memory', 6144

Seems like this is the culprit..
You have only 4 GB RAM but told SQL Server to take 6 GB of memory for itself by configuring max server memory as 6 GB, and hence you might face this issue..

Just set it to 3 GB and leave 1 GB aside for the OS and other processes on the system.

And restart your entire machine and check out the performance..
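
In case it helps, a minimal T-SQL sketch for setting this (the value is given in MB, so 3 GB = 3072; adjust to your own numbers):

sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'max server memory', 3072   -- value in MB (3 GB)
RECONFIGURE
GO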
arthrex (Asker)

Hi rrjegan17,

thanks for your answer.
OK, I set the /3GB switch in boot.ini.
No, so far 'max server memory' was set to 2 GB, so this wasn't the issue. I was just asking whether I need to run this statement in general. Sorry for the 6 GB, which was confusing.

OK, I'll set 'max server memory' to 3 GB. Is it also possible to set 3.5? Or is it not possible to give SQL Server more, since I set the /3GB flag in boot.ini?
In case we add more RAM to our 32-bit server, would it be possible to give SQL Server more than 3 GB even though it's a 32-bit server?

Where can I see how much memory SQL Server is allowed to use?
I ran this statement

SELECT SUM(virtual_memory_reserved_kb) / 1024 AS [virtual memory allocated, MB],
       SUM(virtual_memory_committed_kb) / 1024 AS [virtual memory committed, MB],
       SUM(awe_allocated_kb) / 1024 AS [AWE allocated, Mb]
FROM sys.dm_os_memory_clerks

and get this result:

virtual memory allocated, MB: 2610
virtual memory committed, MB: 43
AWE allocated, Mb: 2783

When I run this:

EXECUTE sp_configure 'awe enabled'
GO

I get:

name         minimum   maximum   config_value   run_value
awe enabled  0         1         1              1
rrjegan17
>> OK, I'll set 'max server memory' to 3 GB. Is it also possible to set 3.5?

Fine.. Setting it to 3 GB allows SQL Server to utilize up to 3 GB, leaving the remaining 1 GB for the system processes..
The OS should be given at least 1 to 1.5 GB for its overall performance, and hence 3 GB should be fine..

OK.. Now restart your server, since you have modified the boot.ini file, and test it out..
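
To double-check what SQL Server is currently allowed to use, here is a quick sketch against sys.configurations (the memory options are reported in MB):

SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)', 'awe enabled')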
arthrex (Asker)

Thanks for your help!
Works now!
rrjegan17
Welcome..
And glad to help you out..