VB6 "Out of Memory" error (7) dimensioning public array with plenty of physical memory free

My application throws an "Error 7 - Out of memory" when I dimension an array of 9 million records of a 48-byte user-defined type (i.e. approximately 500 MB). There is plenty (about 3 GB) of physical memory available on the machine, and yet it immediately errors when dimensioning the array. It does not fail, however, at approximately 7 million records.
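For reference, a quick back-of-the-envelope check (using the 48-byte element size and 9 million rows quoted above) puts the array at a little over 400 MB, far less than the free physical memory:

```python
# Rough size of the failing array, using the figures quoted above.
ELEMENT_SIZE = 48         # bytes per bulk_index_row, as stated in the question
ELEMENTS = 9_000_000 + 1  # ReDim aSortIndex(0 To 9000000) gives N+1 elements

total_bytes = ELEMENT_SIZE * ELEMENTS
print(total_bytes)                  # 432000048
print(round(total_bytes / 2**20))   # 412 (MiB)
```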

Machine spec is a dual-processor (~1.7 GHz each) server with 4 GB physical memory and 2-4 GB virtual memory.

I have entered the offending code into a new application (see below) which simply dimensions the same public array. This application will happily dimension the same array with as many as 20 million rows (about 1.5 GB). If I run it on the same machine at the point where the main app fails, it works! I.e. the operating system will allocate THIS app the memory but not the other.

The main application has many public variables, some of them collections with up to a few thousand members. The test app has one public variable. I'm wondering whether there is a limit to the number or size of public variables within VB6?

Any help would be most appreciated! (thanks in advance)

Here is the code in the test app (that works):

' Declarations in a standard (.bas) module
' (a Public UDT array must live in a standard module, not a form):
Const CLILEN = 30
Type bulk_index_row
  stUser As String * CLILEN       ' all fixed size
  lDate As Long                   ' value YYMMDD
  lTime As Long                   ' value HHnnSS
  iFlag As Integer                ' for counting 'revenue'
  ixFile As Integer               ' pointing to a dated-events file (DEF)
  lxRecord As Long                ' and to a record in that file
  sEventType As String * 2        ' event type, e.g. "WG"
End Type
Public aSortIndex() As bulk_index_row

Private Sub cmdTest_Click()
  Dim lCalls As Long
  On Error GoTo procErr
  lCalls = 9000000
  ReDim aSortIndex(0 To lCalls)
  MsgBox "succeeded"
  Exit Sub

procErr:
  MsgBox "Failed! Error " & Err.Number & ": " & Err.Description
End Sub

anthonywjones66 commented:

Each process running on the computer has a total addressable memory range of 4GB. Half of this is shared with all other processes, called the global memory, where things like common DLLs (such as kernel32.dll) are loaded, global memory-mapped files can be allocated, and so on. The other half of the address range is private to the process; no other process will ever see data in another process's private memory. Hence it's possible to have more than 4GB of addressable memory committed across all the processes; it's just not possible for any one process to address more than 4GB (unless we go up to a 64-bit version).

It's in this private area that things like VB variables, including arrays, are allocated. All elements of an array must occupy contiguous addresses; it can't be split with a bit here and another bit there, it all has to be in one single lump of memory. If you need a 500MB array, it only takes four other small memory allocations in just the wrong places in the 2GB space to make it impossible to find a single free block of 500MB.

In a complex VB application which is allocating a lot of memory (130MB is quite a lot of memory for a VB app) then memory fragmentation is an issue.
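The fragmentation argument above can be sketched with a toy model of the 2GB private range. This is an illustration only, not real Win32 allocator behaviour, and the pinned-block positions are made up:

```python
# Toy model of a 2 GB private address range, showing how a handful of
# small allocations in unlucky places can block one large request.
MB = 2**20
SPACE = 2048 * MB  # 2 GB of private address range

def largest_free_gap(allocs, space=SPACE):
    """allocs: list of (start, size) regions in use. Returns the biggest hole."""
    prev_end, best = 0, 0
    for start, size in sorted(allocs):
        best = max(best, start - prev_end)
        prev_end = start + size
    return max(best, space - prev_end)

# Only ~4 MB is actually in use, but the blocks are scattered...
pinned = [(400 * MB, MB), (800 * MB, MB), (1200 * MB, MB), (1600 * MB, MB)]
print(largest_free_gap(pinned) // MB)   # 447 -> a 500 MB ReDim fails

# ...whereas a fresh process with nothing allocated has the whole range free.
print(largest_free_gap([]) // MB)       # 2048 -> even a 1500 MB ReDim succeeds
```

This mirrors the situation in the question: the complex app has almost all of its range free in total, yet no single hole is big enough, while the empty test app can satisfy a much larger request.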


Erick37 commented:
I believe the memory must be contiguous when assigning arrays. So if there is not a chunk large enough to hold the entire structure, it will fail (I think).
anthonywjones66 commented:
Erick37 is quite right: the memory range for an array needs to be contiguous. You will probably only have a 2GB range of addresses for the private memory of a process, so it's quite possible that this space is fragmented enough that a 500MB contiguous block can't be found.


One question you might ask yourself:  Do I really really need all these data structures in memory at the same time?

simonspring (Author) commented:
Thank you for your posts.

My application is using about 130MB when it tries to dimension the 500MB array and fails (there is about 3000MB free at the time). My test application, if I run it at this point, will happily dimension an array of 1500MB. I'm confused why one application is allowed to dim a 1500MB array when the other fails to dim a 500MB array on top of the 130MB it's already using, i.e. totalling 630MB.

If it requires a contiguous range, then why can one process find 1500MB when the other cannot find 500MB?

(It's driving me mad!)
Simple app has memory that looks like this:-

  xx..............................................

Whereas the complex app has memory that looks like this:-

  xx...x....x...xx.....x....x...x....x...x....x...

Where x = allocated, in-use memory and . = free address space.

Because of previous activity, which may have involved more memory in use at one time, the address range becomes a little fragmented, with individual bits of memory still in use. The application, whilst in total still having a great deal of the available memory range free, is unable to find a single solid range of memory that can hold the array.


Memory must be contiguous when assigning arrays, but a lot of virtual memory is available for it. I don't think that is the problem; maybe this is something in VB itself.
simonspring (Author) commented:

Thanks for your help on this. I have resolved the issue (for now) by reserving the memory at application startup. At this point there is about 1.5 GB of contiguous memory available. When this limit is reached in the future, I will boot Windows using the /3GB switch, reducing the kernel memory size to 1GB whilst increasing the private area from 2GB to 3GB. This should keep me going for a while longer...
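As a rough budget check (assuming the standard 2GB user-mode range, 3GB with the /3GB switch, and the ~130MB app footprint and ~1.5GB reservation mentioned in this thread):

```python
# Address-space headroom with and without the /3GB boot switch.
# Figures are the user-mode (private) range of a 32-bit Windows process;
# app/array sizes are the ones quoted in this thread.
MB = 2**20
app_in_use = 130 * MB    # what the main app already uses
array_size = 1536 * MB   # the ~1.5 GB reserved at startup

for user_space in (2048 * MB, 3072 * MB):   # default vs /3GB
    headroom = user_space - app_in_use - array_size
    print(user_space // MB, headroom // MB)  # 2048 382, then 3072 1406
```

So the startup reservation already leaves only a few hundred MB of slack in the default layout, while /3GB buys roughly another gigabyte of headroom.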

Question has a verified solution.
