• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1214

C#.NET & Oracle database: how to avoid an "out of memory" exception when querying a large volume of data.

Hi Experts:

    I am writing a batch process program in C#.NET with VS2005, connecting to an Oracle database.
    The basic logic is shown in the attached code skeleton. The query in the DbCommand object will return a lot of data. In the development environment I debugged the program and monitored its memory usage. I found that on the first execution of the "dataReader.Read()" call, about 700 MB of memory was used. This would stop my batch program from working, because:
1. in the production environment the data volume can be several times larger;
2. the program needs a lot of memory elsewhere, either for another data query running at the same time or for building up large in-memory collections.

    Please give me some opinions on how to resolve this issue from a programming perspective:
1. Is there any way to keep dataReader.Read() from using all the memory at once? I am guessing it buffers all the data from the Oracle database on the first execution. Instead, is there a way to have it buffer part of the data at a time (such as 10 MB)? (See the sketch after the code skeleton below.)
2. Are there any other approaches to reduce the memory usage?

    Many thanks!
DbCommand cmd;
// Construct cmd as a query returning a large volume of data;
// db is presumably an Enterprise Library Database object.
using (IDataReader dataReader = db.ExecuteReader(cmd))
{
    // ...
    while (dataReader.Read())   // appeared to use a lot of
                                // memory on the first call
    {
        // process one row of data...
    }
}
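
    To illustrate what I am hoping for in point 1: with the ODP.NET provider (Oracle.DataAccess.Client), the amount the reader buffers per round trip is controlled by OracleDataReader.FetchSize. A minimal sketch of that knob, assuming ODP.NET and a hypothetical ORDERS table (connection string, table, and column names are illustrative):

using System;
using Oracle.DataAccess.Client;   // ODP.NET; assumed provider

class FetchSizeSketch
{
    static void Main()
    {
        using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
        {
            conn.Open();
            using (OracleCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT id, payload FROM orders";   // the large query

                using (OracleDataReader reader = cmd.ExecuteReader())
                {
                    // FetchSize is the byte size of the client-side buffer that is
                    // refilled on each round trip; rows stream through it rather
                    // than being materialized all at once.
                    reader.FetchSize = 10 * 1024 * 1024;   // ~10 MB, the figure from point 1

                    while (reader.Read())
                    {
                        // process one row at a time...
                    }
                }
            }
        }
    }
}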


Asked by: huangs3

2 Solutions
 
tomcatkev commented:
I can't help with your C# code, but you may simply have a query that returns "too many rows" to be practical to process in one pass. You can control memory usage from the query side by adding a predicate such as "where rownum < 1e6" (or scale that to the maximum memory usage you find practical to support), and then break your batch job into chunks of 1e6 records, or whatever chunk size your own testing determines. A sketch of that chunking idea follows.
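
Here is a minimal sketch of that chunking idea, assuming ODP.NET and a hypothetical ORDERS table with a numeric ID column (connection string and all names are illustrative). It uses the classic nested ROWNUM pagination form, since a bare "WHERE ROWNUM < n" can only cap a result set, not page through it:

using System;
using System.Data;
using Oracle.DataAccess.Client;   // ODP.NET; assumed provider

class ChunkedBatch
{
    static void Main()
    {
        const int chunkSize = 1000000;   // tune to whatever testing shows is practical

        using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
        {
            conn.Open();
            long processed = 0;
            while (true)
            {
                int rowsInChunk = 0;
                using (OracleCommand cmd = conn.CreateCommand())
                {
                    cmd.BindByName = true;   // bind :hi/:lo by name, not position
                    // Inner query fixes the order, middle query caps the window,
                    // outer query skips the rows already processed.
                    cmd.CommandText =
                        @"SELECT id, payload FROM (
                              SELECT t.*, ROWNUM rn FROM (
                                  SELECT id, payload FROM orders ORDER BY id
                              ) t WHERE ROWNUM <= :hi
                          ) WHERE rn > :lo";
                    cmd.Parameters.Add("hi", OracleDbType.Int64).Value = processed + chunkSize;
                    cmd.Parameters.Add("lo", OracleDbType.Int64).Value = processed;

                    using (IDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // process one row...
                            rowsInChunk++;
                        }
                    }
                }
                if (rowsInChunk == 0) break;   // no rows left
                processed += rowsInChunk;
            }
        }
    }
}

The ORDER BY on a key is what keeps the windows stable between queries; keyset pagination (WHERE id > :lastSeenId ORDER BY id) is usually cheaper on deep pages, because the nested ROWNUM form re-reads the rows it skips.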

And of course, the other thought is that memory is cheap. If this is an "off hours" batch, there may be plenty of excess memory available to just allocate more and let 'er rip through larger batch chunks; on the other hand, you may need to keep the chunks small if you are contending with heavy daytime interactive usage.
 
huangs3 (Author) commented:
Hi tomcatkev and CuteBug:

    Sorry, I made a mistake when asking this question. The .Read() call doesn't actually consume that memory; the memory curve I observed came from another application. I retested several times to confirm this.
    So there will be no memory problem as long as I process all the data on the fly. A sketch of the on-the-fly pattern follows for anyone who finds this later.
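
The distinction that matters is whether rows are processed as they stream past or accumulated in a collection. A minimal sketch of the on-the-fly pattern, again assuming ODP.NET and hypothetical table and column names:

using System;
using System.Data;
using Oracle.DataAccess.Client;   // ODP.NET; assumed provider

class StreamingSketch
{
    static void Main()
    {
        using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
        using (OracleCommand cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = "SELECT id, amount FROM orders";   // the large query

            // On-the-fly: keep only running aggregates, so memory stays flat
            // no matter how many rows the query returns.
            long count = 0;
            decimal total = 0;
            using (IDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    total += reader.GetDecimal(1);   // column 1 = amount
                    count++;
                }
            }
            Console.WriteLine("{0} rows, total {1}", count, total);

            // By contrast, retaining every row grows memory with the result set:
            //   var rows = new List<object[]>();
            //   while (reader.Read()) { var r = new object[2]; reader.GetValues(r); rows.Add(r); }
        }
    }
}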
    Thank you!
