Solved

OutOfMemoryException occurs when loading a large volume of data from a DataReader into a DataTable.

Posted on 2013-01-28
423 Views
Last Modified: 2013-06-24
When loading an Oracle data reader that fetches a large volume of records into a DataTable, we get an OutOfMemoryException. How can we overcome this error? Is there any technique by which we can read huge data and store it in a DataTable?

 ' Exception occurs on this line; dtReader is an OracleDataReader object
dtReportResult.Load(dtReader)
4 Comments
 
LVL 9

Accepted Solution

by:
sognoct earned 500 total points
ID: 38826565
I have never experienced this kind of issue myself, but a forum thread describes a similar problem. According to it, the way to solve it is to connect in Direct mode, without the Oracle client.

Example:
dotConnect for Oracle
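If you go that route, dotConnect for Oracle supports a Direct-mode connection string that bypasses the Oracle client entirely. A minimal sketch, assuming Devart's provider is installed; the host, SID, and credentials below are placeholders:

```vb
' Hypothetical Direct-mode connection string for Devart's
' dotConnect for Oracle; no Oracle client installation is needed.
Dim connStr As String = _
    "Direct=True;Host=myhost;Port=1521;SID=orcl;" & _
    "User Id=scott;Password=tiger;"
Dim connection As New Devart.Data.Oracle.OracleConnection(connStr)
```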
 
LVL 40
ID: 38830190
Retrieve only what you need, nothing more. If you try to read 8 GB of data on a computer that has only 4 GB of memory, you will run out of memory even with a swap file.

Use DISTINCT in the SQL when it is OK to do so in order to prevent duplicate data.

Very often, you can delegate part of the job to the database server. If you need to sum or average columns, for instance, do it in a stored procedure instead of doing it in your code. Not only will it bring less data to the client, it is also usually better for performance.
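For instance, a server-side aggregate pulls back one row per group instead of every detail row. A sketch of the idea; the table and column names here are made up for illustration:

```vb
' Let Oracle compute the aggregate instead of fetching every row
' to the client and summing in .NET. Table/columns are hypothetical.
Dim cmd As New OracleCommand( _
    "SELECT dept_id, SUM(salary) AS total_salary " & _
    "FROM employees GROUP BY dept_id", connection)
Using rdr As OracleDataReader = cmd.ExecuteReader()
    While rdr.Read()
        Console.WriteLine("{0}: {1}", rdr.GetValue(0), rdr.GetValue(1))
    End While
End Using
```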

A DataTable needs a lot of extra information over the data in order to work properly. Each Row has properties that add to memory consumption. If you do not use those properties, you are eating memory for no reason. If you simply print a report, loading the data into a custom collection instead of a DataTable will require a lot less memory.
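To illustrate the custom-collection idea: a plain class with only the fields the report needs avoids all the per-row bookkeeping a DataTable carries. The `ReportRow` type and the column layout below are assumptions for the sketch:

```vb
' A lightweight record type holding only the fields the report needs.
Public Class ReportRow
    Public Property Id As Integer
    Public Property Name As String
    Public Property Amount As Decimal
End Class

' Fill a List(Of ReportRow) from the reader instead of a DataTable.
' Column ordinals 0-2 are assumed to match the query.
Dim rows As New List(Of ReportRow)()
Using rdr As OracleDataReader = cmd.ExecuteReader()
    While rdr.Read()
        rows.Add(New ReportRow With {
            .Id = rdr.GetInt32(0),
            .Name = rdr.GetString(1),
            .Amount = rdr.GetDecimal(2)})
    End While
End Using
```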
 

Author Comment

by:Ramadurai_Hariharaputhran
ID: 38879421
The requirement is to fetch the rows and save them as a .dbf file. We execute a query, capture the result as an OracleDataReader object, and load it into a DataTable.

If we have to use a custom collection instead of a DataTable, which collection type would be suitable for storing a huge volume of data (say 5-20 lakh records, i.e. 0.5-2 million)?

So, please suggest a robust solution so that we will not encounter an OutOfMemoryException.

a) Can we use async command object methods?
b) Are there any bulk data access methods?

Thanks,
Udayashankar.
 
LVL 40
ID: 38879621
You seem to be using a report generator of some kind (dtReportResult). It probably loads the whole thing in memory.

Why not control the DataReader yourself? The DataReader reads rows one by one. If you control it yourself instead of handing it to the report, you will have only one row in memory at a time, and that would solve your problem.

Simply write the code that inserts each row into the .dbf instead of letting the report do it for you.
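The loop could be sketched like this; it is untested against a real database, and `WriteDbfRecord` is a hypothetical helper that appends one record to the .dbf file:

```vb
' Stream rows straight from the reader to the output file so that
' only one row is held in memory at a time. "sql" and "connection"
' come from the existing code; WriteDbfRecord is hypothetical.
Using cmd As New OracleCommand(sql, connection)
    Using rdr As OracleDataReader = cmd.ExecuteReader()
        Dim values(rdr.FieldCount - 1) As Object
        While rdr.Read()
            rdr.GetValues(values)   ' copies only the current row
            WriteDbfRecord(values)  ' hypothetical .dbf record writer
        End While
    End Using
End Using
```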
