Ramadurai_Hariharaputhran
asked on
OutOfMemoryException occurs when loading a large volume of data from a DataReader into a DataTable.
When loading an OracleDataReader that fetches a large volume of records into a DataTable, we get an OutOfMemoryException. How can we overcome this error? Is there any technique by which we can read huge data and store it in a DataTable?
' Exception occurs on this line; dtReader is an OracleDataReader object
dtReportResult.Load(dtReader)
ASKER
The requirement is to fetch the rows and save them as a .dbf file. We execute a query, capture the result as an OracleDataReader object, and load it into a DataTable.
If we have to use a custom collection instead of a DataTable, which collection type would be suitable for storing a huge volume of data (say 5-20 lakh, i.e. 0.5-2 million records)?
Please suggest a robust solution so that we do not encounter an OutOfMemoryException.
a) Can we use async command object methods?
b) Are there any bulk data access methods?
Thanks,
Udayashankar.
You seem to be using a report generator of some kind (dtReportResult). It probably loads the whole result set into memory.
Why not control the DataReader yourself? A DataReader reads rows one by one. If you drive it yourself instead of handing it to the report, you will have only one row in memory at a time, which would solve your problem.
Simply write the code that inserts each row into the .dbf yourself instead of letting the report do it for you.
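A minimal sketch of that loop, assuming the Oracle managed provider and a hypothetical WriteDbfRow helper that appends one record to the target .dbf file (the connection string and SQL are placeholders):

```vb
' Sketch only: streams rows one at a time instead of buffering them in a DataTable.
Using conn As New OracleConnection(connectionString)
    conn.Open()
    Using cmd As New OracleCommand("SELECT col1, col2 FROM big_table", conn)
        Using rdr As OracleDataReader = cmd.ExecuteReader()
            Dim values(rdr.FieldCount - 1) As Object
            While rdr.Read()
                rdr.GetValues(values)   ' only the current row is held in memory
                WriteDbfRow(values)     ' hypothetical helper: append this row to the .dbf
            End While
        End Using
    End Using
End Using
```

Because nothing accumulates on the client, memory use stays flat no matter how many rows the query returns.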
Use DISTINCT in the SQL when it is OK to do so in order to prevent duplicate data.
Very often, you can delegate part of the job to the database server. If you need to sum or average columns for instance, do it in a stored procedure instead of doing it in your code. Not only will it bring less data to the client, but it is also usually better on performance.
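As a sketch of that idea, the aggregation below runs on the server and returns one row per group instead of every detail row (the table and column names are made up for illustration, and conn is an open OracleConnection):

```vb
' Sketch: let the database aggregate instead of pulling every row to the client.
Using cmd As New OracleCommand(
        "SELECT region, SUM(amount) FROM sales GROUP BY region", conn)
    Using rdr As OracleDataReader = cmd.ExecuteReader()
        While rdr.Read()
            ' One small summary row per region reaches the client.
            Console.WriteLine("{0}: {1}", rdr.GetString(0), rdr.GetDecimal(1))
        End While
    End Using
End Using
```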
A DataTable carries a lot of overhead beyond the data itself in order to work properly. Each DataRow tracks state such as original versus current values, row state, and errors, all of which adds to memory consumption. If you do not use those features, you are eating memory for no reason. If you simply print a report, loading the data into a custom collection instead of a DataTable will require a lot less memory.
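A sketch of such a custom collection, assuming three illustrative columns (the class and column layout are made up, and rdr is an open OracleDataReader):

```vb
' Sketch: a plain class holds only the data, with none of the per-row
' change-tracking overhead that DataRow carries.
Public Class ReportRow
    Public Property Id As Integer
    Public Property Name As String
    Public Property Amount As Decimal
End Class

Dim rows As New List(Of ReportRow)()
While rdr.Read()
    rows.Add(New ReportRow With {
        .Id = rdr.GetInt32(0),
        .Name = rdr.GetString(1),
        .Amount = rdr.GetDecimal(2)
    })
End While
```

Note that even a lean List(Of T) still holds everything in memory at once; for 2 million rows, streaming each row straight to the .dbf as described above remains the safer approach.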