I am writing a batch-processing program in C#.NET (VS2005) that connects to an Oracle database.
The basic logic follows the code skeleton below. The query in the DbCommand object returns a large volume of data. While debugging in the development environment I monitored memory usage and found that the first execution of the dataReader.Read() call consumed about 700 MB. This will stop my batch program from working, because:
1. In the production environment the data volume can be several times larger.
2. The program needs a lot of memory elsewhere, whether for another query running at the same time or for building large in-memory collections.
Please give me some advice on how to resolve this from a programming perspective:
1. Is there any way to keep dataReader.Read() from using all that memory at once? I am guessing it buffers the entire result set from the Oracle database on the first call. Instead, is there a way to make it buffer only part of the data at a time (say, 10 MB)? (A sketch of the kind of control I have in mind follows this list.)
2. Are there any other approaches to reduce the memory usage? (The only idea I have so far is sketched at the end of this post.)
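To illustrate what I mean in question 1: if the provider were ODP.NET (Oracle.DataAccess.Client), I believe OracleCommand and OracleDataReader expose a FetchSize property that caps how much the driver buffers per network round trip. Below is a minimal sketch of that idea; the connection string, SQL, and the choice of 1000 rows per fetch are placeholders, and I am not sure whether the Enterprise Library db.ExecuteReader wrapper I use surfaces the same setting.

using Oracle.DataAccess.Client; // ODP.NET provider (assumption)

static void StreamRows(string connectionString, string sql)
{
    using (OracleConnection conn = new OracleConnection(connectionString))
    using (OracleCommand cmd = new OracleCommand(sql, conn))
    {
        conn.Open();
        using (OracleDataReader reader = cmd.ExecuteReader())
        {
            // RowSize is known once the statement has executed;
            // fetch roughly 1000 rows per round trip instead of
            // letting the driver buffer a huge amount at once.
            reader.FetchSize = cmd.RowSize * 1000;
            while (reader.Read())
            {
                // process one row at a time...
            }
        }
    }
}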
Here is my current code skeleton:

// construct the cmd variable as a query
// returning a large volume of data
using (IDataReader dataReader = db.ExecuteReader(cmd))
{
    // the first call to Read() uses a lot of memory
    while (dataReader.Read())
    {
        // process each row here...
    }
}
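Regarding question 2, the only idea I have so far is to process rows in small batches inside the using block above, releasing each batch before reading the next one, so the "process each row" step never holds the whole result set. A rough sketch follows; MyRow, MapRow, and ProcessBatch are hypothetical placeholders for my own types and logic.

const int BatchSize = 1000;                     // hypothetical batch size
List<MyRow> batch = new List<MyRow>(BatchSize); // MyRow: hypothetical row type
while (dataReader.Read())
{
    batch.Add(MapRow(dataReader));              // MapRow: hypothetical row mapper
    if (batch.Count == BatchSize)
    {
        ProcessBatch(batch);                    // e.g. write results out
        batch.Clear();                          // release the processed rows
    }
}
if (batch.Count > 0)
{
    ProcessBatch(batch);                        // handle the final partial batch
}

Would this actually keep the working set small, or does the memory problem happen inside the driver before my loop even sees the rows?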