DataReader.Close() never completes.

CBeach1980 asked:

I have a SqlDataReader that is reading through 13 million rows of data. Once I'm done with the dataReader, the Close() call never finishes. It doesn't throw an error message; it just never completes the operation. With a smaller set of data in the dataReader it closes without any problems. Does anyone know how to get a very large dataReader to close?

dave4dl commented:
Check the state of the data reader before closing. You may need to wait until it has finished its work (if it is busy) before closing it. What error message are you seeing? Paste in the full stack trace too.
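
(For reference, SqlDataReader doesn't expose a "busy" flag as such; a minimal sketch of the properties that can be inspected before closing is below, assuming dataReader and connection are the reader and the SqlConnection it was opened on.)

// Rough sketch of the state that can be checked before closing;
// dataReader and connection are assumed variable names.
if (!dataReader.IsClosed)
{
    Console.WriteLine(dataReader.HasRows);   // true if the reader returned any rows
    Console.WriteLine(connection.State);     // ConnectionState of the underlying connection
}
dataReader.Close();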

CBeach1980 (Author) commented:
What is the command to check the state of the data reader? There is no error message; it just never completes the operation (even after hours).

dave4dl commented:
How big is the dataset that does work?

Is the computer maxed out on memory when you try to close the SqlDataReader? This will happen if you load all 13,000,000 rows into an array, and if the computer is completely out of memory it will take forever to do anything.

Try to pinpoint where the problem occurs more precisely (such as the exact number of rows at which it stops closing). Does the problem gradually get worse as you add more rows?

I do not think this is a bug, because a few Google searches turn up no similar problems. Also, 13 million rows is not that much, so I would expect it to be well within the dev team's testing parameters.

CBeach1980 (Author) commented:
Wouldn't .IsClosed only tell me whether the dataReader has been closed, which I already know not to be the case before I've closed it? The data is being processed one row at a time, so it doesn't get loaded into an array or anything along those lines, for the exact reason of not maxing out memory. There is plenty of memory left when I attempt to close the dataReader. I'd have to do further tests to see if there is a cutoff point; what I do know currently is that 100,000 rows works while 13,000,000 does not.

CBeach1980 (Author) commented:
There are 3 threads, each with the same number of rows but from different data sources.
100,000 rows: all 3 close in about 1 second
1,000,000 rows: all 3 close in about 10 seconds
2,000,000 rows: all 3 close in about 20 seconds
5,000,000 rows: all 3 close in about 90 seconds

So yes, it does get progressively worse, but not in a linear fashion.

REA_ANDREW commented:
Just out of interest, are you enclosing your reader in a using block?

using (SqlDataReader dataReader = command.ExecuteReader())
{
    // ...read and process rows...
}

If not, use one; the closing and disposing is then done for you upon completion.

REA_ANDREW commented:
Also, could it be that you are simply not waiting long enough for it to close?

You could also take a look at SqlNotificationRequest.

dave4dl commented:
Hmm, that is interesting. Does 8,388,608 (2^23) close?

I think it would be useful if you posted the relevant parts of your code (simplified if necessary).

CBeach1980 (Author) commented:
The code is pretty basic.

// dataReader is a SqlDataReader loaded from the DataAccess class
while (dataReader.Read())
{
    // ...read in and perform analysis on the current row...
}
dataReader.Close();

8,388,608 does close, after about 3.5 minutes.

If the .Close() operation is having trouble, would the using statement help at all? Doesn't it basically call the exact same operation?

As far as waiting long enough: the Close() call happens in a Windows application as part of a thread, and the thread sits and waits for it to close. The code never actually gives up or throws an error; it just never completes its process (not really practical for what I'm going for).

dave4dl commented:
Temporarily take out everything you summarized with "...Read in and perform analysis on data...".

Rerun it with all the rows to see how long it takes to close. If it closes without problems, the trouble is caused by the code we removed.
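
(For reference, a minimal sketch of that kind of stripped-down timing test; Stopwatch lives in System.Diagnostics, and dataReader is assumed to be the same reader as above.)

// Time how long the bare read loop and Close() take with the analysis removed.
Stopwatch sw = Stopwatch.StartNew();   // requires using System.Diagnostics;
while (dataReader.Read())
{
    // no per-row processing
}
dataReader.Close();
sw.Stop();
Console.WriteLine("Read and closed in " + sw.Elapsed);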

CBeach1980 (Author) commented:
Removing the processing section causes it to close quickly. I'll see if I can isolate the particular section that is causing the problem.

CBeach1980 (Author) commented:
Okay, here is what I've found. If the entire process is allowed to run (it takes about 3 hours), there is no problem closing the dataReader. The problem comes from the fact that I have to be able to stop the process midway in the event that a setting was wrong. Apparently, when you close a dataReader without having read every row, the code inside the .Close() operation reads the remaining rows one by one until it reaches the end of the dataReader, and only then closes it.

Is there any way to skip to the end of the dataReader so that it doesn't waste time reading the additional rows that I'm not interested in?
Secondly, could I just set the dataReader = null to destroy the object, or would that leave an orphaned dataReader in memory, with a connection to the database, until .NET gets around to cleaning it up?

Thanks for all the help so far. This has been one of the odder problems I've ever encountered.
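
(One approach not raised in this thread, but sometimes suggested for abandoning a large reader part-way through, is to cancel the underlying command before closing, so that Close() does not have to drain the remaining rows. A rough sketch, assuming command is the SqlCommand that produced the reader:)

// Not from this thread: request cancellation so Close() does not drain
// the unread rows; command is assumed to be the SqlCommand used to
// create dataReader. Cancellation is only a request, so timing can vary.
command.Cancel();
dataReader.Close();
connection.Close();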

dave4dl commented:
Try just closing the connection (not the data reader).

CBeach1980 (Author) commented:
That ends up with the same problem. For whatever reason, closing the connection also takes a very long time to complete.

dave4dl commented:
What happens if you call the Dispose method directly (without calling Close)?

CBeach1980 (Author) commented:
I tried that before. It seems to act identically to Close(). The only way I've gotten it to not take forever is to set the object to null, which is basically almost the same as ignoring it, I think.

dave4dl commented:
If you are searching for one specific record or a few specific records, limit what you see by using a WHERE or HAVING clause in your SELECT statement.
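
(A minimal sketch of that suggestion; the table, column, and parameter names here are made up for illustration.)

// Hypothetical query limited with a WHERE clause instead of reading every row.
SqlCommand command = new SqlCommand(
    "SELECT Id, Reading FROM Measurements WHERE BatchId = @batchId", connection);
command.Parameters.AddWithValue("@batchId", batchId);
SqlDataReader dataReader = command.ExecuteReader();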

CBeach1980 (Author) commented:
I need all the rows in normal processing. The only reason to want only part of the dataReader is if the process needs to be stopped prematurely because a setting was wrong. Is there any real danger or problem if I just set the object to null?

dave4dl commented:
No danger other than not having the object properly disposed of at a time of your choosing, which will lead to memory leaks (allocated memory that is not used for anything). Eventually (some time after the containing block completes) .NET garbage collection will take care of it, which may just call the Dispose method and thus the Close method. Hopefully this happens at a convenient time in the life of your process, because it will probably take just as long to call the Close method later as it does earlier.

dave4dl commented:
Thanks!  

Good luck!