Solved

DataReader.Close() command won't work.

Posted on 2007-04-06
21
302 Views
Last Modified: 2010-03-19
I have a SqlDataReader that is accessing 13 million rows of data.  Once I'm done with the dataReader, the Close() command never completes.  It doesn't throw an error message; the operation simply never finishes.  With a smaller set of data in the dataReader it closes without any problems.  Does anyone know how to get a very large dataReader to close?
0
Comment
Question by:CBeach1980
21 Comments
 
LVL 15

Expert Comment

by:dave4dl
Check the state of the data reader before closing.  You may need to sleep (wait) until it is done doing work (if it is busy) before closing it.  What error message are you seeing?  Paste in the full stack trace too.
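For reference, a minimal sketch of that kind of guard (the method and variable names here are mine, not from the question) might look like:

```csharp
using System.Data.SqlClient;

class ReaderHelper
{
    // Guard the close with a state check; IsClosed is the state
    // that SqlDataReader exposes (there is no separate "busy" flag).
    static void SafeClose(SqlDataReader reader)
    {
        if (reader != null && !reader.IsClosed)
        {
            reader.Close();
        }
    }
}
```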
0
 
LVL 4

Author Comment

by:CBeach1980
What is the command to check the state of the data reader?  There is no error message; it just never completes the operation (even after hours).
0
 
LVL 15

Expert Comment

by:dave4dl
You can check the IsClosed property on the SqlDataReader.
0
 
LVL 15

Expert Comment

by:dave4dl
How big is the dataset that does work?

Is the computer maxed out on memory when you try to close the SqlDataReader (this will happen if you load all 13,000,000 of those rows into an array)?  If the computer is completely out of memory it will take forever to do anything.

Try to pinpoint more precisely where the problem occurs (such as the exact number of rows at which it stops closing).  Does the problem gradually get worse as you add more rows?

I do not think this is a bug, because a few Google searches turn up no other similar problems.  Also, 13 million rows is not that much, so I would expect it to be well within the dev team's testing parameters.
0
 
LVL 4

Author Comment

by:CBeach1980
Wouldn't .IsClosed only tell me whether the dataReader has been closed, which I already know not to be the case before I've closed it?  The data is being processed one row at a time, so it doesn't get loaded into an array or anything along those lines, for the exact reason of not maxing out memory.  There is plenty of memory left when I attempt to close the dataReader.  I'd have to do further tests to see if there is a cutoff point; what I do know currently is that 100,000 rows works while 13,000,000 does not.
0
 
LVL 4

Author Comment

by:CBeach1980
There are 3 threads each with the same number of rows just from different data sources.
100,000 closes all 3 in about 1 second
1,000,000 closes all 3 in about 10 seconds
2,000,000 closes all 3 in about 20 seconds
5,000,000 closes all 3 in about 90 seconds

So yes, it does get progressively worse, but not in a linear fashion.
0
 
LVL 20

Expert Comment

by:REA_ANDREW
Just out of interest, are you enclosing your statement inside a using block?

using(...)
{

}

If not, use one; then the closing and disposing is done for you upon completion.
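To make the suggestion concrete, here is a sketch of the full pattern; the connection string, query, and table name are placeholders of mine, not taken from the question:

```csharp
using System.Data.SqlClient;

class Example
{
    static void ProcessRows(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM SomeBigTable", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process one row at a time
                }
            } // reader is closed/disposed here, even if an exception is thrown
        }     // command and connection are disposed here
    }
}
```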
0
 
LVL 20

Expert Comment

by:REA_ANDREW
Also, could it be that you are not waiting long enough for it to close?

Also take a look at

SqlNotificationRequest
0
 
LVL 15

Expert Comment

by:dave4dl
Hmm, that is interesting.
Does 8,388,608 (2^23) close?

I think it would be useful if you posted the relevant parts of your code (simplified if necessary)
0
 
LVL 4

Author Comment

by:CBeach1980
The code is pretty basic.

Load dataReader (from DataAccess class)
while (dataReader.Read())
{
   ...Read in and perform analysis on data...
}
dataReader.Close();

8,388,608 does close after about 3.5 minutes.

If the .Close() operation is having trouble, would the using statement help at all, since it basically calls the exact same operation?

As far as waiting long enough goes: the Close() operation is happening in a Windows application as part of a thread.  The thread sits and waits for it to close.  The code doesn't ever actually give up or throw an error; it just never completes its process (not really practical for what I'm going for).
0
 
LVL 15

Expert Comment

by:dave4dl
Take out everything you summarized with "...Read in and perform analysis on data..." (temporarily).

Rerun it with all the rows to see how long it takes to close.  If it closes without problems, the trouble is caused by a piece of the code we removed.
0
 
LVL 4

Author Comment

by:CBeach1980
Removing the processing section causes it to close quickly.  I'll see if I can isolate the particular section that is causing the problem.
0
 
LVL 4

Author Comment

by:CBeach1980
Okay, here is what I've found.  If the entire process is allowed to run (it takes about 3 hours) then there is no problem closing the dataReader.  However, the problem comes from the fact that I have to be able to stop the process midway in the event that a setting was wrong.  Apparently, when you close a dataReader without having read every row, the code inside the .Close() operation reads the remaining rows one by one until it gets to the end of the dataReader, and only then closes it.

Is there any way to skip to the end of the dataReader so that it doesn't waste time reading the additional rows that I'm not interested in?
Secondly, could I just set the dataReader = null to destroy the object, or will that leave an orphaned dataReader in memory, with a connection to the database, until .NET gets around to cleaning it up?

Thanks for all the help so far.  This has been one of the odder problems I've ever encountered.
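For what it's worth, one technique aimed at exactly this situation (it assumes you still hold a reference to the SqlCommand that produced the reader, which the posted code doesn't show): call SqlCommand.Cancel() before Close().  Cancel tells SQL Server to abandon the remaining result set, so Close() no longer has to drain the unread rows.  A sketch, with my own names:

```csharp
using System.Data.SqlClient;

class AbortHelper
{
    // Abort a partially-read reader without draining millions of rows.
    static void AbortRead(SqlCommand cmd, SqlDataReader reader)
    {
        cmd.Cancel();    // ask the server to discard the remaining results
        reader.Close();  // now returns quickly instead of reading to the end
    }
}
```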
0
 
LVL 15

Expert Comment

by:dave4dl
try just closing the connection (not the data reader)
0
 
LVL 4

Author Comment

by:CBeach1980
That ends up with the same problem.  For whatever reason if I try to close the connection it again takes a long time to complete.
0
 
LVL 15

Expert Comment

by:dave4dl
what happens if you call the dispose method directly (without calling close)?
0
 
LVL 4

Author Comment

by:CBeach1980
I tried that before.  It seems to act identically to Close().  The only way I've gotten it to not take forever is to set the object to null, which is basically almost the same as ignoring it, I think.
0
 
LVL 15

Expert Comment

by:dave4dl
If you are searching for one specific record or a few specific records, limit what you see by using a WHERE or HAVING clause in your SELECT statement.
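A sketch of that, with hypothetical table/column names and a parameterized filter, so the server only ever sends the rows that are needed:

```csharp
using System.Data.SqlClient;

class FilteredRead
{
    // Filter at the server so the reader never sees the unwanted rows.
    static SqlDataReader ReadOneSource(SqlConnection conn, int sourceId)
    {
        SqlCommand cmd = new SqlCommand(
            "SELECT Id, Reading FROM SensorData WHERE SourceId = @source", conn);
        cmd.Parameters.AddWithValue("@source", sourceId);
        return cmd.ExecuteReader();
    }
}
```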
0
 
LVL 4

Author Comment

by:CBeach1980
I need all the rows in normal processing.  The only reason to want only part of the dataReader is if the process needs to be prematurely stopped because a setting was wrong.  Is there any real danger or problem if I just set the object to null?
0
 
LVL 15

Accepted Solution

by:
dave4dl earned 500 total points
No danger other than not having the object properly disposed of at a time of your choosing, which will lead to a memory leak (allocated memory that is not used for anything).  Eventually (some time after the containing block completes) .NET garbage collection will take care of it (which may just call the Dispose method, thus calling the Close method).  Hopefully this happens at a convenient time in the life of your process, because it will probably take just as long to call the Close method later as it does earlier.
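If dropping the reference without cleanup still feels wrong, another option (my suggestion, not something tested in this thread) is to keep the deterministic Close() but move it off the worker thread, so nothing blocks while the reader drains:

```csharp
using System.Data.SqlClient;
using System.Threading;

class BackgroundClose
{
    // Hand the slow Close() to a thread-pool thread; the delegate
    // keeps the reader reachable, so it is not orphaned before closing.
    static void CloseInBackground(SqlDataReader reader)
    {
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            ((SqlDataReader)state).Close();
        }, reader);
    }
}
```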
0
 
LVL 15

Expert Comment

by:dave4dl
Thanks!  

good luck
0
