I have a VB.NET 2003/2005 program that is a report server. It currently uses classic ADO and has no problems, even when I stress test it. It runs SELECT statements every 3 seconds to check a queue, on a global connection that is never closed (if it happens to lose the connection, it reconnects).
Now I've updated my code to use ADO.NET's SqlConnection, SqlCommand, and SqlDataReader (it used to use ADODB.Connection and ADODB.Recordset). Same approach: I create a global connection and set MultipleActiveResultSets=True so the connection can handle multiple data readers (using SQL Server 2005). The problem is that the program now crashes under heavy load with a "not enough system resources" error, even though there is plenty of RAM and disk space left... I can only surmise this is a leak of some sort.
I believe I'm disposing of all of my commands and then setting them to Nothing. The connection, however, is constant, so I'm not disposing of it at all unless it loses the connection for some reason, in which case it re-establishes. I've tried with and without connection pooling... My brain is fried; I need suggestions to get me back on track.
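For reference, here's roughly the pattern I'm aiming for in the polling code (a simplified sketch, not my actual code; ReportQueue and _conn are stand-ins for my real table and shared connection). The Using blocks guarantee the reader and command are disposed even if the read throws, which setting a variable to Nothing does not do:

```vb.net
' Minimal sketch of the 3-second polling query against a shared connection.
' _conn is the long-lived global SqlConnection; ReportQueue is a placeholder
' name for the actual queue table.
Private Sub CheckQueue()
    Using cmd As New SqlCommand("SELECT Id, ReportName FROM ReportQueue", _conn)
        Using reader As SqlDataReader = cmd.ExecuteReader()
            While reader.Read()
                ' process the queued report request here
            End While
        End Using ' reader is closed and disposed here
    End Using ' command is disposed here
End Sub
```

My understanding is that an undisposed SqlDataReader on a MARS connection keeps an active result set open on that connection, so if any code path skips the close, they accumulate until the server-side resources run out.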
I could try opening and closing a connection every three seconds when it runs the query, but I thought it would be more efficient not to. If that assumption is incorrect, let me know that too. I'm putting 500 on this because my brain hurts, it's Friday, and I'm going to obsess over this until I figure it out.
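If I did go the open-per-query route, it would look something like this (again a sketch; _connString and ReportQueue are placeholders). My understanding is that with pooling enabled (the default), Open/Close just checks a physical connection out of and back into the pool rather than tearing down the TCP connection, so the per-query cost should be small:

```vb.net
' Sketch of the open-per-query alternative, relying on the default
' connection pool. _connString is a placeholder for the real string.
Private Sub CheckQueuePooled()
    Using conn As New SqlConnection(_connString)
        conn.Open() ' checks a connection out of the pool
        Using cmd As New SqlCommand("SELECT Id, ReportName FROM ReportQueue", conn)
            Using reader As SqlDataReader = cmd.ExecuteReader()
                While reader.Read()
                    ' process the queued report request here
                End While
            End Using
        End Using
    End Using ' Dispose closes the connection, returning it to the pool
End Sub
```

This would also sidestep the need for MultipleActiveResultSets, since each query gets its own connection for its lifetime.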