SqlClient Multiple Connections with SQL Server 2000

I have a report engine that monitors a queue table.  It runs a SELECT TOP 1 query every 2-3 seconds.  What I did in the past with ADODB was open a global connection for the program and leave it open for the lifetime of the .EXE (it never caused resource issues).  I want to migrate this to SqlClient in .NET Framework 2.0, and I've found that with SQL Server 2000 a connection can't execute additional commands while it has an open reader, so I'd have to open more connection objects.

Is there a way to get around this with SQL Server 2000 as the database server?  I could open a new connection and close it every 2-3 seconds, but wouldn't that be less efficient than opening a single connection that's just reused over and over again?
Asked by IUFITS
 
Bob Learned commented:
Here is some reading about connection pooling:

SQL Server Connection Pooling Myths
http://www.sql-server-performance.com/sk_connection_pooling_myths.asp

Connection Pooling with ASP
http://www.15seconds.com/issue/970531.htm

<Quote> Why Use Connection Pooling?

Connection pooling is used to enhance the performance of executing commands on a database. Before executing a command, a connection to that database needs to be established. Sometimes creating and tearing down the connection is more costly than executing the command. For this reason, connection pools are created to keep connections alive. After a connection is created, it is placed in the connection pool. Connections are then reused from the pool so that a new connection does not need to be established. If all the connections in the pool are in use, new connections are created and made available through the pool.

Connection pooling is extremely useful when used with applications that do not have state.  State is a presence between instances. Active Server Pages are considered stateless since they do not share data between themselves. Stateless applications benefit from connection pooling since they cannot hold a connection open by themselves.

Connection pooling is also useful for sharing resources. Without connection pooling, an application can keep a connection alive and reuse it multiple times to execute commands. But if that application has many instances, the resources available on the machine to handle the connections will be reduced. If that application uses connection pooling, the connection resources are shared between instances of the application. With connection pooling, fewer connection resources are needed.</Quote>
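For example (a rough sketch only; the server, database, and table names below are placeholders, not from the question), the usual SqlClient pattern is to open and close the connection around each poll and let the pool keep the physical connection alive:

Imports System.Data.SqlClient

Module QueuePoller
    ' Pooling is on by default; the pool settings are shown only to make them explicit.
    Private Const ConnString As String = _
        "Data Source=MYSERVER;Initial Catalog=Reports;Integrated Security=SSPI;" & _
        "Pooling=true;Min Pool Size=1;Max Pool Size=10"

    Public Function GetNextQueueId() As Integer
        ' Open/Close here only checks a connection out of the pool and returns it;
        ' the physical connection to SQL Server stays alive between polls.
        Using conn As New SqlConnection(ConnString)
            conn.Open()
            Using cmd As New SqlCommand( _
                    "SELECT TOP 1 QueueId FROM ReportQueue ORDER BY QueueId", conn)
                Dim result As Object = cmd.ExecuteScalar()
                If result Is Nothing OrElse result Is DBNull.Value Then Return -1
                Return CInt(result)
            End Using
        End Using
    End Function
End Module

So opening a "new" connection every 2-3 seconds is cheap as long as the pool is doing its job.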

Bob
 
muzzy2003 commented:
Don't use a DataReader. This holds the connection open until it is closed and disposed. Instead, use the methods to populate a DataSet (you'll need a SqlDataAdapter, and its .Fill method). This will allow each call to reuse a connection.
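For example (sketch only; the table name and method name are assumptions, not from the question), something along these lines fills a DataTable and releases the connection straight away:

Imports System.Data
Imports System.Data.SqlClient

Public Class QueueData
    ' Sketch: Fill opens the (closed) connection itself, reads the rows,
    ' and closes it again, so nothing is held open between polls.
    Public Shared Function GetTopQueueRow(ByVal connString As String) As DataTable
        Dim table As New DataTable()
        Using conn As New SqlConnection(connString)
            Using adapter As New SqlDataAdapter("SELECT TOP 1 * FROM ReportQueue", conn)
                adapter.Fill(table)
            End Using
        End Using
        Return table
    End Function
End Class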
 
strickdd commented:
You can dispose of the reader object and create a new one each time.
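In other words (sketch only; ReportQueue is a made-up table name, and this would sit inside your engine class), wrap the reader in Using blocks so it is created, consumed, and disposed on every poll:

' Assumes connString is already defined; runs once per 2-3 second poll.
Private Sub PollQueue(ByVal connString As String)
    Using conn As New SqlConnection(connString)
        conn.Open()
        Using cmd As New SqlCommand("SELECT TOP 1 * FROM ReportQueue", conn)
            Using reader As SqlDataReader = cmd.ExecuteReader()
                If reader.Read() Then
                    ' process the queued report row here
                End If
            End Using   ' reader closed and disposed here
        End Using
    End Using           ' connection returned to the pool here
End Sub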
 
Bob Learned commented:
Use connection pooling.

Bob
 
strickdd commented:
Connection pooling will establish multiple SQL connections each time you connect, though.
 
Bob Learned commented:
In order to reuse a DataReader, the connection needs to be opened and closed anyway, so I fail to see the benefit of using one.

Bob
 
IUFITS (author) commented:
If it helps, for context: my report engine is a VB.NET application running on an application server 24/7.
 
IUFITS (author) commented:
Here's what I did.  I moved the database to a SQL Server 2005 server and added "MultipleActiveResultSets=True" to the connection string, which allows the connection to have a data reader open and still execute other data readers (or, in my case, non-query commands associated with the data processing I was doing on the original result set).  I kept running out of memory until I realized I wasn't disposing of my command objects; once I handled that properly it appears to be working great.  I'm running a load test now, monitoring both local memory/CPU usage and the load it places on the server.
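For reference, the change was along these lines (the server, database, table, and column names below are placeholders; only the MultipleActiveResultSets flag is the actual setting added), with every command wrapped in Using so it gets disposed:

' MARS requires SQL Server 2005+ and this connection string flag.
Dim connString As String = _
    "Data Source=MYSERVER;Initial Catalog=Reports;Integrated Security=SSPI;" & _
    "MultipleActiveResultSets=True"

Using conn As New SqlConnection(connString)
    conn.Open()
    Using cmd As New SqlCommand("SELECT TOP 1 * FROM ReportQueue", conn)
        Using reader As SqlDataReader = cmd.ExecuteReader()
            While reader.Read()
                ' With MARS, a second command can run on the same open
                ' connection while the reader is still active.
                Using updateCmd As New SqlCommand( _
                        "UPDATE ReportQueue SET Processed = 1 WHERE QueueId = @id", conn)
                    updateCmd.Parameters.AddWithValue("@id", reader("QueueId"))
                    updateCmd.ExecuteNonQuery()
                End Using
            End While
        End Using
    End Using
End Using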

I'm going to accept the answers that got me thinking in the right direction. Thanks to all who helped!