DB memory and Max Pool Size

Posted on 2014-12-19
Last Modified: 2014-12-22
How can I check if my DB has enough memory to handle the extra connections for a Max Pool Size of 200?

That's the short question. Here's the explanation as to why:

I have a SQL Server 2012 database. I am getting the error:
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool

All of the connections are closed immediately; I use con.Close or a Using block.

I checked and there were 220 people using the site. I also ran sp_who after the problem was reported (and had gone away) and there were only 54 sessions. I'm not exactly sure what running that tells me, but the forum post said to run it.
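sp_who lists one row per session, so the 54 rows were the sessions open at that moment. A more direct way to count user connections (a sketch for SQL Server 2012; requires VIEW SERVER STATE permission) is to query the session DMV:

```sql
-- Count current user connections, grouped by login and client host.
-- Requires VIEW SERVER STATE permission.
SELECT s.login_name,
       s.host_name,
       COUNT(*) AS connection_count
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
GROUP BY s.login_name, s.host_name
ORDER BY connection_count DESC;
```

Running this while the timeout errors are happening (rather than after they have gone away) shows whether the pool is genuinely full or whether connections are leaking from one host.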

So I'm thinking maybe I need to set Max Pool Size=200; in my connection string. However, the advice I read says to make sure your DB has enough memory to handle the extra connections.
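For reference, the pool size is set as a connection-string keyword (a sketch; the server and database names here are placeholders, and the SqlClient default Max Pool Size is 100):

```
Server=myServer;Database=myDb;Integrated Security=true;Max Pool Size=200;
```

Note the pool is per distinct connection string, so two differently-worded strings get two separate pools.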

How would I know that? How can I check if my DB has enough memory to handle the extra connections?

Question by:Starr Duskk

Accepted Solution

Racim BOUDJAKDJI earned 500 total points
ID: 40510270
If your objective is to guarantee SQL Server will not overload the CPU and the memory buffer pool, what you should do is seriously consider using MARS (Multiple Active Result Sets) instead of expanding memory blindly.

This is the right way to guarantee resources get consumed economically: simple and efficient. If it works for you, you may not have to add any additional memory, and it may even work better and faster. For more info on MARS please read the following article...
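MARS lets several pending commands share one physical connection instead of each needing its own, which is why it can reduce pool pressure. It is enabled with a single connection-string keyword (a sketch; server and database names are placeholders):

```
Server=myServer;Database=myDb;Integrated Security=true;MultipleActiveResultSets=True;
```

Whether this actually helps depends on the application: it pays off when the client opens extra connections only because a first connection still has an open reader.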

Hope this helps...

Expert Comment

by:Vitor Montalvão
ID: 40512879
Why are you pointing to a SQL Server issue?
Can you provide more information?
Where are you getting the error? What kind of operation are you trying to perform? Which kind of object are you using to connect to the database?

Expert Comment

ID: 40512972
<<Why are you pointing to a SQL Server issue?>>

It is reasonable to assume this could be a client issue, which is why I suggested using MARS (if possible). I have found this simple change highly effective in most cases of connection-pool related issues.


Author Comment

by:Starr Duskk
ID: 40513417
My original question was:
How can I check if my DB has enough memory to handle the extra connections?

I thought perhaps someone could tell me, if I go to a pool size of 200, what kind of memory is required.

I have 15 GB of memory on this server, so I'm just going to assume that will be enough, since no one seems to know. I have to get this done now.

thanks for your suggestions, I'm closing this out.

Author Closing Comment

by:Starr Duskk
ID: 40513425
Thanks. I have no intention of using MARS at this point, but maybe down the road it will be useful. I looked at the page and it will take some major changes to implement and that wasn't my original question.

Expert Comment

ID: 40513514
<<I thought perhaps someone would tell me how I can know if going to 200 pool size, what kind of memory is required.>>
Your question is based on the assumption that the resources consumed on SQL Server depend only on the number of connections. That assumption is false.

The number of connections plays a part, but what matters on the SQL Server side is how much memory each individual connection has to mobilize on average. You may have 10 connections requiring a cache of 45 GB, or 2000 connections requiring 50 MB, so there is no way to answer this straight away. The only way to get an answer for your situation is either to evaluate the average cache that has to be maintained to sustain your client pool, OR to proactively reduce the footprint and hope for the best, which is why I suggested MARS as the simplest way to put your mind at ease. If you cannot do that, then you will have to evaluate how much data your clients request each second on average, and from that how much RAM the data cache needs in order to hold all the data required.
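One concrete starting point for that evaluation (a sketch; these DMVs exist on SQL Server 2008 and later and require VIEW SERVER STATE permission) is to snapshot what memory the box has and what SQL Server is actually using:

```sql
-- Physical memory on the machine and how much is still available.
SELECT total_physical_memory_kb / 1024       AS total_mb,
       available_physical_memory_kb / 1024   AS available_mb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;

-- Physical memory currently in use by this SQL Server process.
SELECT physical_memory_in_use_kb / 1024 AS sql_in_use_mb
FROM sys.dm_os_process_memory;
```

Sampling these during peak load tells you whether the server has headroom before the pool size goes up.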

Another way to say this: the client pool size tells you something about the server load only when you know how much memory each connection requires on average. Sorry I could not tell you more, but there is not much more to say given the information you gave us.
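To put rough numbers on the raw connection overhead alone (a back-of-the-envelope sketch using the approximate figure from SQL Server's memory documentation of 3 × network packet size + 94 KB per connection, with the default 4 KB packet size; the caches discussed above come on top of this and usually dominate):

```python
# Rough estimate of raw connection overhead in SQL Server.
# Approximate documented formula: 3 * network_packet_size + 94 KB per connection.
# Default network packet size is 4 KB. Treat all figures as ballpark only.

def connection_overhead_kb(connections, packet_size_kb=4):
    """Approximate memory (KB) consumed by connection overhead alone."""
    return connections * (3 * packet_size_kb + 94)

pool_size = 200
overhead_mb = connection_overhead_kb(pool_size) / 1024
print(f"{pool_size} connections ~= {overhead_mb:.1f} MB of connection overhead")
# About 20 MB -- negligible against 15 GB. The real cost is the data and
# plan cache each connection's workload drives, which this does not capture.
```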

Good luck with the rest of the sizing. Let us know if we can help further on this.
