  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 231

DB memory and Max Pool Size

How can I check if my DB has enough memory to handle the extra connections for a Max pool size of 200?

That's the short question. Here's the explanation as to why:

I have a SQL Server 2012 database, and I am getting this error:
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool

All of the connections are closed immediately; I use con.Close() or a Using block.
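For reference, the close-immediately pattern described above looks like this in VB.NET (the connection string and query here are placeholders, not the actual ones from this site):

```vbnet
Imports System.Data.SqlClient

Module ConnectionExample
    Sub Main()
        ' Hypothetical sketch: the Using blocks guarantee the connection is
        ' closed and returned to the pool even if the command throws.
        Using con As New SqlConnection("Server=myServer;Database=myDb;Integrated Security=True;")
            Using cmd As New SqlCommand("SELECT COUNT(*) FROM SomeTable", con)
                con.Open()
                Dim rowCount As Object = cmd.ExecuteScalar()
            End Using
        End Using ' Dispose() closes the connection here, even on error
    End Sub
End Module
```

If connections are opened without a Using block (or a Finally that calls Close), an exception can leak them, and the pool eventually runs dry with exactly this timeout error.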

I checked, and there were 220 people using the site. I also ran sp_who after the problem was reported (and had gone away), and it returned only 54 rows. I'm not exactly sure what running that tells me, but the forum post said to run it.

So I'm thinking maybe I need to raise Max Pool Size=200; in my connection string. However, the advice I read says to make sure your DB has enough memory to handle the extra connections.

How would I know that? How can I check if my DB has enough memory to handle the extra connections?
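For illustration, the change being considered is a connection-string tweak along these lines (server, database, and the other values are placeholders; Max Pool Size is a standard ADO.NET keyword that defaults to 100):

```
Server=myServer;Database=myDb;Integrated Security=True;Max Pool Size=200;Connection Timeout=30;
```

Note that the pool is per distinct connection string per process, so raising this value affects each web server's pool independently.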

thanks!
Asked by: Starr Duskk
1 Solution
 
Racim BOUDJAKDJI commented:
If your objective is to keep SQL Server from overloading the CPU and the memory buffer pool, what you should seriously consider is using MARS instead of expanding memory blindly.

This is the right way to ensure resources are consumed economically. Simple and efficient. If it is possible, you won't have to add any additional memory, and it may work better and faster. For more info on MARS, please read the following articles...

http://msdn.microsoft.com/fr-fr/library/ms131686.aspx
http://technet.microsoft.com/en-us/library/ms345109(v=sql.90).aspx
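For what it's worth, the connection-string side of enabling MARS is a single keyword (placeholder values aside); actually benefiting from it in client code is a separate matter:

```
Server=myServer;Database=myDb;Integrated Security=True;MultipleActiveResultSets=True;
```

With MARS enabled, one physical connection can carry multiple pending requests, which can reduce the number of pooled connections the application needs to hold open at once.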

Hope this helps...
 
Vitor Montalvão (MSSQL Senior Engineer) commented:
Why are you pointing to a SQL Server issue?
Can you provide more information?
Where are you getting the error? What kind of operation are you trying to perform? Which kind of object are you using to connect to the database?
 
Racim BOUDJAKDJI commented:
<<Why are you pointing to a SQL Server issue?>>
Exactly.  

It is reasonable to assume this could be a client-side issue, which is why I suggested using MARS (if possible). I have found this simple change highly effective in most connection-pool-related issues.
 
Starr Duskk (ASP.NET VB.NET Developer, Author) commented:
My original question was:
How can I check if my DB has enough memory to handle the extra connections?

I thought perhaps someone would tell me how I can know if going to 200 pool size, what kind of memory is required.

I have 15 GB of memory on this server, so I'm just going to assume that will be enough, since no one seems to know. I have to get this done now.

thanks for your suggestions, I'm closing this out.
 
Starr Duskk (ASP.NET VB.NET Developer, Author) commented:
Thanks. I have no intention of using MARS at this point, but maybe down the road it will be useful. I looked at the page, and it will take some major changes to implement, which wasn't my original question.
 
Racim BOUDJAKDJI commented:
<<I thought perhaps someone would tell me how I can know if going to 200 pool size, what kind of memory is required.>>
Your question is based on the assumption that the resources consumed on SQL Server depend only on the number of connections: that assumption is false.

The number of connections plays one part, but what matters on the SQL Server side is how much memory each individual connection mobilizes on average. You may have 10 connections requiring 45 GB of cache, or 2,000 connections requiring 50 MB. So there is no way to answer this straight away. The only way to get an answer for your situation is either to evaluate the average cache that must be maintained to sustain your client pool, or to proactively reduce the footprint and hope for the best; this is why I suggested MARS as the simplest way to put your mind at ease. If you cannot do that, you will have to evaluate how much data your clients request per second on average, then evaluate how much RAM the data cache needs to sustain that.

Another way to say this: the client pool size tells you something about the server load only when you know how much each connection requires on average. Sorry I could not tell you more, but there is not much more to say given the information you gave us.

Good luck with the rest of the sizing. Let us know if we can help further on this.
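One rough way to put a number on the per-connection figure described above is to query the session DMVs; this is a sketch only (memory_usage is reported in 8 KB pages, and it reflects session working memory, not the shared buffer pool):

```sql
-- Approximate memory used per user session, in MB.
-- sys.dm_exec_sessions.memory_usage counts 8 KB pages per session.
SELECT session_id,
       login_name,
       memory_usage * 8 / 1024.0 AS memory_mb
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
ORDER BY memory_usage DESC;
```

Multiplying a typical per-session figure by the intended Max Pool Size gives a crude upper bound on the extra session memory the pool change could demand, which can then be weighed against the server's total RAM.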
