How could exceeding max_user_connections bring down a website?

Part One

I've recently experienced site availability problems, whereby my host could not be reached either by trying to load web pages OR by accessing the MySql database through the admin interface OR by trying to send emails through their server.

When the site came back up, we noted for a few minutes messages such as "error 1040 too many connections" and "error 1203 exceeded max_user_connections". These were visible in areas of the site that required connections with the MySql database, such as where we provide polls or guestbook facilities for visitors.

My hosting company tells me that the problem is simply that we have outgrown a shared-server environment and that we need to upgrade to a dedicated server. I guess this makes SOME sense, but doesn't seem to me to be consistent with all the facts of what's been happening. Basically, I just can't see why too many open connections to MySql should affect the server's ability to serve web pages or simply route emails for us.

Part Two (obviously related to Part One)

Right now, I don't actively close connections to the database as a user navigates my site. There are many pages on the site, and many navigation paths through it which require reads and writes from the database, and my understanding from the manuals was that actively closing the connections is unnecessary as PHP does this automatically when appropriate. I guessed that closing them manually, for example each time a user leaves one web page and goes to another, would simply slow things down as a new connection would have to be made within a second of closing one down.

Is my understanding of this wrong? It seems relevant to ask, as I'd like to find a way of avoiding "maxing out" even if the "maxing out" in itself is not the reason for the rest of the site going down.

These might be simple questions to those of you far more skilled than I am, but I'd really like a comprehensive answer, and I'd really like to permanently resolve the issues over the next couple of days. Accordingly, I think this is worth 500 points.

Any help greatly appreciated. I had a pretty bad experience of not managing to get an effective answer to a question (not related to this problem) on this forum on my last attempt, so I'm looking to restore my faith in the usefulness of EE.

Many thanks

Grummy

fiboCommented:
Grummy,

I assume that most of your pages display content extracted from your database... which would imply that each time you open a page you currently create a new connection to the db.
Of course, MySQL and PHP close these open links after some time... but in the meantime the connections do accumulate.
So avoiding that is certainly a route to explore, whatever the "size" of your server.
At the same time, you might have a problem with the number of sessions you have simultaneously open.
I think I would start from here, checking that I properly reuse sessions as needed.
Then I would explore 2 directions for MySQL:
- use a "permanent connection" to MySQL. This is something I do not master personally... but it should help you to have a look at http://fr3.php.net/manual/fr/function.mysql-pconnect.php. This would allow you to open just one connection per session (and user).
- explicitly close the connection when possible. This would limit the number of simultaneous open connections.
My preferred solution would be the last one, since it should accept a higher number of simultaneous users.
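The two directions can be sketched roughly as follows. This is only an illustration, using the old `mysql_*` API the thread is about (removed in PHP 7); hostnames, credentials, and table names are placeholders:

```php
<?php
// Direction 1: a persistent connection -- the link is not torn down when
// the script ends, so later requests can reuse it instead of reconnecting.
$link = mysql_pconnect('localhost', 'dbuser', 'dbpass');

// Direction 2: an ordinary connection, closed explicitly as soon as the
// page's queries are done, rather than waiting for PHP's automatic
// cleanup at the end of the script.
$link = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('mydb', $link);
$result = mysql_query('SELECT title FROM polls', $link);
// ... use $result ...
mysql_close($link);   // frees the server-side connection immediately
?>
```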
 
PromethylCommented:
Also consider caching page content so the database calls occur less often. You can cache it to disk, or to an in-memory acceleration program.
 
PromethylCommented:
PS: If you have a host that's maxed out, ask to switch to a less loaded box. If they refuse, switch webhosts. You get what you pay for.
 
MenritCommented:
try to configure your my.cnf to support more connections as follows:

try changing the next 2 parameters so that they fit your needs

max_user_connections
max_connections
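For anyone with access to the server's my.cnf, the change would look something like this sketch (the values are purely illustrative, not recommendations):

```ini
# my.cnf fragment -- example values only
[mysqld]
max_connections      = 200   # server-wide ceiling across all accounts
max_user_connections = 50    # ceiling per MySQL account
```

Note that both require restarting (or reconfiguring) the MySQL server, which on shared hosting only the host can do.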
 
PromethylCommented:
"I've recently experienced site availability problems, whereby my host could not be reached either by trying to load web pages O"

When he says, "my host" the assumption is made he's on shared hosting, and these options are not available to him.
 
fiboCommented:
As additional work, you might also consider improving your MySQL queries, so that everything is on your side and you can tell your host provider "I've done my homework, now do yours".
The usual good candidates for improvement are indexes: reconsider them according to your heaviest queries. Try the "explain" command followed by your usual MySQL query; it might give you some clues to possible huge improvements.
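For example, assuming a guestbook table (the table and column names here are invented for illustration):

```sql
-- EXPLAIN reports which indexes (if any) MySQL will use and roughly how
-- many rows it expects to scan for this query.
EXPLAIN SELECT name, message
FROM guestbook
WHERE approved = 1
ORDER BY posted_at DESC
LIMIT 20;
```

A large "rows" estimate with no key listed usually points to a missing index on the filtered or sorted columns.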

Looking at the "php.net URL" I gave you: it includes FR in several places, which is something rather automatic with php.net. Chances are quite high that the URL I've given will in fact be changed and adapted to your own settings.
 
PromethylCommented:
My recommendation is eAccelerator. We're having a conversation parallel to this one about PHP file and db caching.

http://www.experts-exchange.com/Web/Web_Languages/PHP/Q_21383087.html#13747020

Has docs + example. It's worth a shot.
 
GrummyAuthor Commented:
Thanks guys, I've tried to explore the things you've mentioned, although I can't seem to find a pconnect reference that isn't full of French - which I don't understand too well!

I guess the first knotty question though is whether I'm being told the truth by my current host. As you say, I'm in a shared environment, which prevents me from simply changing the max_user_connections and max_connections parameters. Even so, it doesn't make a whole lot of logical sense to me that "maxing out" on db connections would bring down my whole website. Why wouldn't the pages load as normal, other than throwing errors for the page elements that require MySql connections?

Tech support at my host seems to think that they have a setting of 30 max_user_connections, which seems pretty dismal to me. Is that normal in a shared environment?

My hunting around the php manuals (the ones in English) seems to suggest that I shouldn't have to worry too much about actively closing the connections. We pull about 2,500 visitors per day right now, and the site appears to be quite "sticky", so it doesn't seem unreasonable to have 30+ simultaneous active connections to the database. I suppose what I mean by that is this doesn't look like evidence that my code is so inefficient that I'm leaving "dead" connections open unnecessarily.

Does that logic make sense to you guys?

Thanks again for your input.

Grummy
 
PromethylCommented:
Persistent connects are just that. If two people view your website one after another or at the same time, it only uses 1 MySQL connection. The old method uses 2.

What I think is happening is the domino effect. Failed queries cause more load on the PHP program (error messages, etc). I think load on your Apache is too high. With load too high on the box, yes, no page at all will come up.
 
30 is fine. What are the chances you'll ever go over that? Unless, of course, you have many many users viewing simultaneously. All that means is you can only have 30 viewers at any given time. (May indicate they're cheap.)

The connection closes automatically. Closing the connection manually may only save you a few moments.

I would say you are working logically so far.

Try pconnect first, since it requires only one change.

Then consider optimizing SQL queries, caching SQL queries.

I really did mean you get what you pay for. Try to fix it, and then consider your options.
 
fiboCommented:
1 - the exact link would come from http://www.php.net/manual/function.mysql-pconnect.php , and would be adapted to your own language.
2 - If connections are used for a long time AND you have "sticky users", then you need to actively go the mysql_close route....
 
virmaiorCommented:
Promethyl - I'm somewhat sure that pconnecting has the opposite impact to what you're suggesting.
what's probably happening is that you are using many mysql_connect() calls in your code...
if they are all to the same DB, then I would recommend just putting it at the top of the page, it will automatically close when your page is done loading.  a pconnect won't close when your page is done loading and thus just makes the problem worse.
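A sketch of that single-connection-at-the-top approach, with made-up credentials and table names, might look like this:

```php
<?php
// Open one connection at the top of the page (or in a shared include)
// and let every query on the page reuse it. PHP closes the link
// automatically when the script finishes.
$link = mysql_connect('localhost', 'dbuser', 'dbpass')
    or die('Could not connect: ' . mysql_error());
mysql_select_db('mydb', $link);

// Later code just calls mysql_query(); with a single open link the
// extension uses it by default -- no further mysql_connect() calls needed.
$polls = mysql_query('SELECT * FROM polls');
?>
```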

 
GrummyAuthor Commented:
I have to say I was drawn to the same conclusion - it could make matters worse - as I read through the php manual in a bit more depth. But I figured that you guys know a lot more about this than I do, so I've been kinda scratching my head wondering where I was going wrong.

To go to Promethyl's comment (3 up from here), my understanding was that a connection was closed once the php script calling it had completed, therefore actively closing it seemed unnecessary. But if I use pconnect, I guess that same connection will be held open for as long as the user's session remains active, whether or not she navigates from page to page on the site. Right now, I think (I'm still not sure) that a user going from one page to another will open and close connections as each page is displayed.

Hence, my hunch was that pconnect seemed like a step in the wrong direction. But, on the other hand, I have little doubt that Promethyl knows a lot more about this than I do!

Puzzling...

G
 
virmaiorCommented:
well i tried using pconnect to fix another problem (I wanted to use temp tables to produce a certain type of result without having to recalculate on every page load), but the net result was that it just kept adding connections that wouldn't die appropriately and that I couldn't find when I was looking for them.  (I couldn't find a way to send the connection through a session variable).

do you have multiple mysql_connect() in your page?
I had that for a while and was running into strange issues so I just centralized it as the first function in my include page. Since then, it has been smooth sailing. (since even when there are many users, the number of pages being served at any one moment isn't that high).

 
GrummyAuthor Commented:
The way I do it currently is to use a function to establish a connection_id returned by mysql_connect, then check for the existence of connection_id before each subsequent database read/write on the same page.

That way, I figured that I'd open one connection that would persist for the duration of the script, then it would close itself without my having to worry about it. I've had no problems with this until my site started crashing last week.
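A guess at the sort of helper being described here (the function and variable names are invented; credentials are placeholders): it opens one connection per request and hands back the same link on every subsequent call, so each page uses at most one connection.

```php
<?php
function get_connection_id() {
    // Remember the link between calls within this one request.
    static $connection_id = null;
    if ($connection_id === null) {
        $connection_id = mysql_connect('localhost', 'dbuser', 'dbpass');
        mysql_select_db('mydb', $connection_id);
    }
    return $connection_id;
}

// Every read/write on the page reuses the same connection:
$result = mysql_query('SELECT * FROM guestbook', get_connection_id());
?>
```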

Truth is, I don't know if 30 open connections seems like a lot for a site that's having maybe 2,500 users a day hitting it. It doesn't sound like a lot to me at first thought, but on the other hand - if the connections are active ONLY while the php script runs before the page is served to the browser (a few milliseconds?) - then maybe it is actually a heck of a lot!

I'm trying to get a "feel" for what might be going wrong here, and I think I must conclude I've lost a bit of trust in my web host to tell me the truth.

G
 
virmaiorCommented:
that seems somewhat likely to be honest.
30 might be plenty.
since it sounds like you're implementing the right sort of system
 
PromethylCommented:
I stand corrected on pconnect usage. You would want to retain its connection ID, and reuse it? Makes sense to me.

I still would say use caching wherever possible to lighten the load on the MySQL server in particular and the box in general.

I used systems like this on my box, and it was the correct/right action at the time.
 
ShelfieldCollegeCommented:
I also agree that caching pages wherever possible is definitely the way to go. PHP will close MySQL connections automatically at the end of processing the current file (as far as I know), unless you're using a persistent connection, obviously. One idea would be to open your connection, retrieve data from the database, then close the connection before you continue processing the data; this way the connection doesn't remain open until the end of the file, it's closed as soon as it's no longer needed.
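That open/fetch/close idea could be sketched like this (placeholder credentials and table names; old `mysql_*` API as used elsewhere in the thread):

```php
<?php
// Hold the connection only while rows are being read, then release it
// before any slower processing or page rendering happens.
$link = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('mydb', $link);

$rows = array();
$result = mysql_query('SELECT name, vote FROM poll_votes', $link);
while ($row = mysql_fetch_assoc($result)) {
    $rows[] = $row;          // copy everything we need into PHP memory
}
mysql_close($link);          // connection released before processing

// ... format and display $rows with the connection already closed ...
?>
```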

Probably the preferred option would be to use persistent connections and caching.

Cheers
 
fiboCommented:
I believe that you should not rely on PHP closing the connection "at the end of the page", since in fact the server has no way to know that you have opened another page (eg, there are includes etc).
So it uses some default behaviour: my "blind guess" would be that after some minutes it closes the link... but this can certainly be improved by closing the link explicitly...
We all know that "implicit behaviours" have random effects, since they can have different implementations on different machines....
Another question to Grummy: are your pages "semi-static", ie with few changes from one day to the other?
 
virmaiorCommented:
it closes the moment its done processing the page
since php is only running during a page request.
so I don't think we're talking minutes, more like milliseconds.
 
GrummyAuthor Commented:
Milliseconds was what I was thinking too. So, if I've got about 2,500 people hitting the site per day, that's one every 30 seconds on average. If each script takes a few milliseconds to run, what are the chances of "maxing out" on connections if max_user_connections is set at 30?  About the same as winning the lottery, I would have thought!
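The back-of-envelope arithmetic above can be checked with a tiny calculation, assuming traffic is spread evenly across the day (real traffic is bursty, so peaks will be higher):

```php
<?php
// Numbers taken from the thread: ~2,500 visitors/day, a few ms per script.
$visitors_per_day = 2500;
$script_seconds   = 0.005;                       // assume ~5 ms per page
$arrival_rate     = $visitors_per_day / 86400;   // ~0.03 requests/second
// Little's law: average open connections = arrival rate x connection lifetime
$avg_open = $arrival_rate * $script_seconds;     // ~0.00014 -- nowhere near 30
echo $avg_open;
?>
```

Even if peak traffic were 100x the daily average, the expected number of simultaneous short-lived connections would still be far below 30.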

So, with your collective help, I think I'm coming to the conclusion that I can't have genuinely "maxed out", and "maxing out" is not the real reason for my site having crashed. But it's what my host is telling me... hmm.

To answer fibo's question, I wouldn't call my pages semi-static in the sense that they remain unchanged for many days. During development and tuning of the site, I guess I change stuff every day!  Why do you ask?

So, just to kind of get us back to the original two-part question, I think this is what we've collectively concluded:

1. The site's crashing was probably NOT caused by too many simultaneous users causing the max_user_connections to trip out. Which means that something else caused the crashes, and a side-effect was that the system started throwing error messages suggesting that it had maxed out. Which in turn suggests that the crash may have prevented connections from closing in some way?

2. There appears to be no need for me to change my code to close the connections manually, because each connection is only open for a few milliseconds and the chances of 30+ users sharing those same few milliseconds is very small.

Do those conclusions seem right to you guys?

Finally, I think I should conclude that my existing host is not keen to get to the bottom of this issue, which to me is a critical one because it brought my site down and lost us some customers. The answers they've given to my questions don't seem to stack up. Is this commonplace in a shared hosting environment, or did I just make an unfortunate choice of host?

Grummy
 
cpucciCommented:
I believe the site being down and the max_connections to be two totally separate issues. I run a hosting company as well as many large websites and have dealt with this before. In a shared hosting environment the max number of connections is usually set pretty low to prevent malicious activity. It could very well be the truth that your site creates too many connections to the database and therefore you receive the error messages you mentioned. The site being totally unreachable, however, is an issue that should not be dependent on your database. If the site is down and you can't ping the server, it's your host's problem, not the amount of connections you are making to the database.
 
GrummyAuthor Commented:
First of all, apologies for the hiatus in this thread - it's taken a lot of time to get any sort of reasonably concerned response from my web host.

By a combination of making adjustments based on some of the good advice from several of you in the exchanges above AND my web host's moving my site to a less-loaded box, the issue seems either to have gone away or at least gone into remission. The problem has not re-occurred for a couple of weeks now.

So I'll close this thread now with much gratitude. It's a difficult one in which to award appropriate points to appropriate people, so I hope those of you who contributed at all will accept my genuine gratitude for your involvement.

Grummy
