Grummy

asked on

How could exceeding max_user_connections bring down a website?

Part One

I've recently experienced site availability problems, whereby my host could not be reached either by trying to load web pages OR by accessing the MySQL database through the admin interface OR by trying to send emails through their server.

When the site came back up, we noted for a few minutes messages such as "error 1040 too many connections" and "error 1203 exceeded max_user_connections". These were visible in areas of the site that required connections with the MySQL database, such as where we provide polls or guestbook facilities for visitors.

My hosting company tells me that the problem is simply that we have outgrown a shared-server environment and that we need to upgrade to a dedicated server. I guess this makes SOME sense, but it doesn't seem to me to be consistent with all the facts of what's been happening. Basically, I just can't see why too many open connections to MySQL should affect the server's ability to serve web pages or simply route emails for us.

Part Two (obviously related to Part One)

Right now, I don't actively close connections to the database as a user navigates my site. There are many pages on the site, and many navigation paths through it which require reads and writes from the database, and my understanding from the manuals was that actively closing the connections is unnecessary as PHP does this automatically when appropriate. I guessed that closing them manually, for example each time a user leaves one web page and goes to another, would simply slow things down as a new connection would have to be made within a second of closing one down.

Is my understanding of this wrong? It seems relevant to ask, as I'd like to find a way of avoiding "maxing out" even if the "maxing out" in itself is not the reason for the rest of the site going down.

These might be simple questions to those of you far more skilled than I am, but I'd really like a comprehensive answer, and I'd really like to permanently resolve the issues over the next couple of days. Accordingly, I think this is worth 500 points.

Any help greatly appreciated. I had a pretty bad experience of not managing to get an effective answer to a question (not related to this problem) on this forum on my last attempt, so I'm looking to restore my faith in the usefulness of EE.

Many thanks

Grummy

Bernard Savonet

Grummy,

I assume that most of your pages display content extracted from your database... which would imply that each time you open a page you currently create a new connection to the db.
Of course, MySQL and PHP close these open links after some time... but in the meantime the connections do accumulate.
So avoiding that is certainly a route to explore, whatever the "size" of your server.
At the same time, you might have a problem with the number of sessions you have simultaneously open.
I think I would start from here, checking that I properly reuse sessions as needed.
Then I would explore 2 directions for MySQL:
- using a "permanent connection" to MySQL is something I do not master personally... but it should help you to have a look at http://fr3.php.net/manual/fr/function.mysql-pconnect.php. This would allow you to open just one connection per session (and user)
- explicitly close the connection when possible. This would limit the number of simultaneously open connections (see the sketch below).
My preferred solution would be the last one, since it should accept a higher number of simultaneous users.
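As a minimal sketch of those two options (the host, user, password and database names below are placeholders, not details from this thread):

<?php
// Option 1: persistent connection. It is NOT closed when the script ends;
// the same Apache/PHP process reuses it for later page requests.
$link = mysql_pconnect('dbhost', 'dbuser', 'dbpass');   // placeholder credentials
mysql_select_db('mydb', $link);

// Option 2: ordinary connection, closed explicitly as soon as the page's
// queries are finished, so the slot is freed immediately.
$link = mysql_connect('dbhost', 'dbuser', 'dbpass');
mysql_select_db('mydb', $link);
// ... run the page's queries here ...
mysql_close($link);
?>

With only a small number of user connections allowed, the explicit mysql_close() route keeps each slot occupied for the shortest possible time.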
Also consider caching page content so the database calls occur less often. You can cache it to disk, or to an acceleration program that keeps it in memory.
PS: If you have a host that's maxed out, ask to switch to a less loaded box. If they refuse, switch webhosts. You get what you pay for.
Menrit

Try to configure your my.cnf to support more connections, as follows:

Try changing the next two parameters such that they fit your needs:

max_user_connections
max_connections
"I've recently experienced site availability problems, whereby my host could not be reached either by trying to load web pages O"

When he says "my host", the assumption is that he's on shared hosting, and these options are not available to him.
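Even when those settings cannot be changed on a shared host, they can at least be read, which helps when checking what tech support claims. A small sketch (the credentials are placeholders):

<?php
// Read the server's current connection limits; on shared hosting you
// normally cannot change these, but you can see what they are set to.
$link   = mysql_connect('dbhost', 'dbuser', 'dbpass');   // placeholder credentials
$result = mysql_query("SHOW VARIABLES LIKE 'max%connections'", $link);
while ($row = mysql_fetch_assoc($result)) {
    // prints max_connections and max_user_connections with their values
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}
mysql_close($link);
?>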
SOLUTION from Bernard Savonet (content available to Experts Exchange members only)
My recommendation is eAccelerator. We're having a conversation parallel to this one about PHP file and db caching.

https://www.experts-exchange.com/questions/21383087/global-persistent-variables-and-objects.html#13747020

Has docs + example. It's worth a shot.
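eAccelerator has its own API, which isn't shown in this thread; purely as an illustration of the general page-caching idea, here is a plain file-based sketch (the cache directory and five-minute lifetime are assumptions):

<?php
// Illustration of page caching in general, NOT the eAccelerator API.
// Assumes a writable cache/ directory and a 5-minute cache lifetime.
$cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$max_age    = 300; // seconds

if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
    readfile($cache_file);   // serve the cached copy; no database connection needed
    exit;
}

ob_start();                  // generate the page as usual (database queries etc.)
// ... normal page generation here ...
$html = ob_get_contents();
ob_end_flush();              // send the page to the browser
file_put_contents($cache_file, $html);   // keep a copy for the next visitor
?>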
Grummy

ASKER

Thanks guys, I've tried to explore the things you've mentioned, although I can't seem to find a pconnect reference that isn't full of French - which I don't understand too well!

I guess the first knotty question though is whether I'm being told the truth by my current host. As you say, I'm in a shared environment, which prevents me from simply changing the max_user_connections and max_connections parameters. Even so, it doesn't make a whole lot of logical sense to me that "maxing out" on db connections would bring down my whole website. Why wouldn't the pages load as normal, other than throwing errors for the page elements that require MySql connections?

Tech support at my host seems to think that they have a setting of 30 max_user_connections, which seems pretty dismal to me. Is that normal in a shared environment?

My hunting around the php manuals (the ones in English) seems to suggest that I shouldn't have to worry too much about actively closing the connections. We pull about 2,500 visitors per day right now, and the site appears to be quite "sticky", so it doesn't seem unreasonable to have 30+ simultaneous active connections to the database. I suppose what I mean by that is this doesn't look like evidence that my code is so inefficient that I'm leaving "dead" connections open unnecessarily.

Does that logic make sense to you guys?

Thanks again for your input.

Grummy
SOLUTION (content available to Experts Exchange members only)
1 - the exact link would come from http://www.php.net/manual/function.mysql-pconnect.php, and would be adapted to your own language.
2 - If connections are used for a long time AND you have "sticky users", then you need to actively go down the mysql_close route....
Promethyl - I'm somewhat sure that pconnecting HAS the opposite impact to what you're suggesting.
What's probably happening is that you are using many mysql_connect() calls in your code...
If they are all to the same DB, then I would recommend just putting one at the top of the page; it will automatically close when your page is done loading. A pconnect won't close when your page is done loading and thus just makes the problem worse.

Grummy

ASKER

I have to say I was drawn to the same conclusion - it could make matters worse - as I read through the php manual in a bit more depth. But I figured that you guys know a lot more about this than I do, so I've been kinda scratching my head wondering where I was going wrong.

To go to Promethyl's comment (3 up from here), my understanding was that a connection was closed once the php script calling it had completed, therefore actively closing it seemed unnecessary. But if I use pconnect, I guess that same connection will be held open for as long as the user's session remains active, whether or not she navigates from page to page on the site. Right now, I think (I'm still not sure) that a user going from one page to another will open and close connections as each page is displayed.

Hence, my hunch was that pconnect seemed like a step in the wrong direction. But, on the other hand, I have little doubt that Promethyl knows a lot more about this than I do!

Puzzling...

G
ASKER CERTIFIED SOLUTION (content available to Experts Exchange members only)
Grummy

ASKER

The way I do it currently is to use a function to establish a connection_id returned by mysql_connect, then check for the existence of connection_id before each subsequent database read/write on the same page.
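A hypothetical reconstruction of that helper, just to make the pattern concrete; the function, variable, table and credential names here are invented, not taken from the actual code:

<?php
// Hypothetical reconstruction of the approach described above.
// Names and credentials are invented for illustration only.
function get_connection_id() {
    global $connection_id;
    if (!isset($connection_id) || !$connection_id) {
        // first query on this page: open one connection and remember it
        $connection_id = mysql_connect('dbhost', 'dbuser', 'dbpass');
        mysql_select_db('mydb', $connection_id);
    }
    return $connection_id;   // later reads/writes on the same page reuse it
}

// every query goes through the same link; PHP closes it when the script ends
$result = mysql_query('SELECT question, votes FROM poll', get_connection_id());
?>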

That way, I figured that I'd open one connection that would persist for the duration of the script, then it would close itself without my having to worry about it. I've had no problems with this until my site started crashing last week.

Truth is, I don't know if 30 open connections seems like a lot for a site that's having maybe 2,500 users a day hitting it. It doesn't sound like a lot to me at first thought, but on the other hand - if the connections are active ONLY while the php script runs before the page is served to the browser (a few milliseconds?) - then maybe it is actually a heck of a lot!

I'm trying to get a "feel" for what might be going wrong here, and I think I must conclude I've lost a bit of trust in my web host to tell me the truth.

G
That seems somewhat likely, to be honest. 30 might be plenty, since it sounds like you're implementing the right sort of system.
I stand corrected on pconnect usage. You would want to retain its connection ID and reuse it? Makes sense to me.

I still would say use caching wherever possible to lighten the load on the MySQL server in particular and on the box in general.

I used systems like this on my box, and it was the right action at the time.
I also agree that caching pages wherever possible is definitely the way to go. PHP will close MySQL connections automatically at the end of processing the current file (as far as I know), unless you're using a persistent connection, obviously. One idea would be to open your connection, retrieve data from the database, then close the connection before you continue processing the data; this way the connection doesn't remain open until the end of the file, it's closed as soon as it's no longer needed.
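A minimal sketch of that "close it as soon as it's no longer needed" idea (the credentials, table and column names are invented, based only on the guestbook feature mentioned earlier):

<?php
// Pull everything needed into an array first, then release the connection
// before the (possibly slower) formatting work begins.
$link   = mysql_connect('dbhost', 'dbuser', 'dbpass');   // placeholder credentials
mysql_select_db('mydb', $link);
$result = mysql_query('SELECT name, message FROM guestbook ORDER BY id DESC', $link);

$entries = array();
while ($row = mysql_fetch_assoc($result)) {
    $entries[] = $row;
}
mysql_close($link);              // the connection slot is freed here

foreach ($entries as $entry) {   // rendering happens with no connection held open
    echo '<p>' . htmlspecialchars($entry['name']) . ': '
               . htmlspecialchars($entry['message']) . '</p>';
}
?>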

Probably the preferred option would be to use persistent connections and caching.

Cheers
I believe that you should not rely on PHP closing the connection "at the end of the page", since in fact the server has no way to know that you have opened another page (e.g. there are includes etc).
So it uses some default behaviour: my "blind guess" would be that after some minutes it closes the link... but this can certainly be improved by explicitly closing the link...
We all know that "implicit behaviours" have random effects, since they can have different implementations on different machines....
Another question to Grummy: are your pages "semi-static", i.e. with few changes from one day to the next?
It closes the moment it's done processing the page, since PHP is only running during a page request.
So I don't think we're talking minutes, more like milliseconds.
Grummy

ASKER

Milliseconds was what I was thinking too. So, if I've got about 2,500 people hitting the site per day, that's one every 30 seconds on average. If each script takes a few milliseconds to run, what are the chances of "maxing out" on connections if max_user_connections is set at 30?  About the same as winning the lottery, I would have thought!
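As a rough back-of-envelope check of that reasoning (only the 2,500 visitors/day figure comes from this thread; the pages-per-visit and per-script connection time are pure guesses):

<?php
// Back-of-envelope estimate of the average number of open connections.
$visitors_per_day   = 2500;   // figure quoted in this thread
$pages_per_visit    = 10;     // guess for a "sticky" site
$connection_seconds = 0.05;   // guess: each script holds its link ~50 ms

$requests_per_second  = ($visitors_per_day * $pages_per_visit) / 86400;
$avg_open_connections = $requests_per_second * $connection_seconds;

echo $avg_open_connections;   // roughly 0.01 - nowhere near a limit of 30,
                              // unless traffic is very bursty or scripts hold
                              // connections for much longer than a few ms
?>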

So, with your collective help, I think I'm coming to the conclusion that I can't have genuinely "maxed out", and "maxing out" is not the real reason for my site having crashed. But it's what my host is telling me... hmm.

To answer fibo's question, I wouldn't call my pages semi-static in the sense that they remain unchanged for many days. During development and tuning of the site, I guess I change stuff every day!  Why do you ask?

So, just to kind of get us back to the original two-part question, I think this is what we've collectively concluded:

1. The site's crashing was probably NOT caused by too many simultaneous users causing the max-user_connections to trip out. Which means that something else caused the crashes, and a side-effect was that the system started throwing error messages suggesting that it had maxed out. Which in turn suggests that the crash may have prevented connections from closing in some way?

2. There appears to be no need for me to change my code to close the connections manually, because each connection is only open for a few milliseconds and the chances of 30+ users sharing those same few milliseconds is very small.

Do those conclusions seem right to you guys?

Finally, I think I should conclude that my existing host is not keen to get to the bottom of this issue, which to me is a critical one because it brought my site down and lost us some customers. The answers they've given to my questions don't seem to stack up. Is this commonplace in a shared hosting environment, or did I just make an unfortunate choice of host?

Grummy
SOLUTION (content available to Experts Exchange members only)
Grummy

ASKER

First of all, apologies for the hiatus in this thread - it's taken a lot of time to get any sort of reasonably concerned response from my web host.

By a combination of making adjustments based on some of the good advice from several of you in the exchanges above AND my web host moving my site to a less-loaded box, the issue seems either to have gone away or at least gone into remission. The problem has not recurred for a couple of weeks now.

So I'll close this thread now with much gratitude. It's a difficult one in which to award appropriate points to appropriate people, so I hope those of you who contributed at all will accept my genuine gratitude for your involvement.

Grummy