Limit concurrent active requests per user on Apache

Is there a way, or a module, to limit the number of concurrent active connections per user (identified by the PHPSESSID cookie in the request header) at the Apache httpd level?
I can't do this at the PHP level, since PHP sessions are locked; there are never truly concurrent page requests within one session. One request executes while all others wait until the active request releases the session.
If there are more active requests than the defined limit, I'm fine with rejecting any further requests until the number of concurrent active requests for that user drops back below the limit.
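As an aside, one common PHP-side mitigation (a sketch, not something from this thread) is to release the session lock early with session_write_close() once the page has read what it needs; this only helps for pages that do not write to the session afterwards:

```php
<?php
// Sketch: release the session lock as early as possible so that other
// requests from the same browser are not serialized behind a slow page.
// Assumes the rest of the page only needs read access to session data.
session_start();

// Copy whatever the page needs out of the session first...
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// ...then release the lock. Concurrent requests with the same
// PHPSESSID can now acquire the session immediately.
session_write_close();

// Long-running work below no longer blocks the user's other requests.
```

This does not enforce a limit, but it removes the serialization that causes the waiting workers to pile up in the first place.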

If there is a way to exclude page requests matching certain URL patterns (e.g. jpg, css, js) from this limit, that would be fine with me.

The issue we are running into is that, seemingly at random, a single user's browser opens hundreds of connections that show up in "W" status in the Apache server status, yet not a single byte has been sent back to the browser. I assume that's because of the PHP session locking while the first request is slow or stuck.
Please provide more detail: are you using Linux or Windows? Which Apache version? Can we see phpinfo() and the Apache status page with hostnames/IPs hidden?
SWB-Consulting (Author) commented:
Even though I believe the OS and the Apache and PHP versions (I have seen this happen with PHP 5.2, 5.3 and 5.4) are pretty irrelevant to the issue and its solution, I have attached a screenshot of part of the Apache status page with hostnames, URLs and IPs blurred out.
The entries with the yellow background all come from one IP address and go to the same URL. Essentially all workers in "W" status are serving requests from that IP address and that browser.
This buildup happened within a little over 5 minutes.
This happens randomly with different PHP pages that use PHP sessions. It never happens with static files.
That's why I'm looking to limit the number of concurrent "active" connections using the same PHP session, or rather the same PHPSESSID cookie in the request header.

Also attached is a screenshot of graphs from our server monitoring: web server stats on the left, database server stats on the right. Other than the massive increase in workers in "sending reply" (W) status between 12:40 and 12:45, there is not much else going on. The little dip in memory utilization is from restarting Apache.
You are using RedHat 6 or a clone, and that is completely relevant to the problem.
You must enable KeepAlive (change the line to "KeepAlive On") in /etc/httpd/conf/httpd.conf and run "apachectl graceful" (no need to fully restart Apache for a small change).
Afterwards, clients will need fewer connections (and Apache workers), and misbehaving clients will not leave (as many) lingering connections behind.
Please save the Apache status and the output of "netstat -anp | grep httpd" before the change and again an hour later, to see whether it fixes the problem.
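For reference, a sketch of the relevant stanza, assuming the stock RHEL 6 httpd.conf (where KeepAlive ships disabled); the timeout values shown are illustrative, not prescribed:

```apache
# /etc/httpd/conf/httpd.conf
KeepAlive On               # reuse connections instead of opening one per request
MaxKeepAliveRequests 100   # requests served per connection before it is closed
KeepAliveTimeout 5         # seconds to wait for the next request on an idle connection
```

Then reload with "apachectl graceful" so existing requests finish undisturbed.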

If that does not bring enough relief, we can try modules:
mod_evasive rejects (403) requests past configurable thresholds.
mod_qos can operate on cookies or URLs.
Both are available for your Red Hat 6 from Fedora EPEL.
Neither will address the first, low-level problem (the RedHat default).
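As a hedged sketch of the mod_qos approach: mod_qos cannot key a limit on the cookie's value (i.e. per individual session), but it can cap concurrent requests that carry a session cookie at all, while exempting static assets. Directive names are from the mod_qos documentation; the variable name has_php_session and the limit of 10 are placeholders to tune:

```apache
LoadModule qos_module modules/mod_qos.so

# Flag requests that carry a PHPSESSID cookie...
SetEnvIf Cookie "PHPSESSID=" has_php_session

# ...but exempt static assets from the limit.
SetEnvIf Request_URI "\.(jpg|css|js)$" !has_php_session

# Reject requests once more than 10 flagged requests are in flight.
QS_EventRequestLimit has_php_session 10
```

Note this is a server-wide cap on session-bearing requests, not a per-session cap, so it bounds the damage one stuck session can do to the worker pool without distinguishing between users.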
SWB-Consulting (Author) commented:
KeepAlive is already enabled, as you can tell from the worker status board of the previously attached Apache status (some workers are in "K" status).

The problem occurs randomly and rarely: sometimes not for weeks, sometimes more than once per day. Sometimes it happens during low-traffic times (at night) and sometimes during high-traffic times during the day (>20k PHP page requests per hour).

I have looked through the mod_evasive and mod_qos documentation and was not able to find any reference to limiting the number of concurrent requests per visitor based on the session cookie rather than the IP address.
Indeed, what you found about the modules is right. There is none that limits connections per cookie in the request, because the request arrives one packet too late: the cookie is only seen after the connection has already been accepted.
I think mod_evasive could still help: measure the maximum number of connections per IP over a day, then allow a few more than that. Connections over the bar will then be short-circuited with a 403.

If using prefork -> consider the worker MPM
If using worker -> consider mod_fcgid
If using fcgid -> consider nginx
SWB-Consulting (Author) commented:
Your solution would still be IP-based. On some of our installations we use Cloudflare; as a result, multiple different connections may come from the same IP address but actually belong to completely independent visitors.

What about mod_interval_limit together with mod_usertrack (and memcached)? It seems to allow limiting the number of connections per client within a timeframe, based on a session cookie.
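For what it's worth, the mod_usertrack half of that combination would look roughly like the following (these are standard Apache mod_usertrack directives; the mod_interval_limit side is deliberately omitted here because that module's directives are not widely documented and vary by version):

```apache
LoadModule usertrack_module modules/mod_usertrack.so

# Issue a per-browser tracking cookie that a rate limiter could key on.
CookieTracking on
CookieName uid
CookieExpires "2 hours"
```

Any limiter keyed on this "uid" cookie would distinguish visitors behind a shared IP (e.g. Cloudflare), which is the property missing from the per-IP approaches above.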
Multiple connections from the same IP or with the same cookie are absolutely normal and acceptable under the HTTP standard.
That module has a single contributor; it may work only for them, so try others. You could even exempt Cloudflare from all limits.
SWB-Consulting (Author) commented:
I know and understand that multiple connections from the same IP and with the same cookie are absolutely normal and acceptable, but I need a solution that keeps this within certain limits.
Limiting the number of connections per IP address does not seem to be a solution for me.
If that module only has one contributor, are there any other options?
Install mod_evasive, set it to something like 20 connections per IP, and check the logs the next morning to see whom you denied. I don't get more connections than that even from a wild Nokia PDF reader over GPRS...
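A minimal mod_evasive configuration along those lines might look like this (directive names are from the mod_evasive README; all thresholds are guesses to be tuned against your own logs, per the advice above):

```apache
LoadModule evasive20_module modules/mod_evasive20.so

<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        10    # same URI from one IP per page interval
    DOSPageInterval     1     # seconds
    DOSSiteCount        50    # any URI from one IP per site interval
    DOSSiteInterval     1     # seconds
    DOSBlockingPeriod   10    # seconds the 403 response persists
    DOSLogDir           "/var/log/mod_evasive"
</IfModule>
```

Blocked IPs are logged to DOSLogDir, which is what you would review the next morning to check for false positives.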

SWB-Consulting (Author) commented:
I will give that a try.
Thank you.
Try to improve efficiency (see http:#40965035) too. It should not be a problem for any PHP setup to handle 100 requests at a time.
SWB-Consulting (Author) commented:

The server easily handles 400 PHP requests at a time. This is not a problem of handling PHP requests: except for the first one, all the others are stuck on the PHP session lock and not using any CPU time.
I read elsewhere that it could also be a symptom of a slowloris attack.