Solved

Limit the number of simultaneous connections that an authenticated user can make in Apache 1.3.29

Posted on 2004-09-04
1
518 Views
Last Modified: 2010-03-04

Hi all!

I have a server running Apache 1.3.29 and I would like to limit the number of concurrent connections that an authenticated (logged-in) user can open simultaneously, to prevent the overhead of download managers opening e.g. 20 connections per downloaded file.

So let's say I would like to limit the maximum number of concurrent connections to 4 per user. How can this be done?

Thanks in advance!!
Demien
Question by:demienx
1 Comment
 
LVL 15

Accepted Solution

by:
samri earned 500 total points
ID: 11983457
Hi Demienx,

You may want to take a look at mod_throttle (http://www.snert.com/Software/mod_throttle/), which should be able to do what you are looking for.

Another option would be bwshare: http://www.topology.org/src/bwshare/README.html

I would personally opt for mod_throttle.

For other Apache modules, see http://modules.apache.org/
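For context, mod_throttle is configured through directives in httpd.conf. The sketch below is only illustrative: the directive names and argument order (ThrottlePolicy, the module and library names) are recalled from the mod_throttle 3.x documentation and should be verified against the module's own README before use.

```apache
# Hypothetical mod_throttle sketch for Apache 1.3 -- directive names
# and argument syntax are assumptions; check the mod_throttle README.

# Load the module (DSO build; path depends on your install).
LoadModule throttle_module libexec/mod_throttle.so
AddModule  mod_throttle.c

<IfModule mod_throttle.c>
    # Cap requests in the download area over a short accounting
    # window, so a download manager opening many parallel streams
    # is throttled back.
    <Location /downloads>
        ThrottlePolicy Request 4 10s
    </Location>
</IfModule>
```

Note that mod_throttle accounts by policy (requests, volume, speed) over a period rather than counting open sockets directly, so a "4 concurrent connections" limit is approximated by a low request quota per window; test against a real download manager to tune the numbers.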

Cheers.
