Solved

Apache not using its full download potential.

Posted on 2004-08-03
5
197 Views
Last Modified: 2010-03-04
It seems that no download request will transfer a file at a rate higher than 40 kB/s.  I have a program called GetRight that opens multiple connections to a server to achieve a faster overall download speed.  Using GetRight with 2 connections, I get 80 kB/s.  Obviously there's some sort of per-connection cap, but I'm not sure how to change it.  40 kB/s seems a bit slow...
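A quick way to confirm that the cap is per-connection is to time a single-connection transfer with a command-line client and compare it against two parallel transfers. A sketch using curl; the server URL mentioned in the comments is hypothetical, and the local file:// transfer below just demonstrates the measurement, with curl's own `--limit-rate` standing in for the suspected server-side cap:

```shell
# Create a small local test file (illustrative; any file on the server works).
dd if=/dev/zero of=/tmp/testfile.bin bs=1k count=200 2>/dev/null

# Measure single-connection throughput. Against the real server you would use
# its URL, e.g. http://myserver.example/testfile.bin (hypothetical name).
# --limit-rate 40k simulates the kind of per-connection cap being diagnosed.
curl -s --limit-rate 40k -o /dev/null -w 'speed: %{speed_download} bytes/s\n' \
     file:///tmp/testfile.bin
```

If two parallel runs of the same command together move data roughly twice as fast as one run, the bottleneck is per-connection rather than total bandwidth.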
0
Comment
Question by:gunghoassassin
5 Comments
 
LVL 9

Expert Comment

by:ronan_40060
ID: 11712753
Hello,
It's likely due to the load on the download server at http://apache.org.
Since Apache is a very popular web server, many people download it every second, so connection speeds are bound to vary between high and low.
Yes, GetRight is a good download program.
It also depends on the type of internet connection you have.

ronan
0
 

Author Comment

by:gunghoassassin
ID: 11715834
I'm talking about my own server, which I've installed out of the box, so to speak.
0
 
LVL 9

Expert Comment

by:ronan_40060
ID: 11717850
Well, as far as I know it all depends on the load on your web server and on the internet connection.
Maybe another expert here has a better suggestion.
0
 

Author Comment

by:gunghoassassin
ID: 11717861
It's on a 100 Mbit connection, and the load wasn't high at all because I was doing a test download.
0
 
LVL 9

Accepted Solution

by:
ronan_40060 earned 500 total points
ID: 11718128
That's a good internet connection.
What OS are you running?
Good documentation on Apache performance tuning is available at
http://httpd.apache.org/docs/misc/perf-tuning.html
0
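Worth adding: a stock Apache build does not impose a per-connection throughput cap, so a hard 40 kB/s ceiling usually points at a bandwidth-limiting module, a proxy in the path, or the OS/network tuning covered in the perf-tuning document. As one illustration only (not something confirmed in this thread): on modern Apache 2.4 the bundled mod_ratelimit produces exactly this kind of per-connection limit, and the `/downloads` path below is hypothetical:

```apache
# Requires: LoadModule ratelimit_module modules/mod_ratelimit.so  (Apache 2.4+)
<Location "/downloads">
    # Throttle each response on this path to ~40 KiB/s per connection.
    # Removing this block (or the rate-limit variable) lifts the cap.
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 40
</Location>
```

If no throttling directive like this appears in httpd.conf or any included .htaccess file, the limit is more likely coming from the network path than from Apache itself.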
