Web Server Woes - Corrupted communication or config error?

OK, basic background first.

I run a pretty simple web hosting business.  Fairly small as they go.  I have 6 servers at a datacenter in Florida.  I'm currently having a problem with 3 of the 6 servers.

2 of the 3 servers are dedicated to a single user.  Each is running Fedora Core 8.

Here's the versions of relevant software on each:

Server 1 (sugar):
Apache 2.2.14 (worker MPM)
PHP 5.2.11 (running through Apache's FastCGI module)

Server 2 (lily)
Apache 2.2.15 (worker MPM)
PHP 5.2.13 (running through Apache's FastCGI module)

The third server is a shared server with about 20 users.  It is also running Fedora Core 8.

Server 3 (aurora)
Apache 2.2.8 (worker MPM)
PHP 5.2.6 (running as plain CGI, not FastCGI)

I'm experiencing nearly the same problem on all three servers.  Let me preface by saying that, prior to about a week ago, there were NO reported problems.  However, since last week, I've been getting reports of the following:

  ZIP file downloads are corrupted

  Web pages, whether static or dynamic, will sometimes render improperly (see image sugar1.JPG)

  Images embedded in web pages will sometimes render improperly (see image sugar2.JPG)

  A few times, Firefox has given a "Content Encoding Error" message (see image sugar3.jpg).  Please note that ALL compression on the servers is disabled: the Apache deflate module is disabled, and PHP zlib compression is disabled (a quick way to double-check that from the outside is sketched just below this list).  In all cases of this error, refreshing the page caused it to load properly.
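
Something like the following should confirm the no-compression claim (the hostname and file paths are just placeholders for one of the affected sites):

  # ask for gzip explicitly; if compression is really off, no Content-Encoding header comes back
  curl -sI -H "Accept-Encoding: gzip,deflate" http://www.example.com/index.html | grep -i content-encoding

  # on the server: confirm mod_deflate isn't loaded and PHP's zlib output compression is off
  httpd -M 2>&1 | grep -i deflate
  grep -i "zlib.output_compression" /etc/php.ini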

Now, what's really strange is that neither I, nor anyone else I know who has a computer, is able to reproduce this problem.  Of the thousands of users viewing these sites, these issues only seem to happen to a handful of people.  In some cases, upgrading the end user's browser has solved the issue, but in many it hasn't.  We've cleared cookies and caches, and rebooted cable modems and DSL modems.

None of the server software has been updated recently.  There are no auto-updates enabled.

As you can see from the images, it has to be something pretty low level to cause that kind of corruption in a simple, HTML-only web page.

In addition, I've engaged the datacenter support staff who've run every test they know on the network equipment and found no issue, and even moved one of the servers to a different rack on a different switch which didn't solve the problem.

I'm thinking it almost has to be one of two things: either a common network-segment issue, or a common server configuration issue.  Though I'm still at a loss as to why it would only have started manifesting recently.

I'm at my wits end here and would greatly appreciate ANY help or suggestions.

Jason
Attachments: sugar1.JPG, sugar2.jpg, sugar3.jpg

eoh-jason asked:

StefanLambda commented:
You could try to analyse the HTTP data with Wireshark (http://www.wireshark.org/) on the client (and on the server).
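
On the server side it may be easier to capture with tcpdump and open the capture in Wireshark afterwards. A rough sketch only (the interface name eth0 and the output path are assumptions, adjust for your box):

  # capture full packets on port 80 for later inspection
  tcpdump -i eth0 -s 0 -w /tmp/port80.pcap 'tcp port 80'
  # then open /tmp/port80.pcap in Wireshark and use "Follow TCP Stream"
  # on a request for one of the corrupted files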

eoh-jason (Author) commented:
Stefan;

I could run it on the server, but it'd be like looking for a needle in a haystack.  These servers get hundreds, if not thousands, of page views per minute.  Since I can't replicate the problem, I would need to find a way to get an end user who is experiencing the problem to let me remote into their machine and install Wireshark.  In short, pretty much impossible.
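
(That said, if I can ever pin down one affected user's IP address, a narrowed server-side capture would at least keep the haystack manageable. Rough sketch; the IP 203.0.113.50 and the interface eth0 are just placeholders:)

  tcpdump -i eth0 -s 0 -w /tmp/oneclient.pcap 'tcp port 80 and host 203.0.113.50'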

StefanLambda commented:
I think the problem comes from gzip or x-gzip compression. Did you try with several web browsers?

Check the following settings : http://www.tech-problems.com/gzip-compression-in-apache-2-on-fedora/

eoh-jason (Author) commented:
Stefan;

As posted in my original question, both Apache compression and PHP compression are DISABLED on all servers.

In general, the people experiencing the problem will have a page work fine in 3 different browsers, and then a minute later the same page will have a problem in all 3.  There are times when the same page loaded in IE, FF, and Chrome will display just fine in IE but not in FF or Chrome; then, after refreshing, it displays fine in Chrome but is corrupted in IE and FF.  In short, it's apparently totally random and totally browser independent.

Jason

StefanLambda commented:
Sorry, I don't think I can be much more help (I'm not strong enough on this, but I tried). Another idea is to put a caching proxy (like http://www.squid-cache.org/) in front of Apache to offload your web server.
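
Something along these lines in squid.conf would put Squid in front of Apache as a reverse proxy (a minimal sketch only, assuming the Squid 2.6+ accelerator syntax; the hostname is a placeholder and Apache would move to port 8080):

  # Squid answers on port 80 and fetches from Apache on 127.0.0.1:8080
  http_port 80 accel defaultsite=www.example.com
  cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache
  acl oursite dstdomain www.example.com
  http_access allow oursite
  cache_peer_access apache allow oursite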

Good luck,

Stefan

eoh-jason (Author) commented:
OK, some more updated information

After much testing, I have learned something very important. I'm not sure yet what it means; I am exhausted and will investigate further in the morning. Just thought I'd pass it along in case you guys have an idea.

Here's what I found. I now have access to a computer that is able to replicate the initial issue of downloads/web pages being corrupted. After many long hours, I have determined that the problems only happen to traffic on port 80. If I try to download a ZIP file over the standard port 80, it is corrupted EVERY time. However, if I download it over SSL/HTTPS, it downloads perfectly every time. The same exact file. Also, if I set Apache to also listen on port 281 and download the file via normal HTTP but over port 281, it downloads perfectly every time. So the problem only happens on port 80.
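
(For the record, the comparison boiled down to something like this; the hostname and file name are placeholders, and the comments note what I saw:)

  wget -q -O /tmp/zip-80.zip   http://www.example.com/file.zip       # plain HTTP, port 80  -> corrupted every time
  wget -q -O /tmp/zip-281.zip  http://www.example.com:281/file.zip   # plain HTTP, port 281 -> fine every time
  wget -q -O /tmp/zip-ssl.zip  https://www.example.com/file.zip      # HTTPS               -> fine every time
  md5sum /tmp/zip-80.zip /tmp/zip-281.zip /tmp/zip-ssl.zip           # only the port-80 copy has a different checksum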

Ready? Set? Go!!! (G'night)

Jason

StefanLambda commented:
Good news...

I got some tips from http://serverfault.com/questions/245057/apache-wont-serve-images-larger-than-2k:

For testing you can use: wget http://your.website.com:80/path/to/image.png
  --> that way you can test from anywhere with a single request (including from the server itself)
  --> and you can test with files of several sizes

You can also try this:
  add "EnableMMAP off" to /etc/apache2/httpd.conf
  add "EnableSendfile off" to the Apache config
  then restart Apache (a quick way to apply and re-test is sketched below)
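
Something like this, assuming the stock Fedora path /etc/httpd/conf/httpd.conf (adjust if your config lives elsewhere):

  # add the two directives to httpd.conf:
  #   EnableMMAP off
  #   EnableSendfile off
  apachectl configtest && apachectl graceful     # check syntax, then reload Apache
  wget -q -O /tmp/test.png http://your.website.com:80/path/to/image.png
  md5sum /tmp/test.png    # compare against the checksum of the same file in the document root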
 

eoh-jason (Author) commented:
With the new info, I'm going to repost a more concise question.

eoh-jason (Author) commented:
Closing this question to open a new one with more concise info from lessons learned.