

invisible apache

Posted on 2002-04-08
Medium Priority
Last Modified: 2012-06-27
I'm running Apache on a firewall box (I know, bad, bad, but I don't have another machine) and so to be as paranoid as possible, I'd like it to NOT send out 404 messages.  If someone probes my box with a generic port 80 request for say, index.html (which does not exist) instead of the exact name of one of my html files, Apache should not respond in any way.  I've spent some time digging through the httpd.conf file, but the closest I can find is an option to make your own message rather than Apache's generic 404 message.  I'd like no message at all.
Question by:magarity

Expert Comment

ID: 6926270
Do you want to redirect to another page based on that error?
If so, here is a solution:
ErrorDocument 404 http://www.domain.com/page.html (whatever page you want)

Other than that I don't know.
This is probably not the solution you are looking for.

LVL 15

Expert Comment

ID: 6927256
Another alternative is to use mod_rewrite to redirect the failing request to another webserver.

In Apache's URL Rewriting Guide, look under the "Redirect Failing URLs To Other Webserver" section.

--- exact copy from that section:

A typical FAQ about URL rewriting is how to redirect failing requests on webserver A to webserver B. Usually this is done via ErrorDocument CGI-scripts in Perl, but there is also a mod_rewrite solution. But notice that this performs worse than an ErrorDocument CGI-script!

The first solution has the best performance but less flexibility, and is less error-safe:

RewriteEngine on
RewriteCond   /your/docroot/%{REQUEST_FILENAME} !-f
RewriteRule   ^(.+)                             http://webserverB.dom/$1


The problem here is that this only works for pages inside the DocumentRoot. While you can add more conditions (for instance, to also handle home directories, etc.), there is a better variant:

RewriteEngine on
RewriteCond   %{REQUEST_URI} !-U
RewriteRule   ^(.+)          http://webserverB.dom/$1


This uses the URL look-ahead feature of mod_rewrite. The result is that it works for all types of URLs and is a safe way. But it does have a performance impact on the webserver, because every request triggers one more internal subrequest. So if your webserver runs on a powerful CPU, use this one. If it is a slow machine, use the first approach, or better, an ErrorDocument CGI-script.

I would think the ErrorDocument handler is much simpler.  But if you want more control over where the diversion goes, mod_rewrite would be good.
LVL 13

Author Comment

ID: 6929167
I tried redirecting it to both /dev/NULL and , but neither worked; I still get an error message about the page being missing rather than 'can't find server'.

LVL 15

Expert Comment

ID: 6930255
I presume you'll adopt the ErrorDocument approach.

The directive will take an HTML document, a CGI script, or anything else that is understood by the web client.  I.e., the data returned must be "web-enabled", following the standard, so to speak.

At this point I'm a bit confused about what you are trying to achieve.  Do you have a legitimate website hosted on that machine that you want the public internet to be able to reach, but which you want to block from robots?

Or do you not want people to probe your port 80 at all?

If you happen to have any web hosting running behind the firewall, and the firewall is opening port 80, then the protection method you are looking for really depends on how the firewall "secures" the port.  Methods I can think of now are:

(1) Relay TCP/80 to some internal server.
 - The direct datastream from the remote client is allowed through to your internal server, and only to port 80. OR
(2) Proxy the service for the internal server.
 - The firewall does the "fetching" and sends the pages/data back to the remote client.

Please elaborate.
LVL 13

Author Comment

ID: 6930418
"Do you have a legitimate website hosted on that machine that you want public internet to be able to reach?"

Yes and no.  It's just for my investment club.  The general public doesn't need to know what I'm doing with my money and my friends' money, or how much.

"Which you want to block from robots?"

Robots, spiders, and mainly script kiddies trying to hijack the box to use for DDoS fun.

"(1) Relay TCP/80 to some internal server."

The only other machine is a power-hungry TBird.  I'm hoping to use the low-power firewall box.  It is already set to a funky port, not 80, but a general scan by nmap shows the port open and accepting HTTP requests.  THAT is what I don't want; someone's random nmap scan should only show the ssh port, which is more secure than a web server port.
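A scan like the one described can be reproduced with nmap; the hostname is a placeholder here, and the funky web port is shown as 8080 for illustration:

```
nmap -sT -p 22,8080 firewall.example.com
```

A listening HTTP port will show up as open no matter what Apache answers for missing pages; only something in front of the socket (a packet filter) can make it look closed or filtered.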

"(2) Proxy the service for the internal server."

Hmm, I'm not sure what you mean by this, never mind how to do it.

Thanks for the suggestions so far.
LVL 15

Accepted Solution

samri earned 200 total points
ID: 6930502
If you run a web server it will bind to a certain TCP port, i.e. 80 by default for HTTP.  If you are using a different port, 8000 for example, it will still be visible if people attempt port scanning.

One approach to control such access is to use a firewall (like what you are looking at), or a TCP wrapper that only accepts connections from predefined hosts.  Depending on make/model/software, this can be achieved.  Some firewalls can even do attack prediction: they observe the requests, and if something looks like port-scanning activity, they block any incoming traffic from the originating address.
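As a sketch, classic TCP wrappers are driven by /etc/hosts.allow and /etc/hosts.deny. Note the assumption: this only protects services started via tcpd or linked against libwrap (sshd typically is; a stock Apache is not, so Apache itself would still need the firewall route). The addresses below are placeholders:

```
# /etc/hosts.deny -- refuse everything by default
ALL: ALL

# /etc/hosts.allow -- permit only known hosts
sshd: 192.168.1.0/255.255.255.0
```

With this pair in place, a wrapped service silently refuses connections from anyone not listed.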

Back to the initial query: ErrorDocument is limited in what it will take.

There is detailed documentation on Apache's website.  In the event of a problem or error, Apache can be configured to do one of four things:

1. output a simple hardcoded error message
2. output a customized message
3. redirect to a local URL-path to handle the problem/error
4. redirect to an external URL to handle the problem/error

In other words, redirecting to /dev/null will not work.
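The four behaviors map onto the directive roughly like this; the message text, local path, and external URL are illustrative only:

```apache
# 1. simple hardcoded error message (the built-in default -- no directive needed)
# 2. customized message
ErrorDocument 404 "Nothing to see here."
# 3. redirect to a local URL-path
ErrorDocument 404 /errors/404.html
# 4. redirect to an external URL
ErrorDocument 404 http://www.example.com/404.html
```

In every case the client still receives a well-formed HTTP response, which is why none of them hide the server.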

The four options available will still expose the machine to a remote scanner as an HTTP server, since whatever the webserver returns will be a uniform 501 or 404 error.

If you are really paranoid (as you mentioned), try looking at a TCP wrapper or a firewall solution.

Hope this sheds some light (or am I totally off the subject).

LVL 13

Author Comment

ID: 6935087
Doesn't look like there's an Apache way to do it, so I'm accepting this part of samri's comment as the answer, because that's what I'll end up doing:

"use firewall ... that only accept connection from a predefined hosts"

I was afraid this would end up being the answer before I asked; it's a pain to figure out all the club members' IPs and enter them in the iptables script.
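The iptables rules for that approach might look like the sketch below; the member addresses are placeholders, and the web port is assumed to be 8080:

```
# allow the web port only from known member addresses
iptables -A INPUT -p tcp --dport 8080 -s 203.0.113.5 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -s 198.51.100.7 -j ACCEPT
# silently drop everyone else's probes -- no response at all
iptables -A INPUT -p tcp --dport 8080 -j DROP
```

Using DROP rather than REJECT is what makes the port invisible: a scanner gets no answer at all instead of an active refusal, which matches the original goal.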
LVL 15

Expert Comment

ID: 6935851
Yes, you can do it in Apache also.

Add the following directives in the global server config (the addresses here are placeholders for your members' IPs):

    <Location />
        Order deny,allow
        Deny from all
        Allow from 192.168.1.10
        Allow from 192.168.1.11
    </Location>

Hope this gives you another alternative.  But the IPs which are denied will still be able to connect to your Apache; they will just get a 403 Forbidden message.

