Welcome to Experts Exchange
Solved

invisible apache

Posted on 2002-04-08
490 Views
Last Modified: 2012-06-27
I'm running Apache on a firewall box (I know, bad, bad, but I don't have another machine) and so to be as paranoid as possible, I'd like it to NOT send out 404 messages.  If someone probes my box with a generic port 80 request for say, index.html (which does not exist) instead of the exact name of one of my html files, Apache should not respond in any way.  I've spent some time digging through the httpd.conf file, but the closest I can find is an option to make your own message rather than Apache's generic 404 message.  I'd like no message at all.
Question by:magarity
8 Comments
 
LVL 2

Expert Comment

by:prokni
ID: 6926270
Do you want to redirect to another page based on that error?
If so, here is a solution:

ErrorDocument 404 http://www.domain.com/page.html   (whatever page you want)

Other than that, I don't know.
Sorry -- you are probably not looking for this solution.

 
LVL 15

Expert Comment

by:samri
ID: 6927256
Another alternative is to use mod_rewrite to redirect the failing request to another webserver.

http://httpd.apache.org/docs/misc/rewriteguide.html
look under "Redirect Failing URLs To Other Webserver" section.


--- exact copy from that section;

Description:
A typical FAQ about URL rewriting is how to redirect failing requests on webserver A to webserver B. Usually this is done via ErrorDocument CGI-scripts in Perl, but there is also a mod_rewrite solution. But notice that this is less performant than using an ErrorDocument CGI-script!

Solution:
The first solution has the best performance but less flexibility and is less error safe:

RewriteEngine on
RewriteCond   /your/docroot/%{REQUEST_FILENAME} !-f
RewriteRule   ^(.+)                             http://webserverB.dom/$1

The problem here is that this will only work for pages inside the DocumentRoot. While you can add more Conditions (for instance to also handle homedirs, etc.) there is a better variant:

RewriteEngine on
RewriteCond   %{REQUEST_URI} !-U
RewriteRule   ^(.+)          http://webserverB.dom/$1

This uses the URL look-ahead feature of mod_rewrite. The result is that this will work for all types of URLs and is a safe way. But it does a performance impact on the webserver, because for every request there is one more internal subrequest. So, if your webserver runs on a powerful CPU, use this one. If it is a slow machine, use the first approach or better an ErrorDocument CGI-script.

I would think the ErrorDocument handler is much simpler.  But if you want more control over where the diversion goes, mod_rewrite would be good.
 
LVL 13

Author Comment

by:magarity
ID: 6929167
I tried redirecting it to both /dev/null and 127.0.0.1, but neither worked; I still get an error message about the page being missing rather than 'can't find server'.

 
LVL 15

Expert Comment

by:samri
ID: 6930255
I presume you adopted the ErrorDocument approach.

The directive will take an HTML document, a CGI script, or anything that is understood by the web client.  That is, the data returned must still be valid web content, following the standard.

At this point I'm a little confused about what you are trying to achieve.  Do you have a legitimate website hosted on that machine that you want the public internet to be able to reach, which you want to block from robots?

Or do you not want people to probe your port 80 at all?

If you happen to have any web hosting running behind the firewall, and the firewall is opening port 80, then the protection method you are looking for really depends on how the firewall "secures" the port.  Methods I can think of now are:

(1) Relay TCP/80 to some internal server.
 - The direct datastream from the remote host is allowed through to your internal server, only on port 80. OR
(2) Proxy the service for the internal server.
 - The firewall does the "fetching" and sends the pages/data back to the remote client.
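As a sketch of method (2), the firewall box's Apache could proxy for the internal server with mod_proxy (assuming mod_proxy is compiled in; "internal.lan" is a placeholder for your real internal hostname):

```
# Hypothetical sketch -- firewall Apache fetches pages on behalf of clients.
ProxyRequests Off
ProxyPass        /  http://internal.lan/
ProxyPassReverse /  http://internal.lan/
```

The remote client only ever talks to the firewall; the internal server's address never appears on the wire.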

Please elaborate
 
LVL 13

Author Comment

by:magarity
ID: 6930418
"Do you have a legitimate website hosted on that machine that you want public internet to be able to reach?"

Yes and no.  It's just for my investment club.  The general public doesn't need to know what my friends and I are doing with our money, or how much.

"Which you want to block from robots?"

Robots, spiders, and mainly script kiddies trying to hijack the box to use for DDoS fun.

"(1) Relay TCP/80 to some internal server."

The only other machine is a power-hungry TBird.  I'm hoping to use the low-power firewall box.  It is already set to a funky port, not 80, but a general scan by nmap shows the port open and accepting HTTP requests.  THAT is what I don't want; someone's random nmap scan should only show the ssh port, which is more secure than a web server port.

"(2) Proxy the service for the internal server."

Hmm, I'm not sure what you mean by this, never mind how to do it.

Thanks for the suggestions so far.
 
LVL 15

Accepted Solution

by:
samri earned 50 total points
ID: 6930502
Personally: if you run a web server, it will bind to certain TCP ports, i.e. 80 by default for HTTP.  If you are using a different port, 8000 for example, it will still be visible if people do port scanning.

One approach to control such access is to use a firewall (like what you are looking at), or a TCP wrapper that only accepts connections from predefined hosts.  Depending on make/model/software, this can be achieved.  Some firewalls can even do prediction on attacks: they observe the requests, and if the activity looks like port scanning, they block any further incoming traffic from the originating address.
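As a sketch of the TCP-wrapper idea: if the daemon were run under tcpd (Apache 1.3's ServerType inetd makes this possible, though standalone is far more common), the wrapper files might look like this -- the addresses are placeholders for your real allowed hosts:

```
# /etc/hosts.allow -- permit only known hosts
httpd: 1.2.3.4 10.2.3.

# /etc/hosts.deny -- refuse everyone else
httpd: ALL
```

For a standalone Apache, the equivalent control is mod_access (Order/Deny/Allow) or a packet filter in front of it.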

Back to the initial query: ErrorDocument can take one of four actions.

There are detailed docs on Apache's website:
http://httpd.apache.org/docs-2.0/mod/core.html#errordocument
--extract.
In the event of a problem or error, Apache can be configured to do one of four things,

1. output a simple hardcoded error message
2. output a customized message
3. redirect to a local URL-path to handle the problem/error
4. redirect to an external URL to handle the problem/error

In other words, redirecting to /dev/null or 127.0.0.1 will not work.

All four options available will still expose the machine to a remote scanner as an HTTP server, since whatever the webserver returns will be a uniform 501 or 404 error.

If you are really paranoid (as you mentioned), try looking at a TCP wrapper or a firewall solution.


Hope this sheds some light (or am I totally off the subject).

 
LVL 13

Author Comment

by:magarity
ID: 6935087
Doesn't look like there's an Apache way to do it, so I'm accepting this part of samri's comment as the answer, because that's what I'll end up doing:

"use a firewall ... that only accepts connections from predefined hosts"

I was afraid this would end up being the answer before asking; it's a pain to figure out all the club members' IPs and enter them in the iptables script.
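The iptables script in question might look something like this -- the member addresses and the port are placeholders for the real values:

```
# Hypothetical sketch -- allow each club member in, silently drop everyone else.
iptables -A INPUT -p tcp -s 1.2.3.4 --dport 8080 -j ACCEPT
iptables -A INPUT -p tcp -s 5.6.7.8 --dport 8080 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -j DROP
```

Note that DROP (rather than REJECT) sends no response at all, so a scanner sees the port as filtered rather than open -- exactly the "no message at all" behavior the question asks for.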
 
LVL 15

Expert Comment

by:samri
ID: 6935851
Yes, you can do it in Apache also.

Add the following directive to the global server config:

    <Location />
        Order deny,allow
        Deny from all  
  #      Allow from 164.137.139.219 164.137.158.119
        Allow from 1.2.3.4 10.2.3.0/255.255.255.0
    </Location>

Hope this gives you another alternative.  Note, though, that the IPs which are denied will still be able to connect to your Apache, but will get a 403 Forbidden message.

cheers
