Solved

Apache 404 and .htaccess files

Posted on 2007-03-28
1,568 Views
Last Modified: 2008-02-26
Hello All,

We have a bizarre problem which I hope somebody can shed some light on.  We have a content management system which we built, which uses Apache's ErrorDocument directive and a .htaccess file to handle the 404 errors produced by Apache.  Basically, when a page / resource is not found, a script runs and asks the user whether they want to create it.

Now the problem is that I believe the 404 errors are still produced by Apache, so our rankings on Google are going down the tubes.

My question is: if we use the AccessFileName directive, will Apache still produce a 404 error even though we tell it to run another file?
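
For reference, the general shape of what we mean is below; the handler name is just a placeholder, not our actual file.

# .htaccess (illustrative sketch - create_page.php is a placeholder name)
# when a request 404s, Apache runs this script instead of the stock error page
ErrorDocument 404 /create_page.php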
Question by: seanostephens
4 Comments
 
LVL 51

Expert Comment

by: Steve Bink
ID: 18819697
AFAIK, the 404 would still be generated if you are allowing Apache to determine whether the file exists.  What is the URL flow for a non-existent page being requested from your server?  For example, mine goes through mod_rewrite and sends all traffic to a PHP handler... the PHP file determines where to send the user from there.

If you are simply replacing the standard 404 error page with one of your own designed to do work, that should not prevent the 404 from being generated...in fact, I would think that strategy would DEPEND on it being generated.  

Finally, you can download Firefox and an extension called "Tamper Data".  This will allow you to see what headers your web server is generating in real time.
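
If you have shell access, a quick command-line check works as well; something along these lines (the URL is only an example) prints the status line and headers Apache actually sends for a missing page:

curl -I http://www.example.com/this-page-does-not-exist.html

The first line of the output should read "HTTP/1.1 404 Not Found" for a genuine 404; a 200 or a 302 there means the status is being rewritten somewhere.  One related detail: ErrorDocument only preserves the 404 status when it points at a local path.  If you give it a full http:// URL, Apache answers with a redirect instead, and the client never sees the 404 at all.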
 
LVL 15

Expert Comment

by: m1tk4
ID: 19705154
No comment has been added to this question in more than 21 days, so it is now classified as abandoned.

I will leave the following recommendation for this question in the Cleanup Zone:

  Accept: routinet {http:#18819697}

Any objections should be posted here in the next 4 days. After that time, the question will be closed.

m1tk4
Experts Exchange Cleanup Volunteer
 
LVL 51

Accepted Solution

by: Steve Bink (earned 2000 total points)
ID: 19719231
Current follow-up:

As of my last writing, my strategy was to prevent a 404 from being generated at all.  This was because none of the pages on my site actually exist - they are dynamically generated by PHP/MySQL each time they are called - and I did not want a genuine page to show up with a 404 header.  Since then, I've rewritten the code to produce a 404 once the handler determines there is no page to build.  Below you'll find the relevant snippets from my server conf files and loader code.  

# From httpd.conf
# intercepts file requests that do not actually exist
# this is a rather slow method involving a sub-request
# using an ErrorDocument for 404 will likely be much quicker
# rewrite for db_build_loader.php
RewriteCond %{LA-U:REQUEST_FILENAME} !-f
RewriteCond %{SCRIPT_FILENAME} !^/db_build_loader.php$
RewriteCond %{QUERY_STRING} !^.*loadreq.*$
RewriteRule ^/(.+)\.(html?|php)$ /db_build_loader.php?loadreq=%{REQUEST_FILENAME} [QSA,NC,NS,L]

<?php
// inside db_build_loader.php
// mod_rewrite lands here if the page does not exist
// if parse instructions are not in db, this is a real 404
if (!($pagerow = $result->fetch_assoc())) {
      error_log("Could not locate $path");
      error_log(print_r($_SERVER, 1));
      // send the 404 status and serve the error page directly;
      // redirecting with a Location header would replace the 404 with a 302
      header("HTTP/1.0 404 Not Found");
      include $_SERVER['DOCUMENT_ROOT'] . '/404.php';
      die();
}
?>
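
For comparison, the ErrorDocument route mentioned in the comments of the conf snippet above would look roughly like the following.  Treat it as a sketch rather than a drop-in replacement; the handler name is reused from above, and the server variables shown are the ones Apache sets when it runs a script as an ErrorDocument.

# httpd.conf or .htaccess - sketch of the ErrorDocument alternative
# Apache keeps the 404 status as long as the target is a local path
ErrorDocument 404 /db_build_loader.php

<?php
// inside db_build_loader.php, when invoked as an ErrorDocument:
// Apache exposes the originally requested URI in REDIRECT_URL
// and the original status code in REDIRECT_STATUS
$requested = isset($_SERVER['REDIRECT_URL']) ? $_SERVER['REDIRECT_URL'] : '';
// ...look the page up in the database as before and build it if found;
// otherwise render the "page not found / create it?" screen.
// No extra header() call is normally needed - the 404 status is kept.
?>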
