The requested URL /ghjkl was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
Topics: Apache Web Server, Linux, Linux Distributions
Last Comment: Ess Kay, 8/22/2022 (Mon)
Dave Baldwin
No, that's not going to happen. One web site is not allowed to access content from another one. It's a security restriction. While a programming language like PHP may be allowed greater access under some circumstances, I don't think it is going to happen the way you want. And the '/' means you should access the file in the root directory of the current site.
Jan Bacher
I would update the page for the 404 and configure it globally.
gr8gonzo
You could use hard links to make all of the websites use a central 404 file. That way, when you update the file, all the other ones get updated, too. That requires the file system to support it, though.
You might also need to specify in the Apache config that links are allowed.
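For anyone unfamiliar with them: a hard link is just a second directory entry pointing at the same inode, so editing the file through either path updates both. A minimal sketch using throwaway paths (not the real web root):

```shell
# Scratch area standing in for the web root (hypothetical paths).
tmp=$(mktemp -d)
mkdir -p "$tmp/errdocs" "$tmp/example.com"
echo '<h1>Not Found</h1>' > "$tmp/errdocs/404.html"

# Hard-link the central 404 into the site folder.
ln "$tmp/errdocs/404.html" "$tmp/example.com/404.html"

# Rewriting the central copy in place (same inode) updates every link.
echo '<h1>Page Not Found</h1>' > "$tmp/errdocs/404.html"
cat "$tmp/example.com/404.html"
```

One caveat: tools that save by writing a new file and renaming it over the old one (many editors do) break the link, because the new file is a new inode; `>` redirection or `cp` onto the existing file keeps it.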
Not sure what you mean. I changed the config in the server properties, which should have access to the web tree, as opposed to the .htaccess for a particular web folder.
@gonzo, I currently have over 50 domains, not counting subdomains; I don't want to add each one individually.
@jan, I think that's what I have, and it's not working.
Ess Kay
ASKER
except it's not /err/. I wrote that here for ease of use.
The httpd.conf has the following:
ErrorDocument 404 /../errdocs/error404.html
So first create /home/www/errdocs/404.html and then:
cd /home/www
for i in *.*/; do ln /home/www/errdocs/404.html "${i}404.html"; done
That loops through each folder in /home/www with a period in its name (the domain folders) and creates the hard link in each one.
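Roughly what that looks like end to end, run against a throwaway directory instead of the real /home/www (domain names here are made up):

```shell
# Build a fake web root with two domain folders (hypothetical names).
tmp=$(mktemp -d)
mkdir -p "$tmp/errdocs" "$tmp/example.com" "$tmp/example.org"
echo '<h1>404</h1>' > "$tmp/errdocs/404.html"

# The same idea as the loop above: hard-link the central 404
# into every folder whose name contains a dot.
cd "$tmp"
for i in *.*/; do ln "$tmp/errdocs/404.html" "${i}404.html"; done

# Sanity check: files with more than one link are the shared copies.
find "$tmp" -maxdepth 2 -name 404.html -links +1
```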
Dave Baldwin
Your statement means to me that you defined it as a separate site in /etc/apache2/sites-available/. If you did, then access from the other sites will be blocked. If, on the other hand, you define it as a file in httpd.conf (or apache2.conf, depending on your distribution), you may be able to define a single file for the 404 error message for all sites. Note that if you do that, ErrorDocument in .htaccess can override that setting.
PS: Put the file in the 'www' directory, not in one of the site directories.
Jan Bacher
I don't know why you're working so hard on this.
/error should already be aliased in the default httpd.conf.
All you should have to do is put your 404 document in that directory, uncomment the line, and optionally change the filename. If you want to use a different directory, add it, alias it, configure it in httpd.conf, and reload httpd.
I would never consider dumping a link in every virtual web root.
Actually... if you remove the ErrorDocument 404 from all of the htaccess files, you would then see the default Apache 404 error message. Plain text but adequate.
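A sketch of that alias approach. It writes the snippet to a throwaway file here rather than the live httpd.conf; the /var/www/errdocs path is an assumption, so adjust it to your layout, put the directives in the real config, and reload Apache:

```shell
# Write the global alias + ErrorDocument snippet (demo file, not live config).
conf=$(mktemp)
cat > "$conf" <<'EOF'
# Map /errdocs/ on every vhost to one real directory outside the web roots.
Alias /errdocs/ "/var/www/errdocs/"
<Directory "/var/www/errdocs">
    Require all granted
</Directory>

# Serve the shared page for 404s on all sites (.htaccess can still override).
ErrorDocument 404 /errdocs/404.html
EOF
cat "$conf"
```

(`Require all granted` is Apache 2.4 syntax; 2.2 used `Order allow,deny` plus `Allow from all`.)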
gr8gonzo
The reason I suggested a link over the error-folder alias is that I've seen several clients end up creating their own /error folder and then get puzzled as to why it isn't being used (the alias takes precedence, but that's not visible in the filesystem).
To me, it's all a dozen ways to skin the same cat - if one works best for him, then great.
Ess Kay
ASKER
@gonzo, I'm trying to avoid using a loop, because I'd have to add every future subdomain and site manually or rerun the loop.
I want a kind of 'catch-all', so that regardless of the domain/subdomain etc. it uses one error page, unless the .htaccess specifies otherwise.
@dave, most of the domain folders are empty; no .htaccess, or any files inside them.
Also, the httpd config is here: /etc/apache2/sites-available/server
All sites are on this server; it controls the HTTP port.
The reason I have this is that I just moved my shared hosting to a VPS, so this is relatively new to me.
Also, all the data was wiped. Until I add actual websites to them, I would like the 404 to be displayed.
The 404 should be a PHP file, which gets the web address location and spits out a message based on the top-level domain name:
<html><body>
<br>Welcome to <?php echo $_SERVER["SERVER_NAME"]; ?>!
<br>You have reached <?php echo "http://{$_SERVER['HTTP_HOST']}{$_SERVER['REQUEST_URI']}"; ?> in error.
<br>We are under construction.
<br>Try back soon.
</body></html>
Well, I would hope that you'd have a template that you use for creating the structure of new sites, and the link could be included in it. If not, then the next best way would be to use a directory alias in the config, as Jan mentions.
gr8gonzo
As a side note, I would caution you against using a PHP page (or any document that requires the server to spin up a large, separate module) for your 404 document. One site will get tons of random attacks - bots that are basically doing the spray-and-pray approach to hacking and trying to access common locations for vulnerable scripts. This can result in a lot of 404 codes for one site.
This impact is multiplied by each additional site you host, and you probably don't have much memory if you're on a VPS.
Even if the script itself is simple, Apache has to spin up a child process that loads the entire PHP interpreter into memory in order to spit out that simple page. If you use the common PHP build (the popular modules), then that interpreter is going to be fairly large. A single shotgun attack by a bot is going to generate dozens of 404s at the same time, meaning Apache is going to have to spin up dozens of instances of PHP to handle all the 404s, which is likely to have a decent impact on server resources (which again, are likely limited if you're on a VPS).
You can negate some of this by using PHP in FastCGI mode, but that's not the common way to set it up (mostly because people who write server config guides are lazy and take the easy approach of using mod_php with Apache).
While a custom 404 page might look a little more appealing than the default page, if it doesn't really serve any functional purpose, then you're just eating up valuable resources. The end user can see the HTTP_HOST and REQUEST_URI in their address bar, so that information is not adding any info that the end user doesn't already have.
You could use PHP to pre-generate static HTML files containing the SERVER_NAME and then use that HTML file for the 404 without the similar drain (all custom 404s use extra resources), but I'd highly recommend against using PHP for one. You're just opening yourself up to problems and a higher risk of DDOS attacks (just slam non-existent URLs enough times and you'll eventually tip the server over from resource consumption).
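The pre-generation idea in plain shell, assuming a layout with one folder per domain (run against a scratch directory here; the domain names and template text are made up):

```shell
# Scratch web root with per-domain folders (hypothetical names).
tmp=$(mktemp -d)
mkdir -p "$tmp/example.com" "$tmp/example.org"

# Bake a static 404 per domain once, instead of running PHP per request.
cd "$tmp"
for d in *.*/; do
    name=${d%/}   # strip trailing slash to get the domain name
    cat > "${d}404.html" <<EOF
<html><body>
<br>Welcome to ${name}!
<br>We are under construction.
<br>Try back soon.
</body></html>
EOF
done

# Show which files got the example.com banner.
grep -l 'Welcome to example.com' "$tmp"/*/404.html
```

Rerun the script whenever you add domains (or from a cron job), and the 404s stay static files that Apache can serve without touching PHP.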
Ess Kay
ASKER
Thanks gonzo, can we try that in English?
Once again, I am used to being spoonfed the hosting, so I don't know what these terms mean, nor how to go about doing them, as I have recently moved from a control panel to controlling the server.
As for your second comment,
I'm not worried about the hacking attempts at the moment, as the sites are not large yet.
By the time they get large, we will have moved on to better hosting.
So the first comment:
You typically have some kind of process you follow when you're setting up new sites. Many hosts will want to use the same folder structure for all new sites (e.g. everyone has a public_html folder, an ftp folder, or whatever). So instead of manually creating that folder structure every time, they usually have a folder structure that acts as a template. This might look like:
/www/foobar.com
    /public_html
    /ftp
    ...etc...
When you set up a new site, you simply copy that template folder structure and give it the new site's name, like foobar.com above.
So if the link is part of that folder structure that you're copying from, then it'll get copied, too, meaning that any new sites you create will automatically have that link.
That said, I was suggesting that if you don't set up new sites by copying over a template folder structure, then the next best thing would be to use the approach that Jan has been suggesting in #40645577, which uses an aliased folder. Basically, when there's a request for /xyz/404.html, the server won't look for a real folder called /xyz, but will instead look in another place (that you define in configuration) for 404.html.
My second comment:
As part of your transition away from spoonfed hosting, you need to dump that mindset of "well, we're not large enough, so that's not a problem and we'll deal with it later." That thinking will make your life a lot more complicated later on. The hardest performance problems are the ones that are the sum of several smaller problems that you forgot about, and now are far more complex to fix because fixing them could introduce the risk of breaking things.
Also, a lot of the random hacking attempts don't limit themselves to large targets - they will simply spam domains with attacks, so it's equal-opportunity hacking. In fact, it's often easier to target SMALLER sites specifically because so many people have that mindset and they leave their platforms open to attacks.
Don't rely on "better hosting" to ever fix problems. Problems are problems, and they will still be problems on "better hosting" and they will eat into the value of what you end up paying for that better hosting.
Obviously, you can do whatever you want, but I hope you will put some stock into what I'm saying and really take a good look at what you're gaining versus the resource usage and the risk you're introducing.
Dave Baldwin
I agree with @gr8gonzo. I had some spammer POST 38,000+ times to one of my pages last month that I didn't adequately protect. They didn't break in because the rest of the code was good. I had to delete all those 38,000+ rows from the database and fix the page code to reject the kind of entries they were making.
Ess Kay
ASKER
I've re-read the answers.
Jan seems to have hit it. The issue: I was not using /var/www/err as the link target.
I was using /home/www/err, which was giving me an error.
Once I set up an alias, err --> /var/www/err, everything worked great.
Thanks for the help, guys.
Also, special thanks to gonzo for the tangent answer on web security and the fundamentals of server-side PHP page generation.