
virtual host 403 Forbidden error

I am using Apache 2.0.52 on Fedora 3. I have set up a virtual host with the following configuration:

Listen 80
Listen 8080
NameVirtualHost 172.30.3.9:8080

<VirtualHost 172.30.3.9:8080>
    DocumentRoot /www/enliven.com/html
    ServerName ww2.enliven.com
    <Directory "/www/enliven.com/html">
        Allow from all
        Options +Indexes
    </Directory>
    Options Indexes
    ServerPath /www/enliven.com/html
    ServerAdmin consult@enliven.com
    DirectoryIndex index.htm index.html one.htm
</VirtualHost>

I have disabled SELinux, set the permissions on /www/enliven.com/html to 644, and chowned the directory to apache, and I still get a 403 Forbidden error in my browser when I try to access index.htm or index.html.

If I put the website files in /var/www/html instead, my browser can access the pages without any problem, even though the file permissions are still 644 and the files are owned by root rather than apache.

Any idea how to resolve this problem?
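
(One thing worth checking, using the paths from the config above: Apache needs the execute/search bit on every directory leading down to the DocumentRoot in order to traverse it, so chmod 644 on the html directory itself will return 403 even when the files inside are readable. A minimal check and fix, assuming those paths:)

  # show the permission bits on each directory along the path
  ls -ld /www /www/enliven.com /www/enliven.com/html

  # directories need the execute (search) bit to be traversed; 755 is one common choice
  # (the comments below argue for tighter settings)
  chmod 755 /www /www/enliven.com /www/enliven.com/html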



Asked by ghgan
1 Solution
 
yuzhCommented:
It is a dir/file permission problem. You can set the dir/file permissions to 750 as long as the apache user has read permission; e.g., if the apache user belongs to the nobody group, put all the dirs/files in the nobody group.

see my answer in http:Q_21217643.html
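
(A minimal sketch of that suggestion, assuming the paths from the question and that httpd runs as user apache / group apache, the Fedora default; substitute nobody if that is what your httpd.conf says:)

  # put the tree in the web server's group and grant the group read/traverse access
  chgrp -R apache /www/enliven.com/html
  # 750 on plain files also sets the execute bit; 640 would do for static pages
  chmod -R 750 /www/enliven.com/html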
 
ahoffmannCommented:
750? grrr (sorry, have to disagree with that)
Permissions should be 550 or 500 for directories and 440 or 400 for files; only executable files (scripts) need 550 or 500 as well.

ghgan, please post the error messages you get in error_log
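
(For reference, on a stock Fedora install the log usually lives under /var/log/httpd; adjust if your ErrorLog directive points elsewhere:)

  # watch the error log while reproducing the 403 in the browser
  tail -f /var/log/httpd/error_log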
 
yuzhCommented:
Hi ahoffmann,

    Not sure why you use 550 or 500; a website should allow files and dirs to be updated.
    I always use 750 + "chmod g+s" (set to the nobody group) for my webservers. I'm running an online-course webserver, and the lecturers/professors have to take care of their own course materials.
    The dirs/files can be owned by different users, but the group is always set to "nobody".
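
(A sketch of that scheme, with the group name taken from the comment above and the path assumed from the question:)

  # each maintainer owns their own files; httpd, running in group nobody, gets read access
  chgrp -R nobody /www/enliven.com/html
  # g+s on directories so newly created files inherit the nobody group
  find /www/enliven.com/html -type d -exec chmod 2750 {} \;
  find /www/enliven.com/html -type f -exec chmod 750 {} \;   # 640 would do for plain pages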

 
ahoffmannCommented:
yuzh, a website should never have write permissions; there are too many vulnerabilities out in the wild.
Sorry, I'm security paranoid ;-)

In detail, I'd recommend the following settings:

# webserver with virtual hosts (probably each its own website):
  chown -R root:agroup documentroot
  find documentroot -type d -exec chmod 4550 {} \;
  find documentroot -type f -exec chmod 4440 {} \;
httpd runs as a user which is in group "agroup"

# one webserver (process) per website:
  chown -R user1:group1 documentroot
  find documentroot -type d -exec chmod 4550 {} \;
  find documentroot -type f -exec chmod 4400 {} \;
httpd runs as user1, and group1 contains only user1

(similar settings for cgi-bin)

If someone needs to update the site with ftp, I'd recommend changing the hoster! ftp is obsolete ;-)

Hope this helps the questioner too.
 
periwinkleCommented:
ahoffmann - no offense, but if they aren't going to use FTP, what are you suggesting they use instead?
 
ahoffmannCommented:
scp, sftp, rsync -e ssh, scponly
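
(For example, a typical push with rsync over ssh; host, user, and paths here are placeholders:)

  # mirror the local site tree to the server; --delete removes files that were dropped locally
  rsync -avz --delete -e ssh ./html/ user@ww2.enliven.com:/www/enliven.com/html/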
 
periwinkleCommented:
Outside of sftp (which is really just next-generation FTP), I would suggest that most clients wouldn't know about those other technologies. I agree, though, that sftp should be used in preference to ftp.
 
periwinkleCommented:
To clarify my statement above - sftp is 'secure ftp', or ftp via ssh technology, which is the next generation of ftp (much in the way that ssh has replaced telnet).
 
ahoffmannCommented:
<off topic>
sftp is not really "secure ftp" but more a workaround for using scp with plain old ftp syntax (for those people never willing to let go of insecure ftp ;-)
</off topic>
 
yuzhCommented:
Back to comment http:#13950911: sometimes it is not practical to make the site READ-only, unless only one person deals with website updates and that person looks after the webserver FULL-TIME.

Whatever tool (sftp, ftp) you use to update the website, the files/dirs need WRITE permission for the job to get done.

E.g., on my webservers we have a highly configurable firewall; the webserver only has http and https open to the world. We do allow ssh/sftp (port 22) connections inside the LAN, and some dirs on my website are only accessible from certain IP ranges inside my network. System users (root, adm, sys, bin, ....) cannot do ftp at all. I also have a script running in the background to monitor user logins: if a user fails to type in his passwd 3 times, the connection is dropped and I get an email notice.

The online course content changes almost daily with the teaching staff; not sure how to handle that if you make it READ-only.
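
(For illustration only, a rough sketch of that kind of monitor, assuming sshd logs to /var/log/secure as on Fedora; the threshold and mail address are placeholders:)

  # mail a notice for any account with 3 or more failed ssh password attempts
  fails=$(grep 'Failed password' /var/log/secure | awk '{print $(NF-5)}' | sort | uniq -c | awk '$1 >= 3')
  [ -n "$fails" ] && echo "$fails" | mail -s "repeated ssh login failures" admin@example.com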



 
ahoffmannCommented:
I agree that there are situations where read-only is too hard, but then you need to know how to monitor ;-)

Also, I made a mistake in my comment http:#13950911:
permissions must not be 4xxx but 2xxx.
Sorry
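
(With that correction, one reading of the virtual-host variant from the earlier comment; the setgid bit is what matters on the directories, and plain files are usually left without it:)

  # setgid (2xxx) on directories so new files inherit the group; not setuid (4xxx)
  chown -R root:agroup documentroot
  find documentroot -type d -exec chmod 2550 {} \;
  find documentroot -type f -exec chmod 440 {} \;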
