File permissions security questions

I am planning to set up a spider trap that will automatically add the IP addresses of bad robots to my .htaccess file. The problem is that in order to achieve that, I would need to make my .htaccess file world-writable (together with another directory, which will usually stay empty). Naturally, I don't feel comfortable with that. The .htaccess file is already made non-readable for web browsers via httpd.conf (deny from all), but I assume it would still be an unsafe arrangement. Any ideas how I can make this safer without recompiling Apache?
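For reference, what I mean is the usual httpd.conf stanza for hiding .ht* files from browsers, roughly along these lines (a sketch; the exact directives in my config may differ):

# Prevent .htaccess / .htpasswd files from ever being served to clients
<Files ~ "^\.ht">
    Order allow,deny
    Deny from all
</Files>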
yosmc Asked:
 
Mysidia Commented:
Instead of having a web script edit your .htaccess file directly, have the script add the offending host to a separate "data file" that contains only a list of the hostnames that were detected, and possibly timestamps.

Then set up a cron job that periodically verifies the integrity of that "bad hosts" file and, if its timestamp is newer than that of the .htaccess file, updates the .htaccess file to ban the new hosts.

There are other ways as well, such as Unix sockets and other IPC mechanisms.
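A minimal sketch of the web-script half of that idea, assuming the trap is a Perl CGI and that the data file lives at /home/blah/new_badips.dat (the hypothetical path also used by the cron script later in this thread). The trap only appends the visitor's address, so it never needs write access to .htaccess, and the data file's own timestamp tells the cron job whether there is anything new:

#!/usr/bin/perl
# Hypothetical spider-trap CGI: record the visitor's IP and nothing else.
use strict;
use warnings;
use Fcntl qw(:flock);

my $new_badip_file = '/home/blah/new_badips.dat';   # writable by the web user only
my $ip = $ENV{REMOTE_ADDR} || 'unknown';

open(my $fh, '>>', $new_badip_file) or die "Unable to open $new_badip_file: $!";
flock($fh, LOCK_EX) or die "Unable to lock $new_badip_file: $!";   # avoid interleaved writes
print $fh "$ip\n";
close($fh);

# Send the robot something harmless back.
print "Content-type: text/plain\n\n";
print "Go away.\n";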
 
Mysidia Commented:
Another alternative is to run the scripts setuid... or to use Apache suEXEC to run CGI scripts, together with the 'User' directive in the server's httpd.conf, so that the scripts for certain hosts run as a different user.
See http://httpd.apache.org/docs/suexec.html
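A rough sketch of what that can look like in httpd.conf, per the Apache 1.3 suEXEC docs linked above (the host name and the trapuser/trapgroup names are made up here, and Apache 2.x uses the SuexecUserGroup directive instead):

<VirtualHost *:80>
    ServerName example.com
    # With suEXEC enabled, CGI scripts for this virtual host run as this
    # user/group rather than as the main Apache user, so they can own the
    # bad-hosts data file without anything being world-writable.
    User  trapuser
    Group trapgroup
</VirtualHost>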
 
jlevie Commented:
Have the "spider trap" drop the bad IPs into a file and run a job from cron, say every 5 minutes, that picks up any newly found IPs and adds them to the .htaccess file. The cron job would run as the user that owns the .htaccess file, so that file wouldn't need to be world-writable.

Another approach would be to have the "spider trap" write the IPs to a FIFO that is read, say, by a small Perl daemon which modifies the .htaccess file.
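A minimal sketch of that FIFO idea, assuming a named pipe at the made-up path /home/blah/badip.fifo (created beforehand with mkfifo) and a .htaccess at /path/to/.htaccess. The daemon runs as the owner of the .htaccess file and appends a deny line for each IP it reads:

#!/usr/bin/perl
# Hypothetical FIFO listener: the web script writes one IP per line into the
# named pipe; this process (running as the .htaccess owner) appends deny rules.
use strict;
use warnings;

my $fifo     = '/home/blah/badip.fifo';   # create once with: mkfifo /home/blah/badip.fifo
my $htaccess = '/path/to/.htaccess';

while (1) {
    # Opening a FIFO read-only blocks until a writer shows up.
    open(my $pipe, '<', $fifo) or die "Unable to open $fifo: $!";
    while (my $line = <$pipe>) {
        chomp $line;
        next unless $line =~ /^([0-9]{1,3}\.){3}[0-9]{1,3}$/;   # plain IPv4 only
        open(my $ht, '>>', $htaccess) or die "Unable to open $htaccess: $!";
        print $ht "deny from $line\n";
        close($ht);
    }
    close($pipe);   # all writers closed; loop around and reopen
}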
 
ahoffmann Commented:
chmod 400 .htaccess

then write a script which does (pseudo code follows):
  chmod 600 .htaccess && echo "deny from IP" >> .htaccess; chmod 400 .htaccess

Run this script as the same user that Apache runs as, probably from cron as jlevie suggested.
 
xDamox Commented:
Why not make your script store the IPs somewhere and have a cron job add them to the .htaccess file? :) That way it will be more secure.
 
Mysidia Commented:
Few (if any) of these methods ensure that it is impossible for a third party to inject IP addresses into your ban list. If that is one of the attack models you need to secure against, then you need to use suEXEC, or a script setuid to a user others can't access, or something similar, in order to create privilege isolation between your web scripts and any CGI scripts run or created by other users.

(Or configure Apache so that only trusted users can deploy CGI scripts.)

Note that things like 'chmod 600 file && write && chmod 400 file' create a race condition: between the two chmods, any other process running as the same user can write to the file. And mode 400, as opposed to 600, doesn't provide any real security assurance anyway; it provides none against any attack model where the attacker can run arbitrary code as the Apache user, which anyone who can deploy their own CGI scripts on the system can.

As for a cron script reading the list of IPs collected by the web script, here's a possibility...

#!/usr/bin/perl
# Intended to be run periodically from cron (e.g. */5 * * * * /path/to/this/script),
# as the user that owns the .htaccess file.
use strict;
use warnings;

my %badips;

# New badips list, to be writable by the web script
my $new_badip_file = '/home/blah/new_badips.dat';

# Stored database of everything banned so far
my $old_badip_file = '/home/blah/banned_ips.dat';

read_badips($new_badip_file);
read_badips($old_badip_file);
write_badips($old_badip_file);

write_htaccess("/path/to/.htaccess.new");

sub write_htaccess
{
    my $file = shift;
    my $temp = $file . ".tmp$$";

    # Write to a temporary file and rename() it into place so the update
    # is atomic and Apache never sees a half-written file.
    open(HTACCESS, '>', $temp) or die "Unable to open $temp: $!";
    print HTACCESS "<Limit GET POST>\n";
    print HTACCESS "order deny,allow\n";

    for (keys %badips) {
        print HTACCESS "deny from $_\n";
    }

    print HTACCESS "</Limit>\n";

    close HTACCESS;
    rename $temp, $file or die "Unable to rename $temp to $file: $!";
}

sub read_badips
{
    my $file = shift;

    # A missing file just means nothing has been recorded there yet.
    open(IPS, '<', $file) or return;

    while (my $line = <IPS>)
    {
        chomp($line);
        # Accept plain dotted-quad IPv4 addresses only
        if ($line =~ /^([0-9]{1,3}\.){3}[0-9]{1,3}$/) {
            $badips{$line} = 1;
        }
    }
    close IPS;
}

sub write_badips
{
    my $out  = shift;
    my $temp = $out . ".tmp$$";
    open(IPS, '>', $temp) or die "Unable to open file $temp: $!";

    for (keys %badips) {
        print IPS "$_\n";
    }
    close IPS;
    rename $temp, $out or die "Unable to rename $temp to $out: $!";
}

 
yosmc (Author) Commented:
Just an update that I am still working on the problem (and thanks for the suggestions so far). I was a little reluctant to enable .htaccess files (I'm the only user and I normally put everything into httpd.conf, which makes things faster), so I was looking into Perl modules (Apache::BlockAgent, Apache::BlockIP), but unfortunately I couldn't get them to work.

Now I'm still looking into my options (maybe there are other, more current modules; I also saw an approach that uses MySQL, but should I waste database resources just to keep out the bots?), which is why I can't say yet which method for adding IPs will work best.