Apache mod_evasive seems not to be working

Does anyone have any experience with Apache mod_evasive? I've just installed it to try to mitigate DoS attacks, but it's not working as expected. My config is:
LoadModule evasive20_module lib64/httpd/modules/mod_evasive20.so

<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        2
    DOSSiteCount        100
    DOSPageInterval     1
    DOSSiteInterval     1
    DOSBlockingPeriod   10
    DOSEmailNotify      sysadmin@mydom.com
</IfModule>


As you can see, I've set DOSSiteCount to 100. I've had it at the default of 50, and tried 10 as well. No matter what I set it to, it seems to generate the blacklist message after very few accesses. For example, it just blacklisted an IP that appears in the access_log only 9 times, spread over a 4-second period (not the 1 second defined by DOSSiteInterval).

As I said, it seems to blacklist after about this number of accesses regardless of what I set DOSSiteCount to.
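
For reference, this is roughly how I'm counting hits per IP (a quick sketch; the log path is just where my install keeps it):

    # request count per client IP, busiest first
    awk '{print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head
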
Asked by jmarkfoley.
 
David Favor (Linux/LXD/WordPress/Hosting Savant) commented:
Dr. Klahn is right. Trying to block an attack with mod_evasive is a poor choice.

Here's how I do this with WordPress sites.

1) I only deploy client sites that can survive a 1,000,000-request load test. Here's an example of the output to look for...

h2load -ph2c -t16 -c16 -m16 -n1000000 https://foo.com/ | egrep -e ^finished -e ^status
finished in 56.77s, 17616.12 req/s, 815.28MB/s
status codes: 1000000 2xx, 0 3xx, 0 4xx, 0 5xx



I look for... well... first, the site should simply survive; second, PHP + MariaDB should show almost no load (meaning caching is working); third, the site should sustain roughly 10K-20K reqs/sec.

2) If you're running sustained traffic at the above levels (1M-ish reqs/minute), then it's best to tune Apache logging to record only non-200 responses (something wrong) plus actual pages (page/post slugs, including ?utm_source-tagged pages/posts).
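
A rough sketch of that kind of conditional logging (Apache 2.4 expression syntax; the asset extensions and log path are placeholders to adjust for your setup):

    # tag obvious static assets so successful requests for them can be skipped
    SetEnvIf Request_URI "\.(css|js|png|jpe?g|gif|ico|svg|woff2?)$" asset
    # log anything that is not a 200, plus every non-asset (page/post) request
    CustomLog "logs/access_log" combined "expr=(%{REQUEST_STATUS} -ne 200 || reqenv('asset') == '')"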

3) Write a fail2ban recipe to count pages per IP; if the pages-per-IP rate doesn't make sense for a human visitor, a bot is visiting, so block it for an hour or a day.

So for example, if the average visit is, say, 5 pages and the average on-page time is 30 seconds, then anyone requesting, say, 10 pages within (5 * 30) 150 seconds is likely a bot and can be blocked.
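
As a very rough sketch (file names, paths and thresholds are placeholders to tune for your own traffic), that rule can be expressed as a filter matching every page request plus a jail that turns "10 pages in 150 seconds" into findtime/maxretry:

    # /etc/fail2ban/filter.d/apache-pageflood.conf  (the name is arbitrary)
    [Definition]
    # count every GET/POST from a client; <HOST> is fail2ban's IP capture
    failregex = ^<HOST> -.*"(GET|POST) [^"]*"
    ignoreregex =

    # /etc/fail2ban/jail.local
    [apache-pageflood]
    enabled   = true
    filter    = apache-pageflood
    logpath   = /var/log/httpd/access_log
    # 10 hits within 150 seconds (5 pages * 30s) trips the ban
    findtime  = 150
    maxretry  = 10
    # drop the offender at the firewall for an hour
    bantime   = 3600
    banaction = iptables-multiport
    port      = http,https

Note this counts whatever lands in the access log, so it pairs with the logging trim in 2) above; otherwise static assets inflate the per-IP count.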

This isn't foolproof and can require a bit of tuning to get right.

Also, if you're driving paid traffic (like Outbrain and Taboola), you'll have to read their docs and whitelist their content-quality bots, which must be allowed to run (if you block these, traffic can cease).

Also whitelist Google's bots, by IP range rather than User-Agent, which can easily be forged.

This means all your DoS/DDoS mitigation occurs at the kernel level (iptables), so once a bot is identified and blocked, you have zero resource drain for whatever block interval you choose.
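
For the kernel-level side, one common pattern (a sketch; the set name and timeout here are arbitrary) is a single DROP rule pointing at an ipset, so bans expire on their own and never reach Apache:

    # one-time setup: a set of banned IPs with a default one-hour expiry
    ipset create banned hash:ip timeout 3600
    iptables -I INPUT -p tcp -m multiport --dports 80,443 -m set --match-set banned src -j DROP

    # banning an address (e.g. from a fail2ban action or your own script)
    ipset add banned 203.0.113.45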
 
Dr. Klahn (Principal Software Engineer) commented:
mod_evasive would not be my choice to mitigate DDoS attacks, as the server still handles every request that comes in.  Thus the load on the server is not decreased significantly.

The only reliable DDoS mitigation is to host with a company that provides external DDoS mitigation as a service, either built into their hosting or as an extra cost add-on.  Then when a DDoS attack is detected the server doesn't even see the unwanted requests and life goes on as normal.

You might also look into mod_spamhaus, mod_honeypot and mod_torcheck.   And if your audience is only North America, then banning South America, Africa and Asia via iptables or mod_cidrblock will cut all abusive accesses down significantly.
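
If you go the iptables route for that, a minimal sketch (assuming per-country CIDR lists such as the .zone files ipdeny.com publishes; the file names below are just examples) looks like:

    # load country CIDR blocks into one set and drop web traffic from them
    ipset create geoblock hash:net
    for f in cn.zone ru.zone; do
        while read -r net; do ipset add geoblock "$net"; done < "$f"
    done
    iptables -I INPUT -p tcp -m multiport --dports 80,443 -m set --match-set geoblock src -j DROP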
 
jmarkfoley (Author) commented:
This is a self-hosted business webpage, so external hosting is not an option. I do already have geoblocking for China and Russia through iptables.

I have a script that checks the access_log for mod_evasive entries and will add the offending IP to a block list, so httpd won't have to handle every request. My problem here is that mod_evasive does not seem to pay any attention to the DOSSiteCount setting. I see the blacklist message after a dozen or so entries in access_log regardless of what I set the DOSSiteCount value to.

I'll look at your suggested mod_spamhaus, mod_honeypot and mod_torcheck, but mod_evasive was given as a recommendation for what I want and, since I coded supplemental scripts around it already, I'd like to see if I'm doing something wrong in the implementation.
 
jmarkfoley (Author) commented:
You are both right. mod_evasive doesn't work as advertised. It doesn't seem to matter what I set DOSSiteCount or DOSSiteInterval to; it always generates a blacklist entry after about a dozen accesses spanning more than one second. So, that's out. I am trying a script at the moment.

Maybe this should be a new post, but here is what I do to block/limit attempts on port 22 in iptables. Would a similar idea work for port 80/443?
    # chain that logs, then drops
    /usr/sbin/iptables -N logdrop
    /usr/sbin/iptables -A logdrop -j LOG --log-level 6 --log-prefix "SSH Break-in attempt "
    /usr/sbin/iptables -A logdrop -j DROP

    # chain that remembers source IPs (recent module) and drops after 12 hits
    /usr/sbin/iptables -N checkcount
    /usr/sbin/iptables -A checkcount -m recent --set
    /usr/sbin/iptables -A checkcount -m recent --rcheck --hitcount 12 -j logdrop
    /usr/sbin/iptables -A checkcount -j RETURN

    # new SSH connections are counted, then rate-limited to 1/s with a burst of 3
    /usr/sbin/iptables -A INPUT -p tcp --syn --dport 22 -i eth0 -j checkcount
    /usr/sbin/iptables -A INPUT -p tcp --syn -m limit --limit 1/s --limit-burst 3 -i eth0 --dport 22 -j ACCEPT

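What I have in mind for 80/443 is roughly the same thing (an untested sketch; the counts are guesses and would need to be far more generous than for SSH, since a browser legitimately opens several connections per page):

    /usr/sbin/iptables -N webcheck
    /usr/sbin/iptables -A webcheck -m recent --name WEB --set
    # more than 20 new connections from one source within 5 seconds gets dropped
    # (20 is the xt_recent default cap on remembered hits; going higher needs the
    #  ip_pkt_list_tot module parameter)
    /usr/sbin/iptables -A webcheck -m recent --name WEB --rcheck --seconds 5 --hitcount 20 -j DROP
    /usr/sbin/iptables -A webcheck -j RETURN

    /usr/sbin/iptables -A INPUT -p tcp --syn -m multiport --dports 80,443 -i eth0 -j webcheck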

 
jmarkfoley (Author) commented:
Thanks to you both. I've added mod_limitipconn to cap simultaneous connections per IP at 10, and have implemented a script (similar to fail2ban) that checks whether the number of connections per 10 seconds is reasonable. That seems to be catching bots and should catch a DoS, though not immediately.
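
For the record, the limitipconn piece is just this (the module path follows my layout above; if I recall correctly the module also needs mod_status loaded with ExtendedStatus On):

    LoadModule limitipconn_module lib64/httpd/modules/mod_limitipconn.so

    <IfModule mod_limitipconn.c>
        <Location />
            # no more than 10 simultaneous connections from any single client IP
            MaxConnPerIP 10
        </Location>
    </IfModule>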