lenamtl (Canada) asked:

Need to block bots from a website

Hi,
One of my sites uses a lot of bandwidth because too many bots hit it every day.
For now I'm looking for other ways to stop this.
I have edited the .htaccess file to deny some IP addresses and some countries; a sketch of the kind of rules I mean is below.
The site will eventually be removed and rebuilt completely, because it uses old, unsafe code.
I'm wondering whether the bots will still come back even after the site is recreated.
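For reference, the deny rules look roughly like this (a minimal sketch assuming Apache 2.4; the addresses are documentation placeholders, not the real ranges involved):

    # .htaccess - deny requests from specific IP ranges (Apache 2.4+ syntax)
    # Placeholder addresses; substitute the ranges you actually want to block.
    <RequireAll>
        Require all granted
        Require not ip 192.0.2.0/24
        Require not ip 198.51.100.17
    </RequireAll>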

Any advice is welcome.
Thanks

ASKER CERTIFIED SOLUTION
torimar (Germany)
(Solution text not shown; available to Experts Exchange members only.)
PsiCop:
As torimar says, you can use a robots.txt file to influence (not control, but influence) "legitimate" bots and crawlers.

There is absolutely nothing that forces any bot/crawler to obey a robots.txt file. Think of it as asking someone to be polite. They can still be a jerk.
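For illustration, here is a minimal robots.txt, served from the site's document root, that asks every crawler to stay out; the second group shows how a single bot can be addressed by its user-agent string ("BadBot" is a placeholder name):

    # robots.txt - a request to crawlers, not an enforcement mechanism
    # Ask all crawlers to skip the entire site:
    User-agent: *
    Disallow: /

    # Or single out one crawler by its user-agent string (placeholder name):
    User-agent: BadBot
    Disallow: /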

Bots can come back even after a site is rewritten; chances are they will.

If you want to block specific IP subnets/countries, I'd do it at the firewall. If you don't want those IPs scanning your website, why let them waste your bandwidth (making all those HTTP requests) and CPU (parsing and applying .htaccess)? Drop the packets at the firewall and sooner or later they'll get the hint; in the meantime, you minimize the negative impact on your systems.
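For example, on a Linux firewall that could be a single iptables rule per offending range (a sketch; the subnet is a documentation placeholder, and blocking a whole country would mean loading that country's published IP ranges the same way):

    # Silently drop everything from a placeholder subnet before it reaches the web server
    iptables -A INPUT -s 192.0.2.0/24 -j DROP
    # One common way to persist the rules across reboots (Debian/Ubuntu with iptables-persistent)
    iptables-save > /etc/iptables/rules.v4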