Solved

What is the best way to counteract spiders, crawlers, and bots on our website?

Posted on 2006-07-13
Medium Priority
638 Views
Last Modified: 2010-04-11
Folks,

We're running Windows Small Business Server 2003, and we're having problems with various crawlers sucking up bandwidth (particularly Googlebot, MSNBot, and Yahoo's Inktomisearch).  What are the best ways to counteract their usage?

We've started blocking IP ranges, but that seems to help only a little, and I figure it's not a permanent solution anyways.

We've got robots.txt set properly as well as the Meta tags in the header of each page.

I've read about using traps like a 1 x 1 px transparent bitmap image link to another page that has redirects back into itself with like a 20-second delay.  Is this still a good solution, or have spiders been made smarter?  Any other ways to make bad bots pay for their crimes?
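
To be concrete, the trap I have in mind looks something like this (the paths are made up):

```html
<!-- Invisible 1x1 link a human would never click; /trap/ is a made-up
     path that robots.txt disallows, so polite bots skip it and only
     the rude ones fall in -->
<a href="/trap/index.html"><img src="/images/1x1.gif" width="1" height="1" border="0" alt=""></a>
```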

I'm not the main network person here, but I am his b----, so let me know if I can provide any more information.

--J
Question by:jammerms
8 Comments
 
LVL 9

Expert Comment

by:blandyuk
ID: 17104853
Are you running ASP pages? You could read the "User-Agent" header in the HTTP request. Most spiders include a link to a page describing their crawler in the User-Agent string, like Google's:

http://www.google.com/bot.html

It would look something like:

User-Agent: Mozilla/5.0 (compatible; MSIE 6.0; Windows NT 5.1, http://www.google.com/bot.html)

Once you have compiled a database of spiders, you can search for those strings in the header and call "Response.End()", saving the bandwidth.

Not an easy method, but at least you wouldn't have to worry about tracking down all the IP ranges they use, which I imagine is a lot!
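
As a rough sketch (classic ASP; the log file path is just an example), you could log every User-Agent to build up that database:

```asp
<%
' Sketch: append each visitor's User-Agent to a text file so you can
' review it later and pick out the spiders. Path is hypothetical.
Dim ua, fso, f
ua = Request.ServerVariables("HTTP_USER_AGENT")
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set f = fso.OpenTextFile(Server.MapPath("/logs/agents.txt"), 8, True) ' 8 = ForAppending
f.WriteLine Now & vbTab & ua
f.Close
%>
```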
 
LVL 1

Expert Comment

by:PugnaciousOne
ID: 17108044
Most spiders (not all) respect the robots.txt file as well.  You can create one to disallow specific bots. Here's an easy tool: http://www.mcanerin.com/EN/search-engine/robots-txt.asp
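
Something along these lines, for example (the bot names and paths here are just illustrations; match whatever shows up in your logs, and note that Crawl-delay is a non-standard extension only some bots honor):

```
# Example robots.txt - slow down MSN's bot, shut out Yahoo's Slurp
# (Inktomi), and keep everyone out of /images/
User-agent: msnbot
Crawl-delay: 30

User-agent: Slurp
Disallow: /

User-agent: *
Disallow: /images/
```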
 

Author Comment

by:jammerms
ID: 17110743
PugnaciousOne,
We've got the robots.txt set.  It seems that Inktomisearch and msnbot are the big culprits.  The googlebots seem to respect robots.txt.

blandyuk,

I'll definitely follow through with this suggestion if I can.  That's an interesting approach.




Keep the good ideas a-comin'.

--J

 
LVL 38

Accepted Solution

by:
Rich Rumble earned 400 total points
ID: 17111506
There are a number of files and meta tags you can add: noindex, nofollow, robots.txt ( http://www.robotstxt.org/wc/faq.html#prevent ). All of them can be, and are, ignored by spiders; maybe not by default, but spiders can be set to do so. Detection, account lockout (if possible), and IP blocking are the tried and true methods. Our corporation looked into this extensively, and it's all about detection and reaction. We lock out the accounts of abusers and block their IPs indefinitely, and per the contract they've signed, we get paid to allow them back in.
Here are some interesting approaches as well: http://palisade.plynt.com/issues/2006Jul/anti-spidering/
http://www.robotstxt.org/wc/meta-user.html
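For reference, the per-page meta tag version of the same thing, which, as above, only polite spiders honor:

```html
<!-- Goes in the <head> of each page; abusive bots will ignore it -->
<meta name="robots" content="noindex,nofollow">
```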
-rich
 

Author Comment

by:jammerms
ID: 17111700
richrumble,

We've got robots.txt set properly as well as the Meta tags in the header of each page.


That palisade.plynt.com link is really interesting.



Everyone,
I've read about using traps like a 1 x 1 px transparent bitmap image link to another page that has redirects back into itself with like a 20-second delay.  Is this still a good solution, or have spiders been made smarter?  Any other ways to make bad bots pay for their crimes?

Thanks again for the input.
 

Author Comment

by:jammerms
ID: 17111943
richrumble,

I see the part about traps in the Palisade article.  Thanks again for the pointer.




I'll let this sit over the weekend to see if any new ideas get posted in the meantime.

Thanks,
J
 
LVL 9

Assisted Solution

by:blandyuk
blandyuk earned 1200 total points
ID: 17113989
With regards to the ASP code to get the User-Agent:

Request.ServerVariables("HTTP_USER_AGENT")

You could simply do an "InStr" on the User-Agent for particular strings associated with bots. If the result is greater than 0, "Response.End()" it. Three easy ones to block:

http://www.google.com/bot.html
stumbleupon.com
Girafabot;

Here are some User-Agents I've taken from my tracking logs which are clearly bots:

Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; stumbleupon.com 1.926; VNIE5 RefIE5; .NET CLR 1.1.4322)
Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 4.0; Girafabot; girafabot at girafa dot com; http://www.girafa.com)

I'll post some more when I find them.

Obviously you're going to have to be careful about what you specify, as you could easily block actual users :( If you are specific, you shouldn't have a problem.
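
Pulling that together, a sketch of the check (the blocklist entries are the ones above; extend the list from your own logs):

```asp
<%
' Sketch: end the response early for known bot User-Agents.
' Blocklist entries are examples - add strings from your own logs.
Dim ua, badBots, i
ua = Request.ServerVariables("HTTP_USER_AGENT")
badBots = Array("http://www.google.com/bot.html", "stumbleupon.com", "Girafabot")
For i = 0 To UBound(badBots)
    If InStr(1, ua, badBots(i), vbTextCompare) > 0 Then
        Response.End ' drop the request before sending any page content
    End If
Next
%>
```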
 

Author Comment

by:jammerms
ID: 17124658
It turns out we're just doing HTML for our website, so the ASP solutions will have to wait.

I did notice that our robots.txt had a capital R, so I changed it to lowercase to see if that would help.

Thanks for the pointers, people.
