cdillon asked:

Differences between bots and real browsers

For the custom reports on traffic to our site, I need to be able to determine whether a hit comes from a bot like Googlebot or from a real person looking at the site.  What is the best way to do this?  I'm currently using a list of user agents that I manually mark as human or bot.  Is there a better way?
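
Roughly, the current approach looks like this (a minimal sketch; the sample agent strings are placeholders, not our actual list):

# Hand-maintained map from user-agent string to a human/bot label.
# The sample entries are placeholders, not the real list.
KNOWN_AGENTS = {
    "Googlebot/2.1 (+http://www.google.com/bot.html)": "bot",
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)": "human",
}

def classify(user_agent):
    # Anything we have not reviewed yet falls through as "unknown".
    return KNOWN_AGENTS.get(user_agent, "unknown")

Every new string has to be reviewed by hand before it counts correctly in the reports.
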
shooksm replied:

The hard thing is that I can write a bot that mimics a common browser, so a user-agent list alone will never catch everything.  Here are a couple of suggestions for some other filters.

Bots should look for a robots.txt file at the root of your server, so you could mark any IP address that requests robots.txt as a bot.
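
A rough sketch of that, assuming a Common Log Format access log (adjust the field positions to whatever your server actually writes):

def bot_ips_from_log(log_path):
    """Collect the IP addresses of every client that requested robots.txt."""
    bot_ips = set()
    with open(log_path) as log:
        for line in log:
            parts = line.split()
            if len(parts) < 7:
                continue
            ip, path = parts[0], parts[6]   # host ... "METHOD path HTTP/x"
            if path.endswith("/robots.txt"):
                bot_ips.add(ip)
    return bot_ips

Any later hit from one of those IPs can then be excluded from the human counts.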

You could also set a threshold on what percentage of the site has been viewed in one session.  Your average user goes to the page they want, or quickly clicks through to the piece of information they need, and then leaves.  On a site with 100 pages, I highly doubt that someone who has viewed 75 of them in a single session is a real person.
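
Sketched out, with the 100-page total and 75% cut-off from the example above (tune both to your site):

from collections import defaultdict

TOTAL_PAGES = 100          # your real page count
COVERAGE_THRESHOLD = 0.75  # fraction of the site viewed before we get suspicious

def heavy_coverage_ips(hits):
    """hits: iterable of (ip, page_path) pairs already parsed from the log."""
    pages_by_ip = defaultdict(set)
    for ip, page in hits:
        pages_by_ip[ip].add(page)
    return {ip for ip, pages in pages_by_ip.items()
            if len(pages) / TOTAL_PAGES > COVERAGE_THRESHOLD}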

You can mark strange requests or vulnerability probes as bots too, for instance the common check to see whether cmd.exe is accessible.
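
A quick check for that kind of probe might look like this (the path fragments are common examples, nothing like a complete list):

PROBE_FRAGMENTS = ("cmd.exe", "root.exe", "/_vti_bin/", "/etc/passwd")

def looks_like_probe(request_path):
    path = request_path.lower()
    return any(fragment in path for fragment in PROBE_FRAGMENTS)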

Just a couple of ideas, although I think your current method will work for the majority of requests.
A lot of bots/spiders include their identity alongside the browser name, like this:

Mozilla/4.0 (compatible; FastCrawler3, support-fastcrawler3@fast.no)
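
A simple marker check catches most of these self-identified crawlers; a rough sketch (the marker list is only a starting point):

import re

# Words and artifacts that rarely appear in a real browser's user agent:
# contact e-mail addresses and info URLs are common in crawler strings.
BOT_MARKERS = re.compile(r"bot|crawl|spider|slurp|@|http://", re.IGNORECASE)

def looks_like_bot(user_agent):
    return bool(BOT_MARKERS.search(user_agent))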


A good ref:
http://www.psychedelix.com/agents1.html

How are you tracking the reports or parsing your logs?

CJ
cdillon (ASKER) replied:

We screen out bots that have their identity stated in the browser string.  The problem is that the list is changing/growing and then when a new bot finds our site, the reports suddenly show many more hits.
How are you screening them?  Is it an inclusive or an exclusive list?

CJ
cdillon (ASKER) replied:

We exclude browsers whose user_agent contains the words googlebot or scooter or ask jeeves or ....
Instead of excluding those, why not keep an inclusive list of tracked browsers?  New browsers can easily be added, and it lets you account for any new bots without having to keep track of each one.
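
Something like this, as a rough sketch (the family patterns are only examples; add whatever legitimate browsers you actually see in your logs):

import re

BROWSER_FAMILIES = [
    ("MSIE",    re.compile(r"MSIE \d")),
    ("Firefox", re.compile(r"Firefox/\d")),
    ("Opera",   re.compile(r"Opera[/ ]\d")),
    ("Safari",  re.compile(r"Safari/\d")),
]

def browser_family(user_agent):
    for name, pattern in BROWSER_FAMILIES:
        if pattern.search(user_agent):
            return name
    return "other / likely bot"

Anything that lands in "other / likely bot" only needs a look if its share of traffic gets large enough to matter.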

CJ
cdillon (ASKER) replied:

Every toolbar adds its own portion to the browser string, and so do site providers and others.  I started keeping track, and so far we've seen over 10,000 distinct browsers looking at our site.  It's not very practical to have to decide whether every new browser is a bot or not.
10,000 distinct browsers - does that include the bots?  Or are you saying that legit browsers have over 10K variations?

CJ
cdillon (ASKER) replied:

Mostly legit browsers, with some bots mixed in.  By distinct browsers, I mean that the cgi.user_agent string is different.
This is a tough one.  You should be able to look through your history and find a robust set of CGI.user_agent strings that you want to track, and go with that.  Being comprehensive would be very difficult.

CJ
cdillon (ASKER) replied:

I recommend a points refund.
ASKER CERTIFIED SOLUTION
modulo
