On my site I check each visitor's user agent against a list of known robots (allrobots.txt) to prevent crawlers' visits from being recorded in the database.
I used this code:
$allrobots = file_get_contents( 'allrobots.txt' );
// grab everything after each "robot-id:" field, one match per robot
preg_match_all( '/(?<=robot-id:\s).*$/im', $allrobots, $crawlers );
// preg_match_all stores the full-pattern matches in $crawlers[0]
if ( !in_array( strtolower( $_SERVER['HTTP_USER_AGENT'] ), $crawlers[0] ) ) {
    // here write the visitor's data to the database
}
But this seems to fail, since crawler visits are still being recorded: for instance, I still see database entries for pages that no longer exist, with visits coming from Mountain View (that is Google, isn't it?).
So what is the best way to accomplish my goal?
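I suspect the exact match is part of the problem, since a real crawler's user agent (e.g. "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)") never equals a bare robot id like "googlebot". Here is a minimal sketch of what I was considering instead: matching each robot-id as a substring of the visitor's user agent. The is_crawler function name is mine, and I'm assuming allrobots.txt keeps one "robot-id:" line per robot as in the robotstxt.org database format.

<?php
// Sketch: treat a visit as a crawler when any robot-id from
// allrobots.txt appears as a substring of the visitor's user agent.
// is_crawler() is a hypothetical helper, named here for illustration.
function is_crawler( $userAgent, $robotsFile = 'allrobots.txt' ) {
    $allrobots = file_get_contents( $robotsFile );
    if ( $allrobots === false ) {
        return false; // list unreadable: fall back to treating as a visitor
    }
    // capture the id after each "robot-id:" field into $matches[1]
    preg_match_all( '/^robot-id:\s*(.+)$/im', $allrobots, $matches );
    $userAgent = strtolower( $userAgent );
    foreach ( $matches[1] as $id ) {
        $id = strtolower( trim( $id ) );
        // substring test: "googlebot" is found inside the full UA string
        if ( $id !== '' && strpos( $userAgent, $id ) !== false ) {
            return true;
        }
    }
    return false;
}

if ( !is_crawler( isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '' ) ) {
    // here write the visitor's data to the database
}

One worry with this sketch: some robot ids might be short or generic enough to appear inside normal browser strings, so it could over-match. Is a substring check like this reliable enough, or is there a better-established approach?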
Thanks to all for any advice.