Solved

Google Bots - Killing the Server

Posted on 2014-04-14
489 Views
Last Modified: 2014-04-15
Hi

We do web hosting, and Googlebot hits some of the websites hosted on our servers so often and for such long stretches that the server becomes slow. Sometimes a single website gets over a million hits from Googlebot in a day.

We are trying to find a solution so that, even if a customer has not configured their website correctly with Googlebot/Webmaster Tools, Google cannot generate that many hits on our server.
Currently, in such cases, we block the Googlebot IPs in iptables and the servers perform well again, but then the customers with well-behaved websites suffer too.
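For illustration, the kind of rule we add looks like this (the range shown is only an example of a commonly seen Googlebot range; in practice we match whatever addresses show up in the logs):

    # Example only: drop all traffic from one Googlebot address range
    iptables -I INPUT -s 66.249.64.0/19 -j DROP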

Can someone please suggest a solution to this?

We are running CentOS 6.5 64-bit and using Nginx and Apache on our servers.
0
Comment
Question by:sysautomation
7 Comments
 
LVL 25

Expert Comment

by:Zephyr ICT
ID: 39998621
Have you already looked into Google's Webmaster Tools? You'll have to create an account if you don't have one, though.

You can limit the Googlebot crawl rate: https://support.google.com/webmasters/answer/48620?hl=en

Might be worth checking out?
0
 
LVL 53

Expert Comment

by:Scott Fell, EE MVE
ID: 39998753
That sounds odd for Google.  Is there a special app you have created, or is there one domain that has an issue?  I would look for the page that is causing the problem and send a note to the domain owner to fix their page, limit Google, or be turned off.

It sounds like they must have a dynamic page with a lot of links and the queries they use take up a lot of resources.  

In any case, you will probably have to have the domain owner take care of it, or limit/shut off their service.
0
 

Author Comment

by:sysautomation
ID: 39998830
Yes, it is dynamic. We are hosting Oracle APEX applications and have little control over customers, other than forcing a fix when the server is in trouble. What I am really looking for is a preventive measure.
0
 
LVL 53

Accepted Solution

by:Scott Fell, EE MVE earned 500 total points
ID: 39998849
I think that, as the server owner, all you can do is determine which customer is responsible and send them a notice that they are using more than their allotted resources.  Googlebot takes its instructions from the domain, not the server.  If you have control of the domain, you can use Webmaster Tools to limit the crawl rate, use robots.txt to stop Googlebot from crawling a folder, or use a noindex tag to keep an individual page out of the index: https://support.google.com/webmasters/answer/93708
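For example, a minimal robots.txt at the root of the affected domain might look like the following (the /reports/ path is only a placeholder for whatever resource-heavy section is being hammered):

    # Hypothetical example: keep Googlebot out of one expensive section
    User-agent: Googlebot
    Disallow: /reports/

For a single page, the tag goes in the page's <head>: <meta name="robots" content="noindex">. Keep in mind that robots.txt actually stops the crawling, while noindex only keeps an already-crawled page out of the index, so robots.txt is the one that reduces server load.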

You can also set up your server-side programming to prevent a single client from requesting too many pages in a given time window.
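Since Nginx already sits in front of Apache on these servers, one sketch of that idea is Nginx's limit_req module (the zone name, rate, and backend address below are assumptions, not taken from the question):

    # In the http {} block of nginx.conf: track clients by IP,
    # allowing roughly 5 requests/second each (a 10 MB zone holds ~160k addresses).
    limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

    server {
        listen 80;
        server_name example.com;              # hypothetical vhost

        location / {
            # Allow short bursts, then delay/reject the excess (503 by default).
            limit_req zone=perip burst=20;
            proxy_pass http://127.0.0.1:8080;   # assumed Apache/APEX backend
        }
    }

This throttles any single client, including a crawler coming from one IP, without cutting it off completely the way an iptables DROP does.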

In any case, this is a domain function and not a server function as far as being able to tell Googlebot what to do.
0
 
LVL 15

Expert Comment

by:Giovanni Heward
ID: 40000140
Bear in mind that the user-agent can easily be spoofed, so the bot may not actually belong to Google.  (Verify the IP with ARIN to confirm.)
0
 
LVL 53

Expert Comment

by:Scott Fell, EE MVE
ID: 40000210
That is a great point!  

As I said in http:#a39998753, this did not sound right for Google.
0
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 40000293
Here's what Google says about verifying their Googlebots: https://support.google.com/webmasters/answer/80553?hl=en
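In short, that page describes a reverse DNS lookup followed by a forward lookup to confirm. On the shell it looks roughly like this (66.249.66.1 is the example address used in Google's documentation):

    # Step 1: reverse lookup of the IP from the access log;
    # a genuine Googlebot resolves to a host in googlebot.com or google.com
    host 66.249.66.1
    # 1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com.

    # Step 2: forward lookup of that name; it must return the original IP
    host crawl-66-249-66-1.googlebot.com
    # crawl-66-249-66-1.googlebot.com has address 66.249.66.1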
0
