Solved

Google Bots - Killing the Server

Posted on 2014-04-14
477 Views
Last Modified: 2014-04-15
Hi

We do web hosting, and some of the websites hosted on our servers are hit by Googlebot so often, and for such long stretches, that the server becomes slow. Sometimes a single website gets over a million hits from Googlebot in one day.

We are trying to find a solution so that even if a customer has not configured their website correctly with Googlebot/Webmaster Tools, Google cannot send that many hits to our server.
Currently in such cases we block the Googlebot IPs in iptables and the servers perform well again, but then the customers with well-behaved websites suffer too.

Can someone please suggest a solution to this?

We are running CentOS 6.5 64-bit and using Nginx and Apache on our servers.
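(For reference, the iptables approach described above can be softened from a full block to a rate limit using the hashlimit match, so well-behaved crawling still gets through. The subnet below is a placeholder; substitute the crawler ranges you actually observe:)

```shell
# Drop traffic from a crawler range only when it exceeds ~10 req/sec per source IP,
# instead of blocking the range outright. 203.0.113.0/24 is a placeholder subnet.
iptables -A INPUT -p tcp --dport 80 -s 203.0.113.0/24 \
  -m hashlimit --hashlimit-name gbot \
  --hashlimit-above 10/sec --hashlimit-mode srcip \
  -j DROP
```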
Question by:sysautomation
7 Comments
 
LVL 25

Expert Comment

by:Zephyr ICT
ID: 39998621
Did you already look into Google's Webmaster Tools? You'll have to create an account if you don't have one, though.

You can limit the Googlebot crawl rate: https://support.google.com/webmasters/answer/48620?hl=en

Might be worth checking out?
 
LVL 52

Expert Comment

by:Scott Fell, EE MVE
ID: 39998753
That sounds odd for Google. Is there a special app you have created, or is there one domain with an issue? I would find the page that is causing the problem and send a note to the domain owner to fix their page / limit Google, or be turned off.

It sounds like they must have a dynamic page with a lot of links, and the queries it runs take up a lot of resources.

In any case, you will probably have to have the domain owner take care of it, or limit/shut off their service.
 

Author Comment

by:sysautomation
ID: 39998830
Yes, it is dynamic. We are hosting Oracle APEX applications and have little control over customers, except to force them to act when the server is in trouble. What I am really looking for is a preventive measure.

 
LVL 52

Accepted Solution

by:Scott Fell, EE MVE (earned 500 total points)
ID: 39998849
I think as the server owner, all you can do is determine which customer is responsible and send them a notice that they are using more than their allotted resources. Googlebot listens to the domain, not the server. If you have control of the domain, you can use Webmaster Tools to limit the crawl rate, use robots.txt to prevent Googlebot from crawling a folder, or use a noindex tag on a page to prevent it from being crawled: https://support.google.com/webmasters/answer/93708
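For reference, the robots.txt approach might look like this (the /reports/ path is a hypothetical example of a resource-heavy folder):

```text
# robots.txt at the site root -- keep Googlebot out of a heavy folder
User-agent: Googlebot
Disallow: /reports/
```

The per-page alternative is a `<meta name="robots" content="noindex">` tag in the page's head, as described in the support article linked above.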

You can also set up your server-side programming to prevent one client from paging through too many pages in a given time window.
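Since the servers here run Nginx in front of Apache, one way to sketch that kind of throttle is Nginx's limit_req module; zone name, size, and rates below are illustrative, not a tuned recommendation:

```nginx
# In the http{} context: one token bucket per client IP, ~1 request/sec,
# with the state table capped at 10 MB.
limit_req_zone $binary_remote_addr zone=crawlers:10m rate=1r/s;

server {
    listen 80;

    location / {
        # Allow short bursts of up to 20 requests; anything beyond that
        # is rejected immediately (503) instead of being queued.
        limit_req zone=crawlers burst=20 nodelay;
    }
}
```

This throttles every aggressive client, not just Googlebot, which is arguably what a shared host wants as a safety net.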

In any case, this is a domain function and not a server function as far as being able to tell googlebot what to do.
 
LVL 14

Expert Comment

by:Giovanni Heward
ID: 40000140
Bear in mind the user-agent can easily be spoofed, so the bot may not actually belong to Google. (Verify the IP with ARIN to confirm.)
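A minimal sketch of that check in Python, following the reverse-then-forward DNS verification Google recommends for its crawlers (function names are my own):

```python
import socket

# Domains that genuine Google crawler PTR records resolve under.
GOOGLE_DOMAINS = (".googlebot.com", ".google.com")

def is_google_hostname(hostname):
    """Pure check: does the PTR hostname end in a Google crawler domain?"""
    return hostname.rstrip(".").endswith(GOOGLE_DOMAINS)

def verify_googlebot(ip):
    """Reverse-DNS the IP, check the domain, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)    # reverse (PTR) lookup
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        # Forward lookup of the PTR hostname must point back at the same IP,
        # otherwise the PTR record itself could be spoofed.
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False
```

Anything failing this check that still claims a Googlebot user-agent is safe to block outright in iptables.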
 
LVL 52

Expert Comment

by:Scott Fell, EE MVE
ID: 40000210
That is a great point!

As I said in http:#a39998753, this did not sound right for Google.
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 40000293
Here is what Google says about verifying Googlebot: https://support.google.com/webmasters/answer/80553?hl=en



