Solved

Googlebot timeout errors

Posted on 2010-11-15
Medium Priority
1,044 Views
Last Modified: 2012-05-10
Hello Experts,
Google has stopped indexing my site. I moved the site to my own servers a few weeks ago, and since then Google has stopped indexing it.
I get the following error:
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.
I am able to access both the sitemap and the robots.txt file without any problems and without any timeout errors.
I am thinking my firewall is blocking Googlebot from accessing my site.
Does anyone know which ports need to be open to allow Googlebot to index my site?
My site is located in the DMZ, and my firewall is a Palo Alto firewall.
Thank you
Roy
Question by: rfinaly
8 Comments
 
LVL 84

Accepted Solution

by: Dave Baldwin (earned 2000 total points)
ID: 34141710
Googlebot comes in on port 80 like a web browser would.  Can you post your web address so we can check it out?
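A quick way to confirm this from outside the firewall is a simple connectivity check on port 80. The sketch below is a rough Python example, using the hostname that comes up later in this thread; it only tests whether the port answers, not whether Googlebot itself can crawl.

# Rough connectivity check: does the server answer on port 80 from outside
# the firewall, the same port a browser or Googlebot would use?
import socket

HOST = "www.usuniversity.edu"   # hostname mentioned later in this thread
PORT = 80                       # plain HTTP

try:
    with socket.create_connection((HOST, PORT), timeout=10):
        print("Port 80 is reachable on", HOST)
except OSError as exc:
    print("Could not connect:", exc)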
 
LVL 84

Expert Comment

by: Dave Baldwin
ID: 34148408
Your 'robots.txt' file doesn't look right to me. 'Bad robots' typically ignore 'robots.txt'. You may be blocking Googlebot because it comes in as:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

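One way to test that theory is to request robots.txt while sending that exact User-Agent string and see whether anything between the internet and the server treats it differently from a normal browser. A rough Python sketch (the URL is assumed from the rest of the thread):

# Fetch robots.txt while identifying as Googlebot, to see whether requests
# with that User-Agent are treated differently from a normal browser.
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

req = urllib.request.Request(
    "http://www.usuniversity.edu/robots.txt",
    headers={"User-Agent": GOOGLEBOT_UA},
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.status)
        print(resp.read().decode("utf-8", errors="replace"))
except Exception as exc:
    print("Request failed:", exc)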
 

Author Comment

by: rfinaly
ID: 34150023
I logged in to Google Webmaster Tools and generated a new robots.txt file using the Google tools.
Here is the file: http://www.usuniversity.edu/robots.txt
When testing the robots.txt file with the Google tools, I get:
http://www.usuniversity.edu/ Allowed by line 2: Allow: /
Detected as a directory; specific files may have different restrictions.

I am assuming it is good? I will resubmit my sitemap files and see what happens.
Thank you
Roy
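For an independent sanity check of the rules themselves (separate from Google's own tester), Python's standard urllib.robotparser can parse the published robots.txt and report whether Googlebot may fetch the root and the sitemap page. A minimal sketch under those assumptions:

# Parse the published robots.txt with the standard library and ask whether
# Googlebot is allowed to fetch the site root and the sitemap page.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.usuniversity.edu/robots.txt")
rp.read()

print(rp.can_fetch("Googlebot", "http://www.usuniversity.edu/"))
print(rp.can_fetch("Googlebot", "http://www.usuniversity.edu/sitemap.html"))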
 

Author Comment

by: rfinaly
ID: 34150472
I also tested a few pages with the Google "Fetch as Googlebot" tool, and this is what I get:
This is how Googlebot fetched the page.

URL: http://www.usuniversity.edu/

Date: Tue Nov 16 13:46:57 PST 2010

Googlebot Type: Web

When submitting the sitemap.html file I get:
URL timeout: robots.txt timeout
http://www.usuniversity.edu/sitemap.html

Should I remove the robots.txt altogether?
Roy
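The timeout report can be approximated from outside the network with a short scripted fetch of both files; if the requests hang when run externally but succeed from inside the DMZ, the firewall rather than the web server is the more likely cause. A rough Python sketch:

# Fetch robots.txt and sitemap.html with a short timeout. If these time out
# when run from outside the network but succeed from inside the DMZ, the
# firewall is the more likely culprit than the web server.
import urllib.request

for url in ("http://www.usuniversity.edu/robots.txt",
            "http://www.usuniversity.edu/sitemap.html"):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)
    except Exception as exc:
        print(url, "-> failed:", exc)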
 
LVL 84

Expert Comment

by: Dave Baldwin
ID: 34151227
I'm lost at this point.  Click on "Request Attention" and get some others to look at your question.
 

Author Comment

by: rfinaly
ID: 34177788
I was able to resolve the problem; it was my firewall that was blocking web crawlers.
After opening the port, everything went back to normal.
Thank you
Roy
 
LVL 84

Expert Comment

by: Dave Baldwin
ID: 34177870
Cool, thanks.
