• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 591

GoogleBot shows site unreachable (500)

Hello all,

Over a week ago we switched our site from ASP to ASP.NET, and two days ago I discovered that Google has not been able to index our site since. In Webmaster Tools, both the sitemap XML and Googlebot are reporting the site as unreachable.

I checked the robots.txt and Google is able to see that; I even have it now with no Disallow entries. When I try Fetch as Googlebot, it shows unreachable. If I put an HTML page into the fetch, it finds it fine.

I looked into my web logs and I see the 500 error happening whenever the user agent is Googlebot.

I found web posts about the ASP.NET 2.0 Mozilla browser-detection hole and followed those steps. I do not think that is the issue, though. I loaded the User Agent Switcher plug-in into Firefox and I can browse the site with no errors.
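For what it's worth, the user-agent test can also be run outside the browser. A browser with a spoofed user agent still sends all of its other headers (Accept-Language and so on), so a raw request from code is a closer match to what a crawler actually sends. Below is a minimal C# sketch of mine (not from the original post; the URL is my site and the user-agent string is just Googlebot's well-known value) that fetches the page with only a User-Agent header and prints the status code:

using System;
using System.Net;

class GooglebotProbe
{
    static void Main()
    {
        // Fetch the page the way a crawler would: only a User-Agent header,
        // none of the other headers a real browser adds.
        HttpWebRequest request =
            (HttpWebRequest)WebRequest.Create("http://www.planetshoes.com/");
        request.UserAgent =
            "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

        try
        {
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Status: {0}", (int)response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            // GetResponse() throws on 4xx/5xx; the status code is on ex.Response.
            HttpWebResponse error = ex.Response as HttpWebResponse;
            Console.WriteLine(error != null
                ? "Status: " + (int)error.StatusCode
                : ex.Message);
        }
    }
}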

I am not sure what else I am missing or where else to check.

My site is www.planetshoes.com

Any help would be great.
Asked by: planet_scott
1 Solution
 
edster9999 Commented:
The DNS entries are wrong.

On my server I ran a dig to see how the name resolved, and it worked OK. It had two CNAME entries, which is not exactly good practice but should work.
The top-level host (normally the company you registered the name with) was akamaiedge.net.

I then tried it on a web-based checker, and this went to different top-level DNS hosts:

http://mydnscheck.com/index.php?op=diagnose&domain=www.planetshoes.com&ns1=&ns2=&ns3=&ns4=&ns5=&ns6=

Nameservers for PLANETSHOES.COM are as follows:
NS89.WORLDNIC.COM
NS90.WORLDNIC.COM

and this fails at the end, finding no nameservers for the www host.

So the first step is to check the NS records for PLANETSHOES.COM. These should point to two (or more) servers that can tell anyone where the rest of the servers for this domain are.
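(An aside from me, not part of the diagnosis above: the resolution half of that dig check can be done from C# with System.Net.Dns, though it only follows the A/CNAME chain; for the NS records themselves you still need dig or nslookup.)

using System;
using System.Net;

class DnsProbe
{
    static void Main()
    {
        // Forward-resolve the host; HostName is typically the canonical
        // (post-CNAME) name, and AddressList holds the resulting addresses.
        IPHostEntry entry = Dns.GetHostEntry("www.planetshoes.com");
        Console.WriteLine("Canonical host: {0}", entry.HostName);
        foreach (IPAddress address in entry.AddressList)
        {
            Console.WriteLine("Address: {0}", address);
        }
    }
}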
 
planet_scott (Author) Commented:
Nothing has changed in our DNS in six months. We use Akamai as a CDN. Google had no issue with our DNS before the move to ASP.NET.

I will look into the findings above, but this cannot be the issue, since Google can find our site to load the sitemap. It can find the robots.txt, and I can Fetch as Googlebot any page on the server except the .NET pages.

I even put a classic ASP page back on the site and was able to fetch it as Googlebot.

So I think this issue is internal to my web server.
 
planet_scott (Author) Commented:
I might have figured out the issue. We have the following code:
// Read the browser's Accept-Language header from the raw server variable.
// ServerVariables returns null when the header is absent (Googlebot sends
// no Accept-Language), so the .ToString() below throws, causing the 500.
User_Prefer_Langs = System.Web.HttpContext.Current.Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"].ToString();
arrLang = User_Prefer_Langs.Split(',');
User_Prefer_Lang = arrLang[0];   // always get the first (most preferred) language

This might have been the issue. I was trying to get the language the browser is set to.
We changed it to just set "EN". It looks like that might have fixed the issue.
I hope to know for sure soon.
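For anyone hitting the same thing: a null-safe version of the snippet above would keep per-browser language detection instead of hard-coding "EN". This is a sketch of mine based on that snippet, not the exact change we deployed. Crawlers such as Googlebot send no Accept-Language header, so the server variable comes back null and the original .ToString() call threw the exception behind the 500.

string User_Prefer_Langs =
    System.Web.HttpContext.Current.Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"];

string User_Prefer_Lang;
if (string.IsNullOrEmpty(User_Prefer_Langs))
{
    User_Prefer_Lang = "EN";                  // fall back when the header is absent
}
else
{
    string[] arrLang = User_Prefer_Langs.Split(',');
    User_Prefer_Lang = arrLang[0].Trim();     // first entry is the preferred language
}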
Question has a verified solution.
