GoogleBot shows site unreadable (500)

Hello all,

Over a week ago we switched our site from classic ASP to ASP.NET, and two days ago I discovered that Google has not been able to index our site. In Webmaster Tools, both the XML sitemap and the Googlebot fetch report the site as unreachable.

I checked robots.txt and Google is able to see it; I even have it with no Disallow rules now. When I try Fetch as Googlebot, it comes back unreadable. If I point the fetch at a plain HTML page, it finds it fine.

I looked into my web logs and I see the 500 error happening whenever the user agent is Googlebot.

I found web posts about the ASP.NET 2.0 Mozilla browser-detection hole and followed those steps, but I do not think that is the issue. I loaded the User Agent Switcher plugin into Firefox and I can browse the site with no errors while spoofing the Googlebot user agent.
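For anyone else debugging this, the same test can be done without a browser plugin: send a request with Googlebot's user agent and see what status code comes back. A minimal sketch; the URL is a placeholder for a .NET page on your own site, and note that, like the real crawler, it sends no Accept-Language header.

```csharp
using System;
using System.Net;

public class GooglebotFetch
{
    // Build a request that mimics Googlebot: its user-agent string,
    // and deliberately no Accept-Language header.
    public static HttpWebRequest BuildGooglebotRequest(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.UserAgent =
            "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
        return request;
    }

    public static void Main()
    {
        // Placeholder URL; substitute one of your .NET pages.
        var request = BuildGooglebotRequest("http://example.com/default.aspx");
        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
                Console.WriteLine("Status: " + (int)response.StatusCode);
        }
        catch (WebException ex)
        {
            // A 500 surfaces as a WebException that carries the response.
            var response = ex.Response as HttpWebResponse;
            Console.WriteLine(response != null
                ? "Status: " + (int)response.StatusCode
                : "Request failed: " + ex.Status);
        }
    }
}
```

If this prints a 500 while the same URL works in a normal browser, the problem is almost certainly something in the page that depends on a header the crawler doesn't send.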

I am not sure what else I am missing or where else to check.

My site is

Any help would be great.
planet_scott Author Commented:
I might have figured out the issue. We have the following code:
 User_Prefer_Langs = System.Web.HttpContext.Current.Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"].ToString();  
arrLang = User_Prefer_Langs.Split(',');
User_Prefer_Lang = arrLang[0];   // always get the first language

This might have been the issue. I was trying to read the language the browser is set to, but Googlebot apparently sends no Accept-Language header, so that server variable comes back null and the .ToString() call throws. We changed it to just set "EN" for now, and that looks like it fixed the issue. I hope to see the pages indexed again soon.
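Rather than hard-coding "EN", the snippet above can be made null-safe so real browsers still get their preferred language. A sketch; the helper name and the "en" fallback are my own choices:

```csharp
using System;

public class LanguageHelper
{
    // Returns the first language from an Accept-Language header value,
    // falling back to "en" when the header is missing or empty
    // (crawlers such as Googlebot often send no Accept-Language at all).
    public static string GetPreferredLanguage(string acceptLanguageHeader)
    {
        if (string.IsNullOrEmpty(acceptLanguageHeader))
            return "en";
        string[] langs = acceptLanguageHeader.Split(',');
        return langs[0].Trim();   // always take the first language
    }

    public static void Main()
    {
        // A browser sends something like "fr-FR,fr;q=0.8,en;q=0.5";
        // a crawler may send nothing.
        Console.WriteLine(GetPreferredLanguage("fr-FR,fr;q=0.8,en;q=0.5")); // fr-FR
        Console.WriteLine(GetPreferredLanguage(null));                      // en
    }
}
```

In the page itself you would pass Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"] (which can legitimately be null) into the helper, instead of calling .ToString() on it directly.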
The DNS entries are wrong.

On my server I ran a dig command to see how the name resolved, and it worked OK. It had two CNAME entries, which is not exactly good practice but should work.
The top level host (normally the company you registered the name with) was

I then tried it with a web-based lookup tool, and that went to different top-level DNS hosts

Nameservers for PLANETSHOES.COM are as follows:

and this fails at the end, with no nameservers found for the www part.

So the first step is to check the NS records for PLANETSHOES.COM. These should point to two (or more) servers that can tell anyone where the rest of the records for this domain are.
planet_scott Author Commented:
Nothing has changed on our DNS in six months. We use Akamai as our CDN. Google had no issue with our DNS before the move to ASP.NET.

I will look into the findings above, but this cannot be the issue, since Google can find our site to load the sitemap. It can fetch robots.txt, and I can Fetch as Googlebot any page on the server except the .NET pages.
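One way to confirm the failure is internal to the server is to log the real exception and the requesting user agent from Global.asax; the 500 Googlebot sees would then show up in the log with a stack trace. A sketch only: the log path is a placeholder, and in production you would use your existing logging rather than raw file appends.

```csharp
// Sketch of an Application_Error handler for Global.asax.cs. It records the
// unhandled exception together with the requesting user agent, so a
// crawler-only failure (e.g. a NullReferenceException on a missing header)
// becomes visible. The log path is a placeholder; point it somewhere the
// app pool identity can write.
using System;
using System.IO;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        Exception ex = Server.GetLastError();
        string userAgent = Request.UserAgent ?? "(none)";
        string line = string.Format("{0:u}  UA: {1}  Error: {2}",
                                    DateTime.UtcNow, userAgent, ex);
        File.AppendAllText(@"C:\logs\app-errors.log", line + Environment.NewLine);
    }
}
```

Requests from a normal browser and from the Googlebot fetch can then be compared side by side in the log.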

I even put a classic ASP page back on the site and was able to fetch it as Googlebot.

So I think this issue is internal to my web server.