Googlebot shows site unreachable (500)

Hello all,

Over a week ago we switched our site from classic ASP to ASP.NET, and 2 days ago I discovered that Google has not been able to index our site. In Webmaster Tools, both the sitemap XML and Googlebot are reporting the site as unreachable.

I checked robots.txt and Google is able to see it; I even have it set now with no Disallow rules. When I try Fetch as Googlebot, it shows the page as unreachable. If I point the fetch at a plain HTML page, it finds it fine.

I looked into my web logs and I see the 500 error happening whenever the user agent is Googlebot.

I found posts about the ASP.NET 2.0 Mozilla browser-detection hole and followed those steps, but I do not think that is the issue. I loaded the User Agent Switcher plugin into Firefox, spoofed the Googlebot user agent, and I can browse the site with no errors.
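For reference, here is a rough console snippet I can use to reproduce the request outside the browser (just a sketch; the URL and user-agent string are only examples, not the exact page that fails):

 using System;
 using System.Net;

 class GooglebotFetchTest
 {
     static void Main()
     {
         // Hypothetical test URL; substitute any .aspx page that fails for Googlebot.
         string url = "http://www.planetshoes.com/default.aspx";

         HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
         // Send the Googlebot user agent; note that real Googlebot requests
         // also omit headers a normal browser sends, such as Accept-Language.
         request.UserAgent = "Googlebot/2.1 (+http://www.google.com/bot.html)";

         try
         {
             using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
             {
                 Console.WriteLine("Status: " + (int)response.StatusCode);
             }
         }
         catch (WebException ex)
         {
             // A 500 surfaces here as a WebException with the response attached.
             HttpWebResponse error = ex.Response as HttpWebResponse;
             Console.WriteLine(error != null
                 ? "Status: " + (int)error.StatusCode
                 : "Request failed: " + ex.Message);
         }
     }
 }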

I am not sure what else I am missing or where else to check.

My site is www.planetshoes.com

Any help would be great.
planet_scott asked:

edster9999 commented:
The DNS entries are wrong.

On my server I ran a dig command to see how the name resolved, and it worked OK. It had two CNAME entries, which is not exactly good practice but should work.
The top-level host (normally the company you registered the name with) was akamaiedge.net.

I then tried it on a web-based checker, and this went to different top-level DNS hosts:

http://mydnscheck.com/index.php?op=diagnose&domain=www.planetshoes.com&ns1=&ns2=&ns3=&ns4=&ns5=&ns6=

Nameservers for PLANETSHOES.COM are as follows:
NS89.WORLDNIC.COM
NS90.WORLDNIC.COM

and this fails at the end with no nameservers found for the www part.

So the first step is to check the NS records for PLANETSHOES.COM. These should point to two (or more) servers that can tell anyone where the rest of the servers for this domain are.
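If you want to check it yourself from a shell, commands along these lines will show the delegation and the CNAME chain (output omitted here):

 dig NS planetshoes.com
 dig www.planetshoes.com
 dig +trace www.planetshoes.com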
planet_scott (Author) commented:
Nothing has changed in our DNS in 6 months. We use Akamai as our CDN. Google had no issue with our DNS before the move to ASP.NET.

I will look into the findings above, but this cannot be the issue, since Google can find our site to load the sitemap. It can find robots.txt, and I can fetch as Googlebot any page on the server except the .NET pages.

I even put a classic ASP page back on the site and was able to fetch it as Googlebot.

So I think this issue is internal to my web server.
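To see the actual exception behind the 500 in the meantime, I can temporarily turn off custom errors (a minimal sketch, assuming the default ASP.NET error settings; this exposes details to remote clients, so it should be reverted afterwards):

 <configuration>
   <system.web>
     <!-- Temporarily show detailed errors so the exception behind the 500 is visible. -->
     <customErrors mode="Off" />
   </system.web>
 </configuration>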
planet_scott (Author) commented:
I might have figured out the issue. We have the following code:
 // Read the browser's preferred language from the Accept-Language header.
 // Googlebot does not send an Accept-Language header, so this server variable
 // comes back null for its requests and the .ToString() call throws, producing the 500.
 User_Prefer_Langs = System.Web.HttpContext.Current.Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"].ToString();
 arrLang = User_Prefer_Langs.Split(',');
 User_Prefer_Lang = arrLang[0];   // always take the first language in the list

This might have been the issue: I was trying to get the language the browser is set to, but Googlebot does not send that header. We changed the code to just set "EN" for now, and it looks like that fixed the errors. I hope to see the pages indexed again soon.
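If we ever want the per-browser language back instead of hard-coding "EN", a null-safe version along these lines should avoid the crash when the header is missing (a sketch only, not production-tested; the fallback value is our choice):

 // Fall back to "EN" when the client (e.g. Googlebot) sends no Accept-Language header.
 string acceptLang = System.Web.HttpContext.Current.Request.ServerVariables["HTTP_ACCEPT_LANGUAGE"];

 string User_Prefer_Lang;
 if (string.IsNullOrEmpty(acceptLang))
 {
     User_Prefer_Lang = "EN";
 }
 else
 {
     // Take the first language in the comma-separated list, e.g. "en-us,en;q=0.5" -> "en-us".
     User_Prefer_Lang = acceptLang.Split(',')[0].Trim();
 }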