Hi,
I created a website (www.thefirstaidbox.co.uk) a few months ago & it still has not been crawled by Googlebot properly. I looked at my logs & Googlebot has visited twice - each time it did a GET (only) for the following:
/robots.txt - Expected this (& I guess I should create one)
/search.html - No such page (So I'll probably create one!)
/cgi-bin/ocb/ocb.cgi
/quikstore.html
/quikcode.html
The last three appear to be related to some e-commerce software - which this site doesn't use (we use a different package).
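For anyone who wants to check their own logs the same way, this is roughly how I spotted the visits - a sketch assuming an Apache-style access log (the sample entries and the file name access.log below are made up for illustration, not my real log):

```shell
# Create a small sample log in the common/combined Apache format.
# Googlebot identifies itself in the User-Agent field.
cat > access.log <<'EOF'
66.249.66.1 - - [10/Mar/2005:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 404 209 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
192.168.0.5 - - [10/Mar/2005:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
66.249.66.1 - - [10/Mar/2005:10:02:00 +0000] "GET /search.html HTTP/1.1" 404 209 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
EOF

# Count the log lines that mention Googlebot (drop -c to see the requests themselves).
grep -c "Googlebot" access.log   # prints: 2
```

Dropping the -c flag shows the actual request lines, which is how I found the five GETs listed above.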
I'm very confused as to why Googlebot asked for these pages & why it didn't try to crawl from index.htm(l) - the site homepage WAS submitted to Google.
I'm aware that the site is short of both links and content - which is being worked on - but that doesn't explain the odd Googlebot behaviour.
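On the robots.txt point - I'm planning to add a minimal one that allows all crawlers everywhere (an empty Disallow line means nothing is blocked):

```
User-agent: *
Disallow:
```

It would go in the site root, so it's served at www.thefirstaidbox.co.uk/robots.txt - that's the only place crawlers look for it.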
Any ideas?