
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 362

Want to remove URL from Googlebot search but can't recreate exact URL

I'm in charge of maintaining a website backed by three underlying MS Access CMS databases. I recently deleted a number of old records from the CMS, as they were either very old news stories or job vacancies that had already been publicised.

Google Webmaster Tools is now showing severe health issues with the site, citing a number of URLs that are currently unreachable (returning a 500 error).

I went to use the "Remove URL" tool, but Google's support pages state:

To remove a page or image, you must do one of the following:
      1. Make sure the content is no longer live on the web. Requests for the page must return an HTTP 404 (Not Found) or 410 (Gone) status code.
      2. Block the content using a robots.txt file.
      3. Block the content using a meta noindex tag.

These are my perceived issues in meeting the above criteria:

1.  I can't recreate the URLs in question and serve generic 404/410 error content, because part of each URL is made up of the database record's ID field. That field has an AutoNumber datatype, so I can't manually re-enter the deleted IDs to recreate the URLs.

2.  I am not keen on blocking with robots.txt, because the URLs are sequentially numbered and (as I understand it) robots.txt rules are prefix matches: if I block http://www.example.com/news/news_details.asp?id=7, then any URL beginning with that string will also be blocked going forward, i.e. ?id=70, ?id=700 etc., and I don't want those blocked.
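To illustrate what I mean (example.com stands in for my real domain, and this rule is hypothetical):

```
User-agent: *
Disallow: /news/news_details.asp?id=7
```

My reading is that, because matching is by prefix, this single rule would also cover ?id=70, ?id=700 and so on, which are live pages I want indexed.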

3.  I don't think I can block the content using a meta noindex tag, for the reasons given in 1.

I would appreciate guidance with this, as I don't want the website's ranking to suffer in any way due to erroneous URLs.

Many thanks in advance.
Asked by: CDIT_Solutions

1 Solution
 
COBOLdinosaur Commented:
I assume you are sending the 500 responses because the database query errors when it can't find the deleted record. Change the page logic so that when the query fails or returns no record, it serves your 404 page with a genuine 404 status code instead.
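A minimal sketch of that in classic ASP, since the URLs suggest an ASP site. The recordset, connection object (`conn`), table and field names here are all assumptions; adapt them to your actual query:

```asp
<%
' news_details.asp -- hypothetical structure; adjust names to your code
Dim rs, id
id = Request.QueryString("id")

' CLng guards against non-numeric input before it reaches the query
Set rs = conn.Execute("SELECT * FROM News WHERE ID = " & CLng(id))

If rs.EOF Then
    ' Record was deleted: tell bots the page is gone
    Response.Status = "404 Not Found"   ' or "410 Gone"
    Server.Transfer "/404.asp"          ' serve the error page content
End If
%>
```

The key point is setting `Response.Status` before any content goes out: merely showing a "not found" page while returning 200 (or 500) won't satisfy Google's removal criteria.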
 
freshcontent Commented:
Another strategy would be to send a 301 (permanent) redirect, sending both search-engine bots and actual users from each "old" removed page to your website's home page, or to a landing page created to make the most of visitors arriving at your site.

If your "old" pages had any inbound links from other websites, a 301 passes some of that link "juice" along within Google/Bing's algorithms, which helps your website.

This is generally the best method for preserving the Search Engine Optimization (SEO) value of those old pages in case anyone out on the internet still has links pointing to them.
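A sketch of the 301 variant in classic ASP, placed where the page discovers the record no longer exists (the destination URL is illustrative):

```asp
<%
' Assumed: the record lookup found nothing, so redirect permanently
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/"
Response.End
%>
```

Note that `Response.Redirect` alone sends a 302 (temporary) in classic ASP, so the status line must be set explicitly to get the permanent redirect that passes link equity.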

Let us know of any other follow-up questions...
