I maintain a website whose CMS is backed by three MS Access databases. I recently deleted a number of old records from the CMS, as they were either very old news stories or job vacancies that were no longer being publicised.
Google Webmaster Tools is now showing severe health issues for the site, citing a number of currently "unreachable" URLs that return a 500 error.
I went to use the "Remove URL" tool, but Google's support pages state:
To remove a page or image, you must do one of the following:
1. Make sure the content is no longer live on the web. Requests for the page must return an HTTP 404 (not found) or 410 status code.
2. Block the content using a robots.txt file.
3. Block the content using a meta noindex tag.
These are the issues I see with meeting each of the above criteria:
1. I can't recreate the URLs in question and serve generic 404/410 content from them, because part of each URL is made up of the database record's ID field. That field has an AutoNumber datatype, so I can't manually enter the deleted IDs to recreate the URLs.
2. I am not keen on blocking with robots.txt because the URLs are sequentially numbered and, as I understand it, if I block http://www.example.com/news/news_details.asp?id=7 then any URL beginning with that string will also be blocked going forward, e.g. ?id=70, ?id=700 and so on, and I don't want those blocked.
3. I don't think I can block the content with a meta noindex tag, for the same reasons given in point 1.
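
For reference, this is roughly what I understand I'd be adding in each of those last two cases; example.com and the id value are just the placeholders from my example above:

```
# robots.txt: the rule I'd add for one deleted record
User-agent: *
Disallow: /news/news_details.asp?id=7
```

and for option 3, each affected page would need a tag like this in its `<head>`, which is exactly what I can't do if I can't recreate the page:

```html
<meta name="robots" content="noindex">
```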
I would appreciate guidance with this, as I don't want the website's ranking to suffer in any way due to these erroneous URLs.
Many thanks in advance.