• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 145

Non-existent pages still getting spidered


A while back I created a dynamic page template which generated lots of pages from entries in a DB.

I added the page addresses to my Google sitemap and uploaded it.

I then decided to change the URLs of these pages and again uploaded my sitemap with the new URLs in it, making sure I removed the old ones.

However, the old URLs are still being hit by Googlebot and other spiders.

I have an email error report set up to tell me when errors occur on my site or a page cannot be found, and whenever a spider tries to access the no-longer-existent pages I get an error report.

There are no links to these old URLs anywhere on my site, and my Google sitemap does not contain any reference to them.

The main IP address (but there are others) is and I understand that is a Googlebot.

How can they still be getting spidered?

1 Solution
Do not worry.

If nothing is linking to the pages, they will eventually be dropped from the index automatically.

Alternatively, you can block them in your robots.txt.
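As a sketch of that approach, assuming the old dynamic pages all lived under a common path such as /old-pages/ (a hypothetical path; substitute your real one), a robots.txt file at the site root could tell compliant crawlers to skip them:

```
User-agent: *
Disallow: /old-pages/
```

Note this only stops well-behaved crawlers from fetching the URLs; it does not by itself remove already-indexed URLs from search results.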

naifyboy123Author Commented:
Hi ashishjvw,

Any idea how long I have to wait? It has been weeks already.
Hey !

If you wish to remove them, then please use robots.txt.
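If you go the robots.txt route, you can sanity-check your rules before uploading with Python's standard-library robots.txt parser. A minimal sketch, assuming the hypothetical paths /old-pages/ (blocked) and /new-pages/ (allowed) and the placeholder domain example.com:

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking the retired dynamic URLs
rules = """User-agent: *
Disallow: /old-pages/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot should be denied the old path but allowed the new one
print(rp.can_fetch("Googlebot", "http://example.com/old-pages/item1.html"))
print(rp.can_fetch("Googlebot", "http://example.com/new-pages/item1.html"))
```

This checks only what the rules say; whether a given crawler honors them is up to the crawler.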

I have some files in Google's index that were removed months ago and are still showing up.

naifyboy123Author Commented:
OK, thanks.
