Non-existent pages still getting spidered

Hi,

A while back I created a dynamic page template which generated lots of pages from entries in a DB.

I added the page addresses to my Google sitemap and uploaded it.

I then decided to change the URLs of these pages and uploaded my sitemap again with the new URLs in it, making sure I removed the old ones.
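
For illustration, the entries in the new sitemap look something like this (these URLs are just placeholders, not my real ones):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/new-page-1.html</loc>
  </url>
  <url>
    <loc>http://www.example.com/products/new-page-2.html</loc>
  </url>
</urlset>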

However, the old URLs are still being targeted by Googlebot and other spiders.

I have an email error report set up to tell me when errors occur on my site or a page cannot be found, and whenever a spider tries to access the no-longer-existent pages I get an error report.

There are no links to these old URLs anywhere on my site, and my Google sitemap does not contain any reference to them.

The main IP address (though there are others) is 66.249.66.162, which I understand belongs to Googlebot.
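
As far as I know, the way to confirm that is a reverse DNS lookup - a genuine Googlebot address should resolve to a googlebot.com hostname (and that hostname should resolve back to the same IP). Something like:

host 66.249.66.162

which for a real Googlebot should return a name along the lines of crawl-66-249-66-162.googlebot.com.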

How can these pages still be getting spidered?

Thanks,
naifyboy123
 
ashishjvw commented:
Do not worry.

If nothing is linking to the pages, they will be removed automatically.

Alternatively, you can block them in your robots.txt.
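
For example, a minimal robots.txt along these lines (assuming the old pages lived under a path like /oldpages/ - change this to match your actual old URL pattern):

User-agent: *
Disallow: /oldpages/

Or, to block individual old URLs:

User-agent: *
Disallow: /old-page-1.html
Disallow: /old-page-2.html

Put the file at the root of your site (e.g. http://www.yoursite.com/robots.txt).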

Regards,
ASHISH T.
 
naifyboy123 (Author) commented:
Hi ashishjvw,

Any idea how long I have to wait? It has been weeks already.
 
ashishjvw commented:
Hey!

If you want them removed, then use robots.txt.

I have some files that were removed from my site months ago and they still show up in Google.

ASHISH
 
naifyboy123 (Author) commented:
OK, thanks.