A while back I created a dynamic page template which generated lots of pages from entries in a database.
I added the page addresses to my Google sitemap and uploaded it.
I then decided to change the URLs of these pages and uploaded my sitemap again with the new URLs in it, making sure I removed the old ones.
However, the old URLs are still being targeted by Googlebot and other spiders.
I have an email error report set up to tell me whenever an error occurs on my site or a page cannot be found, and every time a spider tries to access one of the no-longer-existent pages I get an error report.
There are no links to these old URLs anywhere on my site, and my Google sitemap contains no reference to them.
The main IP address (though there are others) is 126.96.36.199, which I understand is a Googlebot.
How can these old pages still be getting spidered?
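For what it's worth, the usual way to check whether an IP really belongs to Googlebot is a reverse DNS lookup followed by a forward lookup confirming the hostname maps back to the same IP. A minimal sketch of that check (the function names here are just my own, and it needs network access to actually run):

```python
import socket

def is_google_host(hostname: str) -> bool:
    """Genuine Googlebot hosts resolve to *.googlebot.com or *.google.com."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, check the domain, then forward-resolve
    the hostname and confirm it maps back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False  # no reverse DNS record at all
    if not is_google_host(hostname):
        return False  # reverse DNS points somewhere other than Google
    try:
        # forward-confirm: the hostname must resolve back to the same IP
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

If `verify_googlebot("126.96.36.199")` comes back False, the requests are coming from something merely claiming to be Googlebot rather than from Google itself.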