  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 257

How to prevent search engines from finding my sites? Robots.txt not working!

I have a number of websites on a development server that are showing up in Google searches.

I added robots.txt files (see code below) to the web root of each site months ago, but the sites are still appearing in Google results.

Is there any way I can hide these sites using Apache? Firewalls? Something else?

FYI - All sites are in a folder path like this: /var/www/mysites/...

Any ideas?

# Robots.txt
User-agent: *
Disallow: /


Asked by: bearclaws75
1 Solution
 
caterham_www commented:
It could take several months for the sites to drop out of a search engine's index.

> but the sites are still appearing in Google results.

Are they still being spidered? Or does Googlebot, for example, request your robots.txt (lower-case r) but not any subsequent files?

You may also submit a removal request to speed things up: http://www.google.com/webmasters/tools/removals

ref: http://www.google.com/support/webmasters/bin/answer.py?answer=61062&ctx=sibling
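As for the Apache side of the question, here is a minimal sketch of how you could keep the dev sites out of the index at the server level. It assumes Apache 2.4 with mod_headers enabled; the allowed IP range is a placeholder, and the directory path is taken from the question:

# In each dev site's vhost, or in a shared <Directory> block:
<Directory "/var/www/mysites">
    # Ask crawlers not to index anything served from here,
    # even if they reach a page via an external link.
    Header set X-Robots-Tag "noindex, nofollow"

    # Optionally, hide the sites entirely by allowing only your
    # own network (placeholder address range).
    Require ip 203.0.113.0/24
</Directory>

Note that if you lock the sites down with Require ip, Googlebot will receive a 403 instead of seeing the noindex header; either signal eventually gets the pages dropped, but the removal request above is still the fastest route. On Apache 2.2 you would use Order/Allow/Deny instead of Require.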
 
bearclaws75 (Author) commented:
We added the robots.txt a few weeks after the pages were originally published, so it is likely the pages have not been re-cached.

To be safe, I submitted a removal request using Webmaster Tools.

Thanks!