Solved

How to prevent search engines from finding my sites? Robots.txt not working!

Posted on 2009-03-31
203 Views
Last Modified: 2012-05-06
I have a number of websites on a development server that are showing up in Google searches.

I added robots.txt files (see code below) to the web root of each site months ago, but the sites are still appearing in Google results.

Is there any way I can hide these sites using Apache, a firewall, or something else?

FYI - All sites are in a folder path like this: /var/www/mysites/...

Any ideas?

# Robots.txt

User-agent: *
Disallow: /


Question by:bearclaws75
2 Comments
 
LVL 27

Accepted Solution

by:
caterham_www earned 250 total points
ID: 24033106
It could take several months until the sites drop out of a search engine's index.

> but the sites are still appearing in Google results.

and they're still spidered? Or does, e.g., googlebot request your robots.txt (lower-case r) but not any subsequent files?
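One quick sanity check is to confirm the posted rules really do disallow everything. A sketch using Python's standard-library robots.txt parser; it parses the rules directly, so no network access is needed:

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules as the robots.txt posted above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every path should be disallowed for every crawler, Googlebot included.
print(rp.can_fetch("Googlebot", "/"))            # False
print(rp.can_fetch("Googlebot", "/index.html"))  # False
```

If the rules parse as a full disallow but the bot never appears to fetch them, check the access log for requests to /robots.txt (exact lower-case name) — a file saved as Robots.txt would not be found on a case-sensitive filesystem.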

You may also submit a removal request to speed things up: http://www.google.com/webmasters/tools/removals

ref: http://www.google.com/support/webmasters/bin/answer.py?answer=61062&ctx=sibling
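On the Apache side of the question: robots.txt only asks crawlers to stay away, while a development server can be hidden outright by denying access. A sketch assuming Apache 2.2 (current as of this question) with mod_authz_host and mod_headers enabled; the address range is a placeholder for your own network:

```
<Directory /var/www/mysites>
    # Allow only your internal network; everyone else
    # (crawlers included) gets a 403.
    Order deny,allow
    Deny from all
    Allow from 192.168.0.0/16
</Directory>

# Belt and braces: tell engines not to index anything that does
# get through (requires mod_headers).
Header set X-Robots-Tag "noindex, nofollow"
```

Unlike robots.txt, a 403 also stops already-indexed pages from being re-crawled, and the X-Robots-Tag header asks Google to drop pages it can still reach.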
 

Author Comment

by:bearclaws75
ID: 24090788
We added the robots.txt a few weeks after the pages were originally published, so it is likely that the pages have not been re-cached.

To be safe, I submitted a removal request using webmaster tools.

Thanks!


