Restrict from search engines?

Is there a way, meta tags or otherwise, to keep the search engines from indexing a directory on my site?

Thank you in advance for your help and superlative brain power!

-- Scott
scottb50 asked:
humeniuk commented:
The most effective way of doing this is with a robots.txt file - see www.robotstxt.org.

You can also use a noindex meta tag (www.robotstxt.org/meta.html), but it is not quite as reliable as robots.txt.
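For completeness, the meta-tag version goes in the <head> of each page you want kept out of the index:

```html
<!-- Ask compliant robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that, like robots.txt, this only works for robots that choose to honor it.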
 
scottb50 (Author) commented:
Just by answering, you officially rock.
 
Bernard S. (CTO) commented:
Note that there is no foolproof way to prevent all robots from spidering a directory. A robots.txt file tells "clean" robots not to spider the directory... but it can also act as a red flag for more inquisitive robots.
--> Use exclusion via robots.txt: this tells well-behaved robots (which includes the most important ones) not to index the directory, so they don't waste time on it.
--> Do NOT rely on this to protect a directory. If the directory's content actually needs protecting, use .htaccess to restrict access; that keeps out both badly-behaved (but well-intentioned) robots and bad-intentioned spiders.
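A sketch of the .htaccess approach Bernard mentions, assuming an Apache server (the directory name is up to you): place a file named .htaccess inside the directory you want to restrict.

```apache
# .htaccess inside the protected directory (Apache)
# Apache 2.4+ syntax:
Require all denied

# Apache 2.2 equivalent (use instead of the line above on 2.2):
# Order allow,deny
# Deny from all
```

This blocks all HTTP access outright; if you need some visitors to get in, you would use Apache's Basic authentication instead of a blanket deny.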
 
Server_Jockey commented:
Hi,

1. Create a file named robots.txt
2. Place the following inside the file

User-agent: *
Disallow: /

3. Save the file and upload it to the root of your website.

The two lines above deny all spiders and crawlers access to your entire website; to exclude only one directory, adjust the Disallow line to name that directory.
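Since the question is about a single directory rather than the whole site, a narrower robots.txt (assuming the directory is called /private/ — substitute your actual path) would be:

```
User-agent: *
Disallow: /private/
```

The trailing slash limits the rule to that directory and everything under it.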

Good Luck,

Server Jockey

 
Bernard S. (CTO) commented:
As an additional safety measure against curiosity, check that every directory you want protected has an index file (index.htm, index.php, or similar).
At minimum, an empty index.htm file prevents HTTP/HTML browsing of the directory listing and its files.
Smarter is an index.htm that redirects to your home page using both a meta refresh AND a JavaScript redirect.
Of course, if the directory already has an index.* file, this cannot be done directly... but you could rename the existing index.htm (remember, it is served by default when someone visits the directory without specifying a page name) to some difficult-to-guess name (e.g., xedni.htm) AND put a protective index.htm in its place.

This will NOT protect you against direct access to a file in the directory (e.g., a link from a "normal" page to a page in that directory), BUT at least it prevents inquisitive, badly-behaved robots from listing every file in a directory that robots.txt has flagged as a "secret directory".
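A sketch of the protective index.htm described above, with both a meta refresh and a JavaScript redirect to the home page (the /index.html target is a placeholder — point it at your real home page):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Meta refresh: send the visitor to the home page after 0 seconds -->
  <meta http-equiv="refresh" content="0; url=/index.html">
  <script type="text/javascript">
    // JavaScript redirect for clients that ignore the meta refresh;
    // replace() keeps this page out of the browser history.
    window.location.replace("/index.html");
  </script>
</head>
<body>
  <!-- Plain link as a last resort for clients with neither mechanism -->
  <a href="/index.html">Continue to the home page</a>
</body>
</html>
```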
 
Bernard S. (CTO) commented:
Scottb,

Remember to close the question by allocating the points to one or several of the experts who answered.
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

All Courses

From novice to tech pro — start learning today.