Restrict bots from crawling directories

Google has indexed one of my directories and is showing it in search results. To stop the directory and its subdirectories from being indexed, I was thinking of restricting bots from accessing the directory. How can I do that?
sahanzAsked:
stermeau Commented:
You can use a robots.txt file.
There are some simple examples here : http://www.robotstxt.org/orig.html

But you should also modify your web server configuration to disable directory listing.
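For example, on Apache you can turn off the auto-generated directory listing with the `Options` directive, either in the main server configuration or in an `.htaccess` file inside the directory (a minimal sketch, assuming `AllowOverride Options` is permitted for that directory):

```
# .htaccess in the directory you want to protect
# Disables Apache's auto-generated file listing for this
# directory and its subdirectories
Options -Indexes
```

With listing disabled, a request for the bare directory URL returns 403 Forbidden instead of an index of its files.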
 
marektech Commented:
You could also use the following meta tag in the pages you want excluded:

<meta name="robots" content="noindex,nofollow">

http://www.heritage-tech.net/188/alternative-to-using-robotstxt/
 
marektech Commented:
More information about the robots meta tag:

http://www.robotstxt.org/meta.html
 
sahanz (Author) Commented:
If I add those lines to the index file of the directory, will it also stop crawling of the subdirectories?
 
marektech Commented:
You can place a robots.txt file at the root of your website and specify the directories that should be off limits. A Disallow rule ending in a trailing slash covers that directory and everything beneath it, so you don't need a rule per subdirectory.

For example:

User-agent: Googlebot
Disallow: /private/private.htm
Disallow: /secret/

Or, via the meta tag method, the tag must be present on every page that should not be indexed:

<meta name="robots" content="noindex,nofollow">
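A meta tag only works for HTML pages. If the directory also holds non-HTML files (PDFs, images), Google also honours the same directives sent as an X-Robots-Tag HTTP response header. A minimal sketch for Apache, assuming mod_headers is enabled and a hypothetical directory path `/var/www/html/secret` (adjust to your setup):

```
# Server config (or vhost): send a noindex/nofollow directive
# for every response served from this directory, including
# non-HTML files such as PDFs and images
<Directory "/var/www/html/secret">
    Header set X-Robots-Tag "noindex, nofollow"
</Directory>
```

Unlike robots.txt, this doesn't block crawling; it tells the crawler not to index what it fetches, which is what actually removes pages from search results.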
 
sahanz (Author) Commented:
Thanks
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

All Courses

From novice to tech pro — start learning today.