Solved

Robots.txt just to allow google and sharepoint crawls

Posted on 2009-07-05
2,119 Views
Last Modified: 2012-05-07
I have a robots.txt file in one of my SharePoint sites.
I want it to allow only Google and SharePoint itself to crawl this site.

I have

User-agent: google

at the top and

User-agent: *
Disallow: /

at the bottom.
The last part apparently stops the SharePoint crawl as well as the other
crawlers.
Does anybody know how to allow only
Google and SharePoint in a robots.txt file?

Kind regards
Question by:WTFISTHIS
4 Comments
 
LVL 11

Expert Comment

by:Andrei Teodorescu
ID: 24793725
First of all, you need to update some registry entries:
http://sharepoint.microsoft.com/blogs/LKuhn/Lists/Posts/Post.aspx?List=29310d0a%2D1eda%2D4834%2Dbb4c%2D06ee575a40c3&ID=49

Then use the newly created user-agent string in your robots.txt file.
 

Author Comment

by:WTFISTHIS
ID: 24799026
Thanks for your response.

I just want to know what the user agent for SharePoint 2007 is.
Is it "MS Search 5.0 Robot"?

It looks like SharePoint is not crawling the sites when I
use my current robots.txt file (mentioned in my question).
So all I want to do is allow SharePoint and Google and disallow everything else.
 
LVL 11

Accepted Solution

by:
Andrei Teodorescu earned 250 total points
ID: 24800931
Try it this way (a crawler uses the most specific matching group, so the named groups below override the wildcard group regardless of order):

User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: MS Search 5.0 Robot
Allow: /

Note that a robots.txt User-agent line takes a crawler's product token, not its full browser-style string, hence "MS Search 5.0 Robot" rather than the whole "Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 5.0 Robot)" value. "Allow" is a nonstandard extension, but both Google and the SharePoint crawler honor it.
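If you want to sanity-check rules like these before deploying, Python's standard-library robots.txt parser can simulate which crawlers get through. This is just a sketch: it uses the short "MS Search 5.0 Robot" token (robots.txt groups match product tokens, not full browser strings), and "SomeOtherBot" and the sample URL are made-up names standing in for any other crawler.

```python
# Sanity-check a "block everyone except Google and SharePoint" robots.txt
# with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: MS Search 5.0 Robot
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The two named crawlers match their own groups and may fetch anything;
# every other agent falls through to the wildcard group and is blocked.
print(parser.can_fetch("Googlebot", "/default.aspx"))            # True
print(parser.can_fetch("MS Search 5.0 Robot", "/default.aspx"))  # True
print(parser.can_fetch("SomeOtherBot", "/default.aspx"))         # False
```

Keep in mind the parser matches user agents the way well-behaved crawlers do (by token); a real request's full UA string may differ from the token the crawler uses for robots.txt matching.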

