Robots.txt to allow only Google and SharePoint crawls

I have a robots.txt file in one of my SharePoint sites.
I want it to allow only Google and SharePoint itself to crawl the site.

I have

User-agent: google

at the top and

User-agent: *
Disallow: /

at the bottom.
The bottom part apparently stops the SharePoint crawl along with everything else.
Does anybody know how to allow only Google and SharePoint in a robots.txt file?

Kind regards
Andrei Teodorescu (Business Owner) commented:
Try it this way:

User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 5.0 Robot)
Allow: /
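
If you want to sanity-check the file before deploying it, Python's standard-library urllib.robotparser can parse it in memory. This is a minimal sketch, not part of the original answer; note that robotparser's user-agent matching is a simple case-insensitive substring test, so it is a fair proxy for the Googlebot and wildcard groups but will not match the long Mozilla-style SharePoint string the way SharePoint's own crawler does.

# A minimal sketch: parse the proposed robots.txt in memory and check
# which user agents may fetch the site root. Standard library only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 5.0 Robot)
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group and is allowed.
print(parser.can_fetch("Googlebot", "/"))     # True
# An unlisted crawler falls back to the wildcard group and is blocked.
print(parser.can_fetch("SomeOtherBot", "/"))  # False

The SharePoint group is included for completeness; verify that part against an actual SharePoint crawl rather than against this parser.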
Andrei Teodorescu (Business Owner) commented:
First of all, you need to update some registry entries:

Then use the newly created user-agent string in your robots.txt file.
WTFISTHIS (Author) commented:
Thanks for your response.

I just want to know what the user agent for SharePoint 2007 is. Is it "MS Search 5.0 Robot"?

It looks like SharePoint is not crawling the sites when I use my current robots.txt file (the one in my question). All I want to do is allow SharePoint and Google and disallow everything else.
