robots.txt

Could someone kindly post a sample robots.txt file that would allow all the spiders? And then, just as a sample, a second file that would exclude one particular page named, say, xyz.html.

thanks
— linque
duz commented:
Hi linque -

The following allows all robots to visit all files: the wildcard "*" matches every robot, and an empty Disallow value excludes nothing.

User-agent: *
Disallow:

This one bars all robots from the images directory.

User-agent: *
Disallow: /images/

This one bars Googlebot from getting at the xyz.html file. Note that Disallow paths must begin with a slash.

User-agent: googlebot
Disallow: /xyz.html
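If you want a quick sanity check of rules like the one above, here is a minimal sketch using Python's standard urllib.robotparser (the example.com host is just a placeholder; the rule uses /xyz.html since Disallow paths must start with a slash):

```python
from urllib.robotparser import RobotFileParser

# Rules matching the third example above.
rules = """User-agent: googlebot
Disallow: /xyz.html
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from xyz.html but may fetch other pages.
print(rp.can_fetch("googlebot", "http://example.com/xyz.html"))    # False
print(rp.can_fetch("googlebot", "http://example.com/other.html"))  # True

# Other crawlers have no matching User-agent section, so the default is allow.
print(rp.can_fetch("otherbot", "http://example.com/xyz.html"))     # True
```

This is handy for checking a robots.txt file before deploying it, since a typo (like a missing slash) can silently change what gets blocked.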

- duz
 
linque (author) commented:
THANK YOU DUZ!
Question has a verified solution.