macuser777
asked on
Robots.txt to stop just one link being spidered?
Hi,
Is there a way to wrap some kind of robots.txt code around just one link on one page to stop it being spidered by Google et al., or some other way of doing the same?
I had a client's link on my site (it's a really good demo for my software), but he wanted it removed because it showed up in Google as being on my site.
<a href="http://www.this-site-i-don't-want-spidered.co.uk">this-site</a><br>
thanks
ASKER
<!-- robots content="nofollow" --><a href="http://www.this-site-i-don't-want-spidered.co.uk/"> Do not follow this</a><!-- /robots -->
hi,
pardon my ignorance, but i thought <!-- something --> turned the code into a comment and effectively disabled it. does the term 'robots' have an exemption from this?
<a href="..." rel="nofollow">Do not follow link</a> looks nice and simple anyhow - thx, i'll use that when i get home. but i'd like to understand the <!-- ... --> thing.
macuser
<!--#include file .... --> will mean something to a server with server-side includes enabled.
There are many HTML comments that are read by something.
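As a small illustration (not from the thread), Python's standard-library html.parser shows that comments are delivered to whatever is parsing the page; they are only "invisible" to the browser's renderer, not to crawlers or server-side processors:

```python
from html.parser import HTMLParser

class CommentReader(HTMLParser):
    """Collects the text of every HTML comment it encounters."""
    def __init__(self):
        super().__init__()
        self.comments = []

    def handle_comment(self, data):
        # Called once per <!-- ... --> comment, with the inner text.
        self.comments.append(data.strip())

p = CommentReader()
p.feed('<!-- robots content="nofollow" --><a href="/x">link</a><!-- /robots -->')
print(p.comments)  # ['robots content="nofollow"', '/robots']
```

Any crawler or preprocessor built on a parser like this can choose to act on comment contents, which is exactly how SSI directives work.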
ASKER
Thank you for that. Good to know.
macuser
macuser
In the robots.txt of the site being linked to, you can have
User-agent: *
Disallow: /
and in the page that links to it you can have
<meta name="robots" content="noindex,nofollow">
or have this around the link
<!-- robots content="nofollow" --><a href="http://www.this-site-i-don't-want-spidered.co.uk/"> Do not follow this</a><!-- /robots -->
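If you want to confirm how a well-behaved crawler would interpret that two-line robots.txt, a quick sketch with Python's standard-library urllib.robotparser (example.com is just a placeholder URL):

```python
from urllib import robotparser

# The two lines suggested above: block every path for every user agent.
rules = "User-agent: *\nDisallow: /\n"

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler may not fetch any page on the site.
print(rp.can_fetch("*", "http://example.com/any-page"))  # False
```

Note this only works if you control the linked site's robots.txt; the rel="nofollow" attribute and the robots meta tag are the options available on your own page.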
Michel