macuser777 (United Kingdom) asked:

Robots.txt to stop just one link being spidered?

Hi,

Is there a way to wrap some kind of robots.txt code around just one link on one page to stop it being spidered by Google et al., or some other way of doing the same?
I have a client's link on one of my pages; it's a really good demo of my software, but he wanted it removed because his site showed up in Google as being linked from mine.

<a href="http://www.this-site-i-don't-want-spidered.co.uk">this-site</a><br>

thanks
Michel Plungjan (Denmark) replied:

In "http://www.this-site-i-don't-want-spidered.co.uk/robots.txt you can have
User-agent: *
Disallow: /
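
A narrower rule is also possible if only part of that site should be blocked. For instance, assuming a hypothetical /demo/ path for the page in question:

User-agent: *
Disallow: /demo/

Compliant crawlers will then skip any URL whose path starts with /demo/ but still crawl the rest of the site.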

and in the page that links to it you can have

<meta name="robots" content="noindex,nofollow">
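
Note that a meta robots tag applies to the whole linking page, not to one link: noindex drops the linking page itself from the index, and nofollow stops crawlers from following every link on it. If the page should stay indexed and only its outgoing links need suppressing, a minimal sketch would be:

<meta name="robots" content="nofollow">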

or have this around the link

  <!-- robots content="nofollow" --><a href="http://www.this-site-i-don't-want-spidered.co.uk/"> Do not follow this</a><!-- /robots -->


Michel
ASKER CERTIFIED SOLUTION
Michel Plungjan
macuser777 (Asker) replied:

<!-- robots content="nofollow" --><a href="http://www.this-site-i-don't-want-spidered.co.uk/"> Do not follow this</a><!-- /robots -->

hi,

Pardon my ignorance, but I thought <!-- something --> turned the code into a comment and effectively disabled it. Does the word 'robots' make it an exception?

<a href="..." rel="nofollow">Do not follow link</a> - looks nice and simple anyhow - thanks, I'll use that when I get home. But I'd like to understand the <!-- ... --> thing.

macuser
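
For reference, the rel="nofollow" approach mentioned above, applied to the link from the question (a minimal sketch, reusing the example URL), would be:

<a href="http://www.this-site-i-don't-want-spidered.co.uk" rel="nofollow">this-site</a>

Google announced support for rel="nofollow" in 2005; a link marked this way is not treated as an endorsement and passes no ranking credit to the target site.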
Michel Plungjan:

<!--#include file .... --> will mean something to a server with server-side includes enabled.

There are many HTML comments that are read by something.
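
For example, on a server with SSI enabled (the filename here is hypothetical), this comment is executed by the server before the page is sent, rather than being ignored:

<!--#include virtual="/footer.html" -->

The robots comment wrapper shown earlier works on the same principle: browsers treat it as a plain comment, but a crawler that recognises the convention can act on it.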
macuser777 (Asker):

Thank you for that. Good to know.

macuser