I understand that the robots.txt file lets people prevent bots from accessing certain pages on a site, but I don't want to keep bots off my pages. In fact, I want them to visit all my pages and see all the reciprocal links I use on one of my sites... just NOT actually click them as though they were a user.
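For reference, this is the kind of robots.txt rule I mean; it keeps a bot off a path entirely, which is the opposite of what I'm after (the path is just a placeholder):

```
# Blocks all crawlers from /private/ completely -- too blunt for my case,
# since I still want bots to *see* the pages and the links on them.
User-agent: *
Disallow: /private/
```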
Is there a way to prevent a bot from pretending to be a user and clicking through a text link on my website?
I have, for example, text links like "sponsored by abc company", where "abc company" is a hyperlink to abc company's website. I also have a text hyperlink that says CLICK HERE IF YOU WANT TO REPORT THIS USER on the blog section of my site. Every day I get at least 6 emails from the script that runs when someone clicks the REPORT THIS USER link. I don't want to turn it into a form where I validate a human (CAPTCHA) before the report goes through, because nobody will take the time.
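For context, the REPORT THIS USER link is just a plain GET link, so anything that fetches the URL fires the email. A rough sketch of the setup, not my exact code (the path and the sendReportEmail helper are placeholders):

```ts
import { createServer } from "node:http";

// Hypothetical stand-in for the real mail-sending script.
function sendReportEmail(userId: string): void {
  console.log(`would email a report about user ${userId}`);
}

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  // The link is a bare GET, so any crawler that follows it sends an email.
  if (url.pathname === "/report-user") {
    sendReportEmail(url.searchParams.get("id") ?? "unknown");
    res.end("Thanks, the report was sent.");
    return;
  }
  res.end("OK");
}).listen(8080);
```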
In addition, when these bots click my SPONSORED BY links, it skews the clickthrough statistics I'm tracking.
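Those links go through a redirect that bumps a counter before sending the visitor on, so a bot request counts the same as a real click. Roughly like this (a sketch with a hypothetical in-memory counter standing in for my real tracking):

```ts
import { createServer } from "node:http";

// Hypothetical in-memory clickthrough counter, keyed by sponsor id.
const clicks = new Map<string, number>();

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname === "/out") {
    const sponsor = url.searchParams.get("sponsor") ?? "unknown";
    clicks.set(sponsor, (clicks.get(sponsor) ?? 0) + 1);
    // 302 over to the sponsor's site; a bot following the link inflates the count.
    res.writeHead(302, { Location: "https://example.com/" });
    res.end();
    return;
  }
  res.end("OK");
}).listen(8081);
```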
I've found an interesting HTTP_REFERER snippet that someone said they use, but I don't know how to actually put it to work.
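My understanding of the idea is that you check the Referer header (HTTP_REFERER is the CGI variable name) and ignore requests that didn't arrive from one of your own pages, since browsers normally send it and many bots don't. A sketch of that check as I understand it (my-site.example is a placeholder for my domain, and I know the header can be spoofed or stripped):

```ts
import { createServer } from "node:http";

createServer((req, res) => {
  // Browsers usually send the page the click came from; many bots don't.
  // Note this header is easy to spoof, so it's a filter, not a guarantee.
  const referer = req.headers.referer ?? "";
  if (!referer.startsWith("https://my-site.example/")) {
    res.writeHead(403);
    res.end("No valid referer, ignoring the click.");
    return;
  }
  res.end("Looks like a click from one of my own pages.");
}).listen(8082);
```

Is that the right way to use it, and would it actually stop the bot clicks described above?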