Ok guys, here's the problem: I know there are lots of spider programs out there that visit your website, open every page, and scrape your content, and a hacker who wants your content can use one of these programs to grab it with no trouble.
So how can we stop this? When these programs scrape a site they open one page after another, so maybe we could add a function that checks the rate at which a visitor opens pages; say, more than 1 page per 5 seconds is treated as a spider, and that visitor gets redirected to a page where they have to pass an image-based CAPTCHA. The point is: what is the best way to stop this without hurting performance, while still letting legitimate crawlers like Google through (so the site stays indexed) and blocking the unwanted ones?
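To make the idea concrete, here is a minimal sketch in Python of the two pieces: a sliding-window rate check per IP (the 5-second window and request cap are assumed thresholds, tune them for your site), plus the DNS check Google actually documents for telling real Googlebot apart from a scraper faking its User-Agent (reverse-resolve the IP, check the hostname is under googlebot.com or google.com, then forward-resolve it back).

```python
import socket
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 5.0   # look-back window (assumed threshold)
MAX_REQUESTS = 5       # requests allowed per window per client (assumed)

_hits = defaultdict(deque)  # client IP -> timestamps of recent requests

def is_suspicious(ip, now=None):
    """Return True if this IP exceeded the allowed request rate."""
    now = time.monotonic() if now is None else now
    q = _hits[ip]
    q.append(now)
    # Drop timestamps that have fallen out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_REQUESTS

def is_verified_googlebot(ip):
    """Verify a claimed Googlebot via reverse + forward DNS:
    the PTR name must end in googlebot.com or google.com and
    must resolve back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

On each request you would first whitelist verified search-engine bots, then apply `is_suspicious` to everyone else and redirect offenders to the CAPTCHA page. Note this only stops naive scrapers: a determined one can slow down, rotate IPs, or solve the CAPTCHA, so treat rate limiting as friction rather than a hard wall.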
What do you guys say?