itcdr (United States of America)

asked on

What to do about automated queries against your site?

We used to have RSS feeds on our site, but they were discontinued months ago. Unfortunately, automated RSS readers still send requests every couple of seconds. All they're getting back is a 404 Not Found, but the traffic continues, and it's creating tons of errors in our logs.

Is there any way to block requests for certain URLs?
Smart_Man (Egypt)

They are already blocked, I guess, since they get the 404. You might prefer to redirect them, or to find a better way to organize or filter your logs, as long as they are not attack traffic (DoS).

Otherwise, please explain more what you mean by blocking them. Do you want to redirect them to another page, or something else? And wouldn't the act of blocking them produce a log entry itself?

Waiting for your reply.
Steve Bink (United States of America)

Redirect them to a valid small HTML page; this will keep them out of the error log.

<html><body>RSS discontinued</body></html>

Or you could take the IP addresses that keep hitting it and ignore those requests completely, letting the clients time out.

Or, if you want to annoy them, write a PHP script that returns an RSS entry of "The current time of day is..." on every request. When users see 700 new entries reading "The current time is...", they will probably delete the URL from their reader.
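The first option above can be sketched as an Apache mod_rewrite rule. This is only a sketch: the old feed path /rss/ and the notice page name /rss-gone.html are assumptions, not details from the thread.

```apache
# Sketch: send any request for the old feed paths to a tiny static page.
# /rss-gone.html would contain: <html><body>RSS discontinued</body></html>
RewriteEngine On
# R=301 tells well-behaved readers the move is permanent; L stops rule processing.
RewriteRule ^rss/ /rss-gone.html [R=301,L]
```

Since the page exists, the requests show up as ordinary 301/200 entries in the access log instead of 404 errors in error_log.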
itcdr
@routinet, thanks. That is pretty much what I was looking for: a way to block the URL without sending our error page or writing to error_logs, so the minimal amount of information is sent.

^/rss/(.*)$ [L]
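For anyone finding this later, the fragment above fits into a complete mod_rewrite rule along these lines. The rule target and the [G] flag are my assumptions, not part of the original answer:

```apache
# .htaccess sketch: answer old feed requests with "410 Gone" and no custom
# error page, keeping the response minimal.
# Note: in .htaccess, the pattern has no leading slash; in the main server
# config it would be ^/rss/(.*)$ as in the original fragment.
RewriteEngine On
# "-" means no substitution; [G] forces a 410 Gone response.
RewriteRule ^rss/(.*)$ - [G,L]
```

410 Gone signals that the resource was removed on purpose, and because Apache no longer tries to serve a missing file, the "File does not exist" entries should stop appearing in error_log.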