I am running an Apache (1.13) httpd server on a Unix
box. The system is constantly being probed by various
annoying robots that ignore robots.txt. I would like a
Perl CGI script that tarpits those robots by limiting the
bandwidth of the response to some small value.
In other words, here's a little bit of data, Mr. Bot (not necessarily what you requested, either), stall, stall, here's
a little bit more, stall, stall ... and let's see how long we
can keep you teergrubed here with this tempting big file.
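Something like this minimal drip-feed loop is what I have in mind (an untested sketch; the chunk size, delay, and loop count are arbitrary placeholders, not requirements):

```perl
#!/usr/bin/perl -w
# tarpit.cgi -- drip a "big file" at a trickle (untested sketch)
use strict;

$| = 1;                                  # flush each chunk immediately
print "Content-Type: text/plain\r\n\r\n";

my $chunk = "x" x 16;                    # a few bytes of bait per drip
for (1 .. 360) {                         # up to an hour on the hook
    print $chunk or last;                # bail out if the write fails
    sleep 10;                            # stall, stall ...
}
```

My worry is exactly the failure path: as I understand it, a print to a socket the remote end has closed normally raises SIGPIPE and kills the script outright rather than returning false, so the `or last` alone probably isn't enough.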
It would be entertaining, though certainly not necessary,
if the script tracked how long it was able to keep a bot
on the hook, and kept a Top 10 list of same.
I have looked through the CPAN archives and can't find
anything that quite fits, and I don't trust my own Perl
skills enough to handle exception conditions such as an
unexpected remote disconnect during transfer.
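For the disconnect case, my rough understanding is that a SIGPIPE handler plus a timestamp would cover both the cleanup and the scorekeeping, something like this (untested; the log path is a made-up placeholder):

```perl
#!/usr/bin/perl -w
# Record how long a bot stayed hooked, even on abrupt disconnect (sketch)
use strict;

my $start = time;
my $addr  = $ENV{REMOTE_ADDR} || 'unknown';

sub log_catch {
    my $held = time - $start;            # seconds the bot was on the hook
    # /tmp/tarpit.log is a placeholder path, not a convention
    if (open my $log, '>>', '/tmp/tarpit.log') {
        print $log "$addr $held\n";
        close $log;
    }
    exit 0;
}

$SIG{PIPE} = \&log_catch;    # remote end hung up mid-transfer
$SIG{TERM} = \&log_catch;    # Apache killing the CGI also counts
```

The Top 10 list would then just be a sort over that log (e.g. `sort -rn -k2 /tmp/tarpit.log | head -10`), but I'd appreciate corrections if the signal handling above is naive.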