style-sheets

asked on

How to force one thread per download

Hi,

I have a PHP script that serves files dynamically.

Is there a reliable way to force anyone who calls my script (i.e. anyone downloading a served file) to use only one thread? I'm thinking especially of FlashGet and other download managers that open multiple threads per download.

Thanks!
Ray Paseur

You might try putting a cookie on their browser.  Use an expiration of zero, so the cookie goes away when they close the browser.  If you find the cookie in the request, you could issue a message or die or something like that.

Of course for this to work, the downloading application would need to act like a well-behaved browser.  Hackers probably will not do that.

You could also require some kind of login before permitting a download.  That would enable you to tighten up the cookie environment so the other part of this would work more reliably.
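As a rough sketch of the cookie idea (the cookie name and messages are illustrative, and a download manager has to honor cookies for this to do anything):

```php
<?php
// Hypothetical sketch of the cookie check -- names and messages are mine.
// An expiration of 0 makes it a session cookie that disappears when the
// browser closes.
if (isset($_COOKIE['download_in_progress'])) {
    // A second connection arrived carrying the cookie: refuse it.
    header('HTTP/1.1 403 Forbidden');
    die('A download is already in progress.');
}
setcookie('download_in_progress', '1', 0); // 0 => session cookie

// ... authenticate the user and stream the file here ...
```

Note that a download manager that opens all of its connections at once may start them before the Set-Cookie response arrives, so this is best-effort at most.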
style-sheets

ASKER

Thanks, yes file access is indeed restricted & does require a login.

With that said, I don't want to block FlashGet / download accelerators entirely. What I want to block is the option that lets that software open 5-10 connections to the same file (which breaks my script's logic).
Darude1234
I think you have to limit the connections server-side, not in your script.
I don't know if you are using shared hosting or a private server. With shared hosting this is not going to work, but on a private server you can limit the connections using iptables. See here for an example.
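On a dedicated server, the iptables `connlimit` match can cap simultaneous connections per source IP. A rough sketch, assuming HTTP on port 80 (the limit value is something you would tune, and note this counts all connections from that IP, not just downloads):

```shell
# Hypothetical example: reject a new connection to port 80 when the same
# source IP already has 2 established connections to it.
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 2 -j REJECT
```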

Another way when you are using apache, you can install a script for apache like this: http://dominia.org/djao/limitipconn.html
You can do this manually... Use a session or cookie or whatever to uniquely identify the user. When they start downloading a file, put a record in a table with their session ID, the file ID, and a timestamp. When the download is done, clear the record.

It's kind of a pain, and it can slow down your site if not done carefully; you'd need to be very careful with timeouts and error checking so that orphaned entries don't block access.
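A minimal sketch of that bookkeeping, assuming a `downloads` table with `session_id`, `file_id`, and `started_at` columns (the table, columns, and connection details are all illustrative):

```php
<?php
// Hypothetical gatekeeper: allow at most one active download per
// session/file pair.  All names here are illustrative.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$fileId    = (int) $_GET['file'];
$sessionId = session_id();

// Purge stale rows so crashed downloads don't lock users out forever.
$pdo->prepare('DELETE FROM downloads WHERE started_at < NOW() - INTERVAL 1 HOUR')
    ->execute();

// Refuse a second concurrent download of the same file by the same session.
$stmt = $pdo->prepare('SELECT COUNT(*) FROM downloads WHERE session_id = ? AND file_id = ?');
$stmt->execute([$sessionId, $fileId]);
if ($stmt->fetchColumn() > 0) {
    header('HTTP/1.1 429 Too Many Requests');
    exit('This file is already being downloaded.');
}

$pdo->prepare('INSERT INTO downloads (session_id, file_id, started_at) VALUES (?, ?, NOW())')
    ->execute([$sessionId, $fileId]);

// ... stream the file to the client here ...

// Clear the record when the transfer ends (or the client disconnects).
$pdo->prepare('DELETE FROM downloads WHERE session_id = ? AND file_id = ?')
    ->execute([$sessionId, $fileId]);
```

The final DELETE is the fragile part: if the script dies mid-stream, only the stale-row purge at the top prevents a permanent lockout.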
@Darude1234

Thanks for the suggestion. I have a dedicated server. I'm going to take a look at iptables and will get back to you.

@aarontomosky Thank you, but the cookie has already been suggested by Ray_Paseur (please take a look at my reply)
Just curious - how big are the things that people download from your site?
@Ray_Paseur

File sizes vary greatly; they could be several GBs :(
Messing with iptables is also going to limit web browser connections to the regular web site. When a user loads a page, many threads can be opened to improve load time. You don't want to block those.

To limit downloads to one thread per IP, use Apache with mod_limitipconn.

mod_limitipconn: http://dominia.org/djao/limitipconn2.html

Then in your httpd.conf, for example, you can set this:
<IfModule mod_limitipconn.c>
   <Location /mydownloadfiles>
   MaxConnPerIP 1
   </Location>
</IfModule>


Thank you burnsj2, does this allow two different downloads at the same time (one download = one connection)?

I'm asking because the script is dynamic, and the served files can be large (streamed in batches), so I'm worried that users could only initiate one download at a time, which would be a fairly serious limitation for them.
If you only want one download per user per file I don't think there is an easy iptables solution. I could be wrong...
ASKER CERTIFIED SOLUTION
burnsj2

This solution is only available to members of Experts Exchange.
@burnsj2 So, if I understood you correctly, with MaxConnPerIP set to 1, only *one* download is possible at a time, is that correct?
That partial download blocker is slick! It will, however, stop the user from resuming partial downloads, right?
style-sheets: correct, mod_limitipconn is not going to be able to block multiple downloads of the same file while still allowing multiple downloads of different files, but you may want to use it anyway to establish some reasonable limit of simultaneous downloads, say "MaxConnPerIP 5".

aarontomosky is right, the RewriteRule is going to block all partial downloads.
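The Range-blocking rule being discussed presumably looks something like the sketch below; the `RewriteCond %{HTTP:Range} !^$` part is quoted later in the thread, while the path and the `[F]` (forbidden) response are my assumptions:

```apache
RewriteEngine On
# Refuse any request for the download script that carries a Range header.
# Download managers send "Range: bytes=..." to open extra connections,
# but resume requests use the same header, so resuming is blocked too.
RewriteCond %{HTTP:Range} !^$
RewriteRule ^/?download-files\.php - [F]
```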

@burnsj2 I'm not sure we're talking about the same thing, so please let me re-explain the situation and hopefully clarify what I meant earlier:

Let's say I have this directive in httpd.conf:

<IfModule mod_limitipconn.c>
   <Location /download-files.php>
   MaxConnPerIP 1
   </Location>
</IfModule>



Please keep in mind that files are user-specific, they are not shared among users.

If I understood you correctly, according to the directive above:

1.    User A can download file_1.zip using FlashGet, and while downloading this file, s/he

    a.  Still can browse website normally (because of the "Location" condition)
    b.  Cannot initiate another download of file_2.zip because s/he has already reached the limit
    c.  FlashGet / download managers can only initiate 1 connection because of the RewriteCond %{HTTP:Range} !^$ directive.

2.    IF I set MaxConnPerIP to 5, user A can download file_1.zip using FlashGet, and while downloading this file, s/he

    a.  Still can browse website normally (because of the "Location" condition)
    b.  Can initiate up to 4 other downloads (of the same or different files)
    c.  FlashGet / download managers can only initiate 1 connection because of the RewriteCond %{HTTP:Range} !^$ directive.

is that correct?

Not being able to resume partial downloads is a real shame, but it's a reasonable trade-off for now.

Thanks!
That's exactly correct.  The only thing I see wrong is "Location /download-files.php".  That wouldn't work, because "Location" needs to point to a directory of files rather than a script.