I want to retrieve smallish files (7 KB to 20 KB) from a remote FTP server. A 7 KB file takes 1.3 seconds. Is there any way to speed this up? There are about 800 GB of these small files, and they will only be retrieved a handful at a time, so I don't want to put them on the web server.
Do you notice any difference in the time it takes to retrieve a 7 KB file versus a 20 KB file? Also, what are the connection speeds along the path? I ask because the answers will tell us whether this 1.3 seconds is mostly per-request overhead (I expect it is) or whether something systemic is slowing the transfer down.
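One quick way to check is to time both sizes over a single session. Here is a minimal sketch using Python's ftplib (the host, credentials, and file names are placeholders; the thread doesn't say what language is in use). If the 7 KB and 20 KB timings come out nearly identical, the 1.3 seconds is almost entirely per-request overhead rather than bandwidth:

    import time
    from ftplib import FTP

    HOST, USER, PASSWD = "ftp.example.com", "user", "secret"   # placeholders
    FILES = ["sample-7k.dat", "sample-20k.dat"]                # hypothetical names

    ftp = FTP(HOST)
    ftp.login(USER, PASSWD)

    for name in FILES:
        chunks = []
        start = time.perf_counter()
        ftp.retrbinary(f"RETR {name}", chunks.append)   # collect the file in memory
        elapsed = time.perf_counter() - start
        size = sum(len(c) for c in chunks)
        print(f"{name}: {size} bytes in {elapsed:.2f} s")

    ftp.quit()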
Are you sure that the remote server does not have the file retrievals "throttled back" to prevent overload? I know that many APIs do this - at least all the ones I have written do it. We throw in a one-second wait per request. It's generally tolerable to the application and keeps our load levels reasonable.
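For what it's worth, that kind of throttle is often nothing more than a fixed delay at the top of the request handler. A rough sketch of the pattern described above (serve() is a hypothetical downstream handler, not anything from this thread):

    import time

    def throttled_handler(request):
        # Fixed one-second wait per request: tolerable for clients,
        # and it caps the load the server will accept.
        time.sleep(1.0)
        return serve(request)   # 'serve' is a hypothetical downstream handler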
Any way to retrieve more than one file at a time?
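If the server allows several simultaneous connections, parallel retrievals can overlap those per-request waits. A sketch with Python's ftplib and a thread pool (again with placeholder host, credentials, and file names):

    from concurrent.futures import ThreadPoolExecutor
    from ftplib import FTP

    HOST, USER, PASSWD = "ftp.example.com", "user", "secret"   # placeholders

    def fetch(name):
        # One connection per worker; if any throttling is per connection,
        # running a few connections in parallel hides most of the delay.
        ftp = FTP(HOST)
        ftp.login(USER, PASSWD)
        chunks = []
        ftp.retrbinary(f"RETR {name}", chunks.append)
        ftp.quit()
        return b"".join(chunks)

    names = ["a.dat", "b.dat", "c.dat"]   # hypothetical file names
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = dict(zip(names, pool.map(fetch, names)))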
Those are some of my thoughts. Best regards, ~Ray
geomouchet (Asker):
A 21 KB file took 10.6 seconds the first time; maybe the server was busy at that moment. Additional requests for the same 21 KB file took 1.4 seconds. You're probably right about the throttling. I'll check on that.
I'm not sure how to retrieve more than one file at a time. I suppose I could use the FTP functions directly, but if the delay is per file, that wouldn't help.
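Even if the delay turns out to be per file, it's worth ruling out connection setup first: logging in once and reusing the session for a handful of files pays the handshake cost only once. A minimal sketch, again in Python's ftplib with placeholder details:

    from ftplib import FTP

    HOST, USER, PASSWD = "ftp.example.com", "user", "secret"   # placeholders
    names = ["file1.dat", "file2.dat", "file3.dat"]            # hypothetical

    ftp = FTP(HOST)              # connect and log in once...
    ftp.login(USER, PASSWD)
    for name in names:           # ...then reuse the session for each file
        with open(name, "wb") as out:
            ftp.retrbinary(f"RETR {name}", out.write)
    ftp.quit()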