Solved

How to prioritize Web Access between threads

Posted on 2013-01-12
463 Views
Last Modified: 2013-01-13
I have a Windows C++ application that needs to access the web for a few kbytes of data every few minutes.  Latency for those bytes must be reasonably low--less than a second.

The application creates many files, each around 1 MB.  If I upload those files with FileZilla while my application is running, my application suffers occasional time-outs, even with the time-out set as long as five seconds.

Can I add a thread to my application to upload files in a way that avoids impacting the latency of those 1 Kbyte web accesses?  Would it be sufficient to just run the uploading thread at a lower priority, or are there shared resources (wifi, and everything else between my PC and the cloud) that the file upload could consume even if its thread is lower priority?
Question by:DaveThomasPilot
4 Comments
 
LVL 31

Expert Comment

by:Frosty555
ID: 38770914
I don't think this is a thread priority issue, it's a quality-of-service issue on your network.

FileZilla is uploading a file. It is most likely saturating your available Internet upload bandwidth. When this happens there is a long lag for any other applications that are accessing the internet, in particular anything that requires a back-and-forth handshake like an HTTP Request. Your application's HTTP Request is queued behind a mile of Filezilla packets all waiting to be transmitted.
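As a rough sketch of why that queue hurts so much (the numbers below are illustrative assumptions, not measurements):

```cpp
// Back-of-envelope queueing delay: a small HTTP request queued behind
// FileZilla's backlog must wait for that backlog to drain first.

// Seconds a request waits behind `backlog_bytes` already queued on a link
// with `upload_bits_per_sec` of upload capacity.
double queue_delay_seconds(double backlog_bytes, double upload_bits_per_sec) {
    return backlog_bytes * 8.0 / upload_bits_per_sec;
}

// e.g. queue_delay_seconds(1e6, 1e6) == 8.0: a single queued 1 MB file on an
// assumed 1 Mbit/s uplink already blows a sub-second latency budget by ~8x.
```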

The simplest thing to do is throttle FileZilla's upload rate to something less than your ISP's upload bandwidth limit. Check your speeds at www.speedtest.net, and give FileZilla a maximum of something like 50-60% of your upload speed.

A better solution is to have a router capable of QoS / bandwidth priority. Configure it so that HTTP traffic has high priority and FTP traffic has low priority. But this needs to be done at the router level, because the lag is not being caused by your computer - it's being caused by your network.

Realistically, because the Internet is a best-effort service and because lags on the network DO happen (it may not necessarily be your computer causing the problem, somebody else on the network could be saturating the bandwidth), you really have to have a 30-60 second timeout for web requests.

Author Comment

by:DaveThomasPilot
ID: 38771719
I've tried using the "speed limits" in FileZilla and then checking whether ping responses to unrelated sites (like google.com) improved.  It didn't seem to have any impact.
Maybe FileZilla's throttling mechanism does reduce the transfer rate on average, but the "time constant" over which that average is enforced is too long for a tiny HTTP request to get a crack at the bandwidth with low latency.

Generally, I'll have no control over the router.  The application is used at many different venues, on public wifi networks.  Are you suggesting I have a dedicated router for my PC that connects to the public wifi?  Would there be no way to do something on my PC to eliminate the need for the external router?  And how would that work anyway, when there's only one MAC address for my laptop, from which both the FTP transfers and the HTTP requests are coming?


Instead, I'm thinking along the lines of doing something like FTP from a lower-priority thread (C++ code), but transferring partial files instead of whole files. I'd target the fragment size so that a fragment rarely takes more than about a second to upload.  After each fragment upload (or a time-out, if the upload takes longer than the time-out period), I'd call SwitchToThread() or an equivalent to relinquish the thread's timeslice to the application's latency-critical thread.

So, something like FTP, but with extra parameters to limit the number of bytes transferred, a time-out, and the ability to keep track of how much of the file got uploaded so the next transfer attempt could start from that place in the file.

That would mean something would have to run on the host side, to "reassemble" the file fragments and create/save the file when all the data had arrived.
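The loop I have in mind would look roughly like this. Here upload_fragment() is a hypothetical stand-in for whatever transport ends up being used (FTP with REST/APPE, an HTTP range PUT, etc.); it's stubbed out so the control flow is runnable on its own:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Stub: pretend to send bytes [offset, offset+len) and report success.
// Real code would do network I/O with a per-fragment time-out here.
static bool upload_fragment(const std::vector<char>& file,
                            std::size_t offset, std::size_t len) {
    return offset + len <= file.size();
}

// Returns the number of bytes successfully uploaded. The caller would
// persist this offset so a later attempt resumes where this one stopped,
// and yield between fragments (SwitchToThread() on Windows, or the
// portable std::this_thread::yield()) so the latency-critical thread
// gets scheduled promptly.
std::size_t upload_in_fragments(const std::vector<char>& file,
                                std::size_t fragment_size) {
    std::size_t offset = 0;
    while (offset < file.size()) {
        std::size_t len = std::min(fragment_size, file.size() - offset);
        if (!upload_fragment(file, offset, len))
            break;  // failed or timed out: resume from `offset` next time
        offset += len;
        // yield to the latency-critical thread here
    }
    return offset;
}
```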

I was hoping that I was just ignorant of some common file transfer technique that was like ftp, but had this type of additional functionality.

LVL 31

Accepted Solution

by:
Frosty555 earned 500 total points
ID: 38772545
Hi Dave,

Check this out:

http://www.seriousbit.com/netbalancer/

You might be able to use this to control what processes on your computer get higher priority on the network, and schedule your HTTP packets to go ahead of your FTP packets. That will fix the problem.

A few things regarding what you mentioned in your post:

I'm not suggesting you need a dedicated router. If you don't have control over the network (e.g. you're on public wifi), then it won't be feasible for you to implement QoS on the network... but that is what is needed here. You need some kind of arbitrator to decide which packets get to go first. Normally a router's QoS does this, but NetBalancer might be able to do it for you too.

- What you are describing for file transfer - "ftp, but with extra parameters to limit the number of bytes transferred" - is basically the exact definition of FileZilla's speed-limit feature. Re-implementing it yourself isn't going to help.

- I think the throttling in FileZilla may not be working because FileZilla uses "bursting" - it allows faster-than-the-limit transfers for a short period of time and then gradually brings the speed down to the set limit. That might explain why you are still experiencing problems even after throttling your upload bandwidth. Try capping the upload rate even more aggressively.

- I want to emphasize that thread priorities will do NOTHING to help you here. Thread scheduling only governs CPU time; it will not prioritize network traffic. Your network card has 100 or 1000 megabits of available bandwidth, so the bottleneck only appears when packets reach the router and it can't send them across the Internet fast enough. The bottleneck is at the network layer, not the application layer.
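To illustrate the bursting point above: a token-bucket pacer is the classic way to cap both average rate and burst size. This is a sketch of the general technique only - it is not FileZilla's actual implementation, which I don't have visibility into:

```cpp
#include <algorithm>

// Minimal token bucket: sustained throughput never exceeds `bytes_per_sec`,
// and an instantaneous burst never exceeds `burst_bytes`.
class TokenBucket {
public:
    TokenBucket(double bytes_per_sec, double burst_bytes)
        : rate_(bytes_per_sec), burst_(burst_bytes), tokens_(burst_bytes) {}

    // Advance the clock by `dt` seconds, refilling up to the burst cap.
    void tick(double dt) {
        tokens_ = std::min(burst_, tokens_ + rate_ * dt);
    }

    // Try to spend `n` bytes of budget; the sender transmits only when
    // this succeeds, otherwise it waits and ticks again.
    bool try_consume(double n) {
        if (n > tokens_) return false;
        tokens_ -= n;
        return true;
    }

private:
    double rate_, burst_, tokens_;
};
```

A large burst cap relative to the rate is exactly what would produce the "fast at first, then settles down" behaviour described above.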

Author Closing Comment

by:DaveThomasPilot
ID: 38772778
Thanks, I'll try setting the speed limits even lower on FileZilla.

Thanks for the NetBalancer link.  That looks really interesting!  I'll probably give it a try if lower speed limits in FileZilla don't help.

Dave Thomas