Client Server Application with Socket Programming for bandwidth computation
Posted on 2004-11-24
I'm totally new to Java and I have to develop a client-server application where the server sends a stream of random bytes to the client for a time t. This time is divided into slices of equal length (for instance 20 ms). For each time slice I have to count the number of bytes received and divide it by the length of the slice, obtaining a bandwidth sample. Then I have to use these values to calculate some statistics.
My question is whether I need two threads: one that just receives data from the server, and another that calculates the bandwidth samples and statistics, so that during each time slice the first thread does nothing but receive data.
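One common way to structure the two threads you describe (a sketch only; the class name, the slice handling, and the simulated reader below are my assumptions, not part of the question): the reader thread does nothing but read from the socket and add the byte count to a shared `AtomicLong`, while a scheduled sampler thread atomically reads-and-resets that counter once per slice and turns it into a bandwidth sample. That way the reader never blocks on statistics work.

```java
import java.util.List;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicLong;

public class BandwidthMeter {
    static final long SLICE_MS = 20;                        // slice length from the question
    static final AtomicLong bytesInSlice = new AtomicLong();
    static final List<Double> samples = new CopyOnWriteArrayList<>();

    // Convert bytes received in one slice into KB/s
    static double bandwidthKBps(long bytes, long sliceMs) {
        return (bytes / 1024.0) / (sliceMs / 1000.0);
    }

    public static void main(String[] args) throws Exception {
        // Sampler thread: every SLICE_MS, atomically read-and-reset the counter
        ScheduledExecutorService sampler = Executors.newSingleThreadScheduledExecutor();
        sampler.scheduleAtFixedRate(
            () -> samples.add(bandwidthKBps(bytesInSlice.getAndSet(0), SLICE_MS)),
            SLICE_MS, SLICE_MS, TimeUnit.MILLISECONDS);

        // Reader stand-in: simulates arriving data. In the real client this
        // loop would be a blocking in.read(buf) on the socket's InputStream,
        // adding read()'s return value to bytesInSlice.
        Thread reader = new Thread(() -> {
            for (int i = 0; i < 40; i++) {
                bytesInSlice.addAndGet(4096);               // pretend 4 KB arrived
                try { Thread.sleep(5); } catch (InterruptedException e) { return; }
            }
        });
        reader.start();
        reader.join();
        sampler.shutdown();
        sampler.awaitTermination(1, TimeUnit.SECONDS);

        System.out.println("collected " + samples.size() + " bandwidth samples");
    }
}
```

The `getAndSet(0)` call is the key detail: it reads the slice's byte count and zeroes it in one atomic step, so no bytes are lost or double-counted between the two threads.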
What do you think?
Thanks a lot!
I don't know a priori how many bandwidth samples I'm going to calculate. Each sample feeds into some statistics, and based on the results I decide whether I need another sample or I'm done. I have to find the minimum number of samples that gives me a certain confidence level.
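A standard stopping rule for this (a sketch under the usual normal-approximation assumption; the numbers in `main` are made up for illustration): keep sampling until the current sample count n satisfies n >= (z * s / E)^2, where s is the sample standard deviation so far, E is the half-width you want for the confidence interval of the mean, and z is the critical value (1.96 for 95% confidence).

```java
import java.util.ArrayList;
import java.util.List;

public class SampleSize {
    // Sample mean
    static double mean(List<Double> xs) {
        double s = 0;
        for (double x : xs) s += x;
        return s / xs.size();
    }

    // Sample standard deviation (n - 1 in the denominator)
    static double stdDev(List<Double> xs) {
        double m = mean(xs), ss = 0;
        for (double x : xs) ss += (x - m) * (x - m);
        return Math.sqrt(ss / (xs.size() - 1));
    }

    // Minimum samples so the confidence interval half-width is <= maxError:
    // n >= (z * s / maxError)^2; z = 1.96 for 95% confidence
    static long samplesNeeded(double z, double s, double maxError) {
        double root = z * s / maxError;
        return (long) Math.ceil(root * root);
    }

    public static void main(String[] args) {
        // Illustrative bandwidth samples in KB/s (made up, not measured)
        List<Double> samples = new ArrayList<>(List.of(210.0, 195.0, 205.0, 190.0, 200.0));
        double s = stdDev(samples);
        long n = samplesNeeded(1.96, s, 2.0);   // want the mean within +/- 2 KB/s at 95%
        System.out.println("std dev = " + s + ", samples needed >= " + n);
        // In the client: after each new sample, recompute n and stop once
        // samples.size() >= n.
    }
}
```

Since s is re-estimated as samples arrive, the required n moves with it; for small sample counts a t-distribution critical value would be more accurate than the fixed z = 1.96, but the structure of the loop is the same.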