Vijai

asked on

Client-Server Application with Socket Programming for bandwidth computation

I'm totally new to Java and I have to develop a client-server application where the server sends a stream of random bytes to the client during a time t. This time is divided into slices of the same length (for instance 20 ms). For each time slice I have to count the number of bytes received and divide it by the length of the time slice, obtaining a bandwidth sample. Then I have to use this value to calculate some statistics.

My question is whether I need to use two threads: one that just receives data from the server, and another that calculates the bandwidth samples and statistics, so that during each time slice I am only receiving data.

What do you think?

Thanks a lot!

I don't know a priori how many bandwidth samples I'm going to calculate. Data about each sample is used to calculate some statistics, and then, according to the results, I decide whether I need a new sample or whether I'm done. I have to calculate the minimum number of samples that gives me a certain confidence level.

Thanks.
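For reference, here is a minimal server-side sketch of the setup the question describes (streaming random bytes to one client for a fixed duration t). The port, chunk size, and duration are arbitrary assumptions, not values from the question.

```java
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.Random;

// Illustrative server: accepts one client and streams random bytes for DURATION_MS.
public class RandomByteServer {
    private static final int PORT = 5000;          // assumed port
    private static final long DURATION_MS = 5000;  // total sending time t (assumed)

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(PORT);
             Socket client = server.accept();
             OutputStream out = client.getOutputStream()) {

            byte[] chunk = new byte[1024];
            Random random = new Random();
            long end = System.currentTimeMillis() + DURATION_MS;

            while (System.currentTimeMillis() < end) {
                random.nextBytes(chunk);   // fill the chunk with random bytes
                out.write(chunk);          // push it to the client
            }
            out.flush();
        }
    }
}
```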
ThummalaRaghuveer

I guess you are correct: you have to use two threads. One thread listens to the socket and dumps data into a buffer. The other thread processes the data in the buffer approximately every 20 ms and calculates the bandwidth. I guess it would be easiest if the second thread empties the buffer after calculating the bandwidth for that sample. You need to synchronize the two threads by obtaining a lock on the buffer.

If you keep emptying the buffer after every 20 ms sample, it does not matter how many samples you are going to use.

-Raghu
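A minimal sketch of the two-thread approach described above, using an AtomicLong byte counter in place of an explicit lock on a shared buffer. The host, port, slice length, and number of samples are assumptions for illustration only.

```java
import java.io.InputStream;
import java.net.Socket;
import java.util.concurrent.atomic.AtomicLong;

// Illustrative client: one thread receives, another samples the counter every 20 ms.
public class TwoThreadClient {
    public static void main(String[] args) throws Exception {
        final long sliceMs = 20;                       // slice length (assumed)
        final AtomicLong byteCount = new AtomicLong();

        Socket socket = new Socket("localhost", 5000); // assumed host/port

        Thread receiver = new Thread(() -> {
            try (InputStream in = socket.getInputStream()) {
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    byteCount.addAndGet(n);            // just count; no processing here
                }
            } catch (Exception ignored) {
            }
        });
        receiver.start();

        // Sampler thread (here: the main thread): every 20 ms, read-and-reset
        // the counter and turn the count into a bandwidth sample.
        for (int i = 0; i < 100 && receiver.isAlive(); i++) {
            Thread.sleep(sliceMs);
            long bytes = byteCount.getAndSet(0);
            double kbPerSec = (bytes / 1024.0) / (sliceMs / 1000.0);
            System.out.printf("sample %d: %.1f KB/s%n", i, kbPerSec);
        }
        socket.close();
    }
}
```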
Well, that would depend on the way in which you tend to reason. You can keep the data in a list/vector that changes its size dynamically when you add data, so the unknown number of frames is not an issue. I would advise you against multithreading if you are interested in calculating bandwidth statistics, since with threading you will never be able to make full use of the bandwidth due to the threading model.

1. Get the start time
2. Send the data
3. Get the stop time
4. See if the difference is 20 ms (your frame length)
5. Repeat steps 1..4 until the difference is 20 ms
6. Save the data in a structure for later processing
7. Add the structure to a list
8. When everything stops, do the calculations, so as not to adversely affect the sending process (a sketch of this loop follows the list).
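A rough sketch of that single-threaded loop, adapted to the receiving side (which is where the asker measures): count bytes per 20 ms slice, keep the samples in a list, and process them only after the stream ends. The host, port, and slice length are assumptions.

```java
import java.io.InputStream;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

// Illustrative single-threaded client: count bytes per 20 ms slice, store, process later.
public class TimedSliceClient {
    public static void main(String[] args) throws Exception {
        final long sliceMs = 20;                       // frame length (assumed)
        List<Long> bytesPerSlice = new ArrayList<>();  // steps 6-7: samples kept in a list

        try (Socket socket = new Socket("localhost", 5000);   // assumed host/port
             InputStream in = socket.getInputStream()) {
            byte[] buf = new byte[4096];
            long sliceStart = System.currentTimeMillis();      // step 1: start time
            long count = 0;
            int n;
            while ((n = in.read(buf)) != -1) {                 // step 2: receive the data
                count += n;
                long now = System.currentTimeMillis();         // step 3: stop time
                if (now - sliceStart >= sliceMs) {             // steps 4-5: slice elapsed?
                    bytesPerSlice.add(count);                  // steps 6-7: save the sample
                    count = 0;
                    sliceStart = now;
                }
            }
        }

        // Step 8: do the calculations only after the stream has ended.
        for (int i = 0; i < bytesPerSlice.size(); i++) {
            double kbPerSec = (bytesPerSlice.get(i) / 1024.0) / (sliceMs / 1000.0);
            System.out.printf("slice %d: %.1f KB/s%n", i, kbPerSec);
        }
    }
}
```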
That is a description appropriate for the server, not for the client. If the same approach is used for the client, then, since the number of sample bursts is not known, the client might hang waiting for responses from the server.

-Raghu
Maybe you should just go with one thread that receives data... calculates the bandwidth... decides on further sampling... and continues...
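If the statistics drive the stopping rule, that single loop can keep taking samples until the confidence interval on the mean bandwidth is tight enough. A rough sketch of such a decision, assuming a normal approximation with z = 1.96 for 95% confidence and a target half-width of 5% of the mean (both numbers are illustrative, not from the question):

```java
import java.util.List;

// Illustrative stopping rule: keep sampling until the 95% confidence interval
// half-width falls below 5% of the sample mean (normal approximation).
public class SamplingDecision {
    static boolean needMoreSamples(List<Double> samples) {
        int n = samples.size();
        if (n < 2) return true;                       // need at least two samples for a variance

        double mean = 0;
        for (double s : samples) mean += s;
        mean /= n;

        double variance = 0;
        for (double s : samples) variance += (s - mean) * (s - mean);
        variance /= (n - 1);                          // sample variance

        double halfWidth = 1.96 * Math.sqrt(variance / n);  // 95% CI half-width
        return halfWidth > 0.05 * mean;               // stop once within 5% of the mean
    }
}
```

The receive loop would call needMoreSamples(samples) after adding each new bandwidth sample and stop once it returns false.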
Vijai

ASKER

But the time is split into slices of 20 ms, and I have to count the amount of data I receive in each slice to calculate the bandwidth and other statistics. If I use part of the slice to do that, then my sequence of time slices doesn't cover my entire time interval.
If I understand correctly... you are using a TCP stream, so even if you calculate you do not lose any slices. Also, your bandwidth is calculated at the client and not at the server. On the client side you know exactly how much time you are listening and how much time you are calculating... So I guess that does not create any problems.

-Raghu
ASKER CERTIFIED SOLUTION
RuadRauFlessa
