I've been tasked with writing a test spec for a project whose applications span several geographic locations and multiple LANs, communicating over a combination of TCP/IP and serial links.
I am not a comms expert, but I think I know enough to get by.
What I want is a way of calculating the maximum time that should be allowed for two applications on the network to exchange a given amount of data.
So far I'm planning on using some batch files with ping commands. Ideally, I'd like to be able to say:
"We need to be able to send X1 amount of bytes in X2 amount of seconds, therefore, tests require a response to a Ping of 50000 bytes in X3 milliseconds."
If I know X1 and X2, do you think this is a reliable way of working out X3?
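
For what it's worth, the arithmetic I'd use to work out X3 is the naive calculation below (a sketch only; it assumes the required rate of X1/X2 bytes per second applies to the ping's full round trip and ignores any protocol overhead):

    rem Sketch: derive X3 (ms) from X1 (bytes) and X2 (seconds).
    rem Required rate is X1 / X2 bytes per second; a 50000-byte ping
    rem travels out and back, so X3 = 2 * 50000 / rate, in milliseconds.
    rem Note: set /a is integer-only, so the results are truncated.
    set /a RATE=X1 / X2
    set /a X3=(2 * 50000 * 1000) / RATE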
I hope that makes sense, any help appreciated.