I have this idea I was thinking of implementing on some future projects, and I thought I'd run it by you experts to hear what you think.
Some websites have high-bandwidth and low-bandwidth versions for broadband and dial-up users, but usually the user has to pick which version they want to see.
I was thinking of developing a simple script that compares getBytesLoaded() against elapsed time from the Date() object to measure how long a user's browser takes to download a certain number of bytes, which gives a transfer rate. Then, if they exceed a minimum requirement, the script automatically launches them into the high bandwidth version of the site. The code is easy enough.
But how reliable would this really be? It's hard to test, because I don't have access to an array of test computers with different connections. Would the Date() object (using milliseconds) be the most reliable way to measure time for this purpose? I was thinking of watching getBytesLoaded() until it reaches a certain value, timing how long it took to reach that value, then dividing bytes by seconds (and by 1024) to get KB/s.
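For what it's worth, here's a rough AS2-style sketch of what I mean (untested; the 50 KB sample size, the 20 KB/s cutoff, and the "highBandwidth"/"lowBandwidth" frame labels are just placeholders I made up):

```actionscript
// Rough sketch only — not verified on real players or connections
var startTime:Number = new Date().getTime();
var sampleBytes:Number = 50 * 1024; // measure over the first 50 KB (arbitrary)

this.onEnterFrame = function() {
    var loaded:Number = _root.getBytesLoaded();
    if (loaded >= sampleBytes) {
        delete this.onEnterFrame;
        var elapsedMs:Number = new Date().getTime() - startTime;
        // bytes -> KB, milliseconds -> seconds
        var rateKBps:Number = (loaded / 1024) / (elapsedMs / 1000);
        if (rateKBps >= 20) { // placeholder minimum for broadband
            gotoAndPlay("highBandwidth");
        } else {
            gotoAndPlay("lowBandwidth");
        }
    }
};
```

So for example, 51200 bytes in 2000 ms would work out to 25 KB/s and send the user to the high bandwidth version.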
If so, how much data should I use for the test to be accurate without taking too much time?
Thanks for the advice -