  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 187

Site or tool for testing website loading speed

I'm looking for a good website or tool to clock the load-time of a webpage. I would like the result I see to reflect the load-time of images as well, and whether or not my browser already has them cached should not affect the results. ... I also don't want to pay for the service/tool...

Anything out there?

Thanks.
Asked by Melvinivitch
1 Solution
 
TimCotteeCommented:
Hi Melvinivitch,

http://www.websiteoptimization.com/services/analyze/index.html

This will do it, and it is free.

Tim Cottee
 
MelvinivitchAuthor Commented:
Correct me if I'm wrong, but I believe that site merely analyzes the page, but does not actually download all the content in order to get a real-life test.

I need a tester that actually downloads all the content, including images, and returns the time it took to download it all.

I want to compare the speeds I get with different hosting providers, to download precisely the same content. I realize the resultant numbers will be highly variable based on time-sensitive internet conditions on both the host's end and the tester's end, but over time, I can gather a good relative picture of my hosting providers' performance.

I'd appreciate a recommendation for a site/tool with which the recommender has experience.


Thanks...
 
bigphuckinglizardCommented:
Microsoft does a free Web Application Stress tool that should provide the features you need

http://www.microsoft.com/downloads/details.aspx?FamilyID=E2C0585A-062A-439E-A67D-75A89AA36495&displaylang=en
 
netmunkyCommented:
using linux or cygwin:

wget -O /dev/null -p --limit-rate=10k http://www.yourdomain/yourpage.html

you can use --limit-rate to simulate different speeds (the value is in bytes per second, so divide the bit rate by 8)
for example, to simulate downloading google.com on a 14.4kbps modem:
wget -O /dev/null -p --limit-rate=1800 http://www.google.com
or on a 1.5Mbit DSL line:
wget -O /dev/null -p --limit-rate=187k http://www.google.com
 
netmunkyCommented:
sorry, i forgot the most important part, time:

time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.google.com

the '%E' gives the real "wall clock" time, and -q keeps wget quiet
 
MelvinivitchAuthor Commented:
Thanks for the suggestions...

Can't figure out how to get the Microsoft tool to do what I want.

The time/wget command is functioning in cygwin, but returning extremely short times (0.3 seconds or quicker), while the page I'm testing takes at least 5 seconds to load in any browser...

So I'm still in search of a solution. Upping the points.
 
netmunkyCommented:
bash may be defaulting to its built-in time.
i've found i have to add /usr/bin/ in front of time to force it to use the external one:

root@munitions:~# time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com   
bash: -f: command not found

real    0m0.152s
user    0m0.002s
sys     0m0.003s
root@munitions:~# /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com
0:56.64
 
MelvinivitchAuthor Commented:
Using the default cygwin install, there's no "time" command located in /usr/bin. ... I'm not really familiar enough with Linux to investigate this much further... I just know enough to "ls" my way into /usr/bin to see if there's a file named "time" there... Of course, when I execute the statement, it tells me "bash: /usr/bin/time: No such file or directory".
 
netmunkyCommented:
time isn't installed by default in cygwin. if you run setup, you'll find time under Utils
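
In the meantime, the bash built-in time keyword can time the command as well; it doesn't accept -f, but it prints the real (wall clock) time on its own line, e.g.:

time wget -O /dev/null -p -q --limit-rate=1800 http://www.google.com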
 
MelvinivitchAuthor Commented:
Does wget cache images or content in any way? Will I be getting a real-world fetch-everything-from-server time value every time I run that command?
 
MelvinivitchAuthor Commented:
From my tests, it does indeed appear to cache.

I uploaded a 1.4MB image to my test website and created an index file that just loads that image. The first time I ran the time/wget command, I got 1.8 seconds. Each subsequent time I run the same command, I get less than 0.3 seconds.

So it must be caching. I need a tool that gives the true 100% load-everything-from-server time EVERY time I run it.
 
MelvinivitchAuthor Commented:
Upping the points. Simple request. Elusive solution.
 
netmunkyCommented:
wget does not cache (with -O /dev/null), but it is possible your ISP has a transparent proxy server that does cache

using cygwin on my home PC with comcast cable:
Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.12

Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.26
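
If a transparent proxy is suspected, one thing worth trying (assuming the proxy honors cache directives) is wget's --no-cache option, which sends a Pragma: no-cache header so intermediate caches should fetch fresh content:

/usr/bin/time -f '%E' wget --no-cache -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com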
 
netmunkyCommented:
the original post by TimCottee and the timed wget appear to be the only working solutions posted. contrary to the author's comment, the tool posted by TimCottee does download images, and the times shown reflect that.
 
MelvinivitchAuthor Commented:
Ok, the wget command does indeed seem to work for cnn.com, but it's still not handling my site correctly.

I created a simple test file for this. It's a 10.5MB jpeg that is the sole content of an html file. When I do the wget on that html file, I get half a second, which is clearly wrong: I'm on a standard residential DSL line (768k down), and the page takes several seconds to load in a browser.

Here's the file/site I'm testing with:
http://www.feismarks.com/dev/speedtest/index.html
 
MelvinivitchAuthor Commented:
I should add that the wget time returns half a second whether that "index.html" file is loading the 10.5MB image or a 1.2MB test image...

 
MelvinivitchAuthor Commented:
Anyone?
 
netmunkyCommented:
-p tells wget to get page requisites, which includes images (<img>). if it finishes in 0.5 seconds then you are not limiting the speed or you are doing something else wrong.
though i believe wget may not follow object tags to get swf or java
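
To see exactly which requisites wget is fetching, one option is to drop -q and add -nv (no-verbose), which prints one line per retrieved file:

wget -nv -O /dev/null -p --limit-rate=1800 http://www.google.com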
 
MelvinivitchAuthor Commented:
I'm executing the command just as you have it, verbatim. Furthermore, even if I wasn't limiting the rate, I don't have the bandwidth to download a 10MB image in anything close to half a second, as previously stated. To confirm, my test file is a normal jpg file, referenced in my index.html file via a normal <img> tag.

What happens when you execute the command on the link I provided?
 
netmunkyCommented:
if you remove the -O /dev/null, it works properly with that file (though it does download the files, which would need to be removed afterwards).
for some reason the -O fails on the image with "./" in the name, but without -O it works fine

perhaps -p relies on the downloaded file being on disk so it can parse it for embedded <img> tags, etc.
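
Putting it together, a minimal sketch (the /tmp/speedtest prefix is illustrative, and 96k approximates a 768kbit line) that times the full fetch into a scratch directory and cleans up afterwards:

/usr/bin/time -f '%E' wget -p -q --limit-rate=96k -P /tmp/speedtest http://www.feismarks.com/dev/speedtest/index.html
rm -rf /tmp/speedtest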
 
MelvinivitchAuthor Commented:
Perfect, that did it. Exactly what I'm looking for. Thanks.
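
For the comparison across hosting providers, a minimal logging loop (filenames, interval, and URL are illustrative) could repeat the timed fetch and append each result to a log:

while true; do
  /usr/bin/time -f '%E' -o /tmp/run.txt wget -p -q --limit-rate=96k -P /tmp/speedtest http://www.feismarks.com/dev/speedtest/index.html
  echo "$(date) $(cat /tmp/run.txt)" >> speedlog.txt
  rm -rf /tmp/speedtest
  sleep 3600
done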
