Solved

Site or tool for testing website loading speed

Posted on 2006-11-13
182 Views
Last Modified: 2010-08-05
I'm looking for a good website or tool to clock the load-time of a webpage. I would like the result I see to reflect the load-time of images as well, and whether or not my browser already has them cached should not affect the results. ... I also don't want to pay for the service/tool...

Anything out there?

Thanks.
Question by:Melvinivitch
22 Comments
 
LVL 43

Expert Comment

by:TimCottee
ID: 17928879
Hi Melvinivitch,

http://www.websiteoptimization.com/services/analyze/index.html

Will do this and it is free.

Tim Cottee
 

Author Comment

by:Melvinivitch
ID: 17928954
Correct me if I'm wrong, but I believe that site merely analyzes the page; it does not actually download all the content in order to give a real-life test.

I need a tester that actually downloads all the content, including images, and returns the time it took to download it all.

I want to compare the speeds I get with different hosting providers for downloading precisely the same content. I realize the resulting numbers will be highly variable based on time-sensitive internet conditions on both the host's end and the tester's end, but over time I can gather a good relative picture of my hosting providers' performance.

I'd appreciate a recommendation for a site/tool with which the recommender has experience.


Thanks...
 
LVL 6

Expert Comment

by:bigphuckinglizard
ID: 17929335
Microsoft offers a free Web Application Stress Tool that should provide the features you need:

http://www.microsoft.com/downloads/details.aspx?FamilyID=E2C0585A-062A-439E-A67D-75A89AA36495&displaylang=en
 
LVL 8

Expert Comment

by:netmunky
ID: 17932545
Using Linux or Cygwin:

wget -O /dev/null -p --limit-rate=10k http://www.yourdomain/yourpage.html

You can use --limit-rate to simulate different connection speeds (the value is in bytes per second, so divide the line speed in bits per second by 8).
For example, to simulate downloading google.com on a 14.4 kbps modem (14,400 / 8 = 1800 bytes/s):
wget -O /dev/null -p --limit-rate=1800 http://www.google.com
Or on 1.5 Mbit DSL (roughly 190 kB/s):
wget -O /dev/null -p --limit-rate=190k http://www.google.com
 
LVL 8

Expert Comment

by:netmunky
ID: 17932568
Sorry, I forgot the most important part, time:

time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.google.com

The '%E' tells it to report the real "wall clock" time; -q keeps wget quiet.
 

Author Comment

by:Melvinivitch
ID: 17936642
Thanks for the suggestions...

Can't figure out how to get the Microsoft tool to do what I want.

The time/wget command works in Cygwin, but it returns extremely short times (0.3 seconds or quicker), while the page I'm testing takes at least 5 seconds to load in any browser...

So I'm still in search of a solution. Upping the points.
 
LVL 8

Expert Comment

by:netmunky
ID: 17938177
Bash may be defaulting to its built-in time.
I've found I have to put /usr/bin/ in front of time to force it to use the external one:

root@munitions:~# time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com   
bash: -f: command not found

real    0m0.152s
user    0m0.002s
sys     0m0.003s
root@munitions:~# /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com
0:56.64
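
A quick way to confirm which time will actually run (a sketch, assuming bash; the built-in keyword shadows the external binary unless you give the full path):

# list every "time" bash knows about; the keyword wins unless /usr/bin/time is called explicitly
type -a time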
 

Author Comment

by:Melvinivitch
ID: 17942548
Using whatever the default Cygwin install is, there's no "time" command located in /usr/bin. ... I'm not really familiar enough with Linux to investigate this much further; I just know enough to "ls" my way into /usr/bin to see whether there's a file named "time" there. Of course, when I execute the statement, it tells me "bash: /usr/bin/time: No such file or directory".
 
LVL 8

Expert Comment

by:netmunky
ID: 17943426
time isn't installed by default in Cygwin. If you run setup, you'll find time under Utils.
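
In the meantime, bash's built-in time keyword can give a rough wall-clock reading without installing anything (a sketch; the built-in doesn't accept -f, so drop that flag and read the "real" line it prints):

# built-in time: no -f, prints its own real/user/sys summary
time wget -O /dev/null -p -q --limit-rate=1800 http://www.google.com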
 

Author Comment

by:Melvinivitch
ID: 17953681
Does wget cache images or content in any way? Will I be getting a real-world fetch-everything-from-server time value every time I run that command?
 

Author Comment

by:Melvinivitch
ID: 17953754
From my tests, it does indeed appear to cache.

I uploaded a 1.4MB image to my test website and created an index file that just loads that image. The first time I ran the time/wget command, I got 1.8 seconds. Each subsequent time I run the same command, I get less than 0.3 seconds.

So it must be caching. I need a tool that gives the true 100% load-everything-from-server time EVERY time I run it.
 

Author Comment

by:Melvinivitch
ID: 17953838
Upping the points. Simple request. Elusive solution.
 
LVL 8

Expert Comment

by:netmunky
ID: 17956463
wget does not cache (with -O /dev/null), but it is possible your ISP has a transparent proxy server that does cache

Using Cygwin on my home PC with Comcast cable:
Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.12

Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.26
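
If a transparent proxy is suspected, one option to try (a sketch; --no-cache is a standard GNU wget flag that sends a Pragma: no-cache directive so intermediate caches should fetch fresh content, though an ISP proxy isn't obliged to honor it):

# ask any cache along the way for a fresh copy while timing the fetch
/usr/bin/time -f '%E' wget --no-cache -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com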
 
LVL 8

Expert Comment

by:netmunky
ID: 18151834
The original post by TimCottee and the timed wget appear to be the only working solutions posted. Contrary to the author's comment, the tool posted by TimCottee does download images, and the times shown reflect that.
 

Author Comment

by:Melvinivitch
ID: 18156619
Ok, the wget command does indeed seem to work for cnn.com, but it's still not handling my site correctly.

I created a simple test file for this. It's a 10.5 MB JPEG which is the sole content of an HTML file. When I run the wget on that HTML file, I get half a second, which is clearly wrong as I'm on a standard residential DSL line (768k down), and it takes several seconds to load in a browser.

Here's the file/site I'm testing with:
http://www.feismarks.com/dev/speedtest/index.html
 

Author Comment

by:Melvinivitch
ID: 18156638
I should add that the wget time returns half a second whether that "index.html" file is loading the 10.5MB image or a 1.2MB test image...

 

Author Comment

by:Melvinivitch
ID: 18188928
Anyone?
 
LVL 8

Expert Comment

by:netmunky
ID: 18189840
-p tells wget to get page requisites, which includes images (<img>). If it finishes in 0.5 seconds, then either you are not limiting the speed or you are doing something else wrong.
Though I believe wget may not follow <object> tags to get SWF or Java content.
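
One way to check what wget is actually fetching (a sketch; wget's -o option writes its progress messages to a log file, and the log file name here is just an example):

# log every request while timing the run, then count how many files were downloaded
/usr/bin/time -f '%E' wget -o wget.log -O /dev/null -p http://www.feismarks.com/dev/speedtest/index.html
grep -c 'saved' wget.log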
 

Author Comment

by:Melvinivitch
ID: 18190252
I'm executing the command just as you have it, verbatim. Furthermore, even if I weren't limiting the rate, I don't have the bandwidth to download a 10 MB image in anything close to half a second, as previously stated. To confirm, my test file is a normal .jpg file, referenced in my index.html file via a normal <img> tag.

What happens when you execute the command on the link I provided?
 
LVL 8

Accepted Solution

by:
netmunky earned 500 total points
ID: 18190282
If you remove the -O /dev/null, it works properly with that file (though it does download the files, which would need to be removed afterwards).
For some reason the -O fails on the image with "./" in the name, but without -O it works fine.

Perhaps -p relies on the downloaded file being there to parse for embedded <img> tags, etc.
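
Putting that together, a minimal sketch of the resulting workflow (the /tmp/speedtest scratch directory and the -P prefix are just one way to keep the downloaded copy easy to delete afterwards):

# time the full fetch, writing everything into a throwaway directory
/usr/bin/time -f '%E' wget -p -q -P /tmp/speedtest http://www.feismarks.com/dev/speedtest/index.html
# remove the downloaded copy so the next run starts clean
rm -rf /tmp/speedtest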
 

Author Comment

by:Melvinivitch
ID: 18190501
Perfect, that did it. Exactly what I'm looking for. Thanks.
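
For the original goal of comparing hosts over time, the working command could be wrapped in a simple loop that appends one timestamped measurement per run (a sketch; the log file name and the one-hour interval are arbitrary, and GNU time's -a/-o options are assumed to be available):

# one wall-clock measurement per hour, appended to loadtimes.log
while true; do
    echo -n "$(date '+%Y-%m-%d %H:%M:%S') " >> loadtimes.log
    /usr/bin/time -f '%E' -a -o loadtimes.log wget -p -q -P /tmp/speedtest http://www.feismarks.com/dev/speedtest/index.html
    rm -rf /tmp/speedtest
    sleep 3600
done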