Solved

Site or tool for testing website loading speed

Posted on 2006-11-13
22
184 Views
Last Modified: 2010-08-05
I'm looking for a good website or tool to clock the load-time of a webpage. I would like the result I see to reflect the load-time of images as well, and whether or not my browser already has them cached should not affect the results. ... I also don't want to pay for the service/tool...

Anything out there?

Thanks.
0
Comment
Question by:Melvinivitch
22 Comments
 
LVL 43

Expert Comment

by:TimCottee
ID: 17928879
Hi Melvinivitch,

http://www.websiteoptimization.com/services/analyze/index.html

Will do this and it is free.

Tim Cottee
0
 

Author Comment

by:Melvinivitch
ID: 17928954
Correct me if I'm wrong, but I believe that site merely analyzes the page, but does not actually download all the content in order to get a real-life test.

I need a tester that actually downloads all the content, including images, and returns the time it took to download it all.

I want to compare the speeds I get with different hosting providers, downloading precisely the same content. I realize the resulting numbers will be highly variable based on time-sensitive internet conditions on both the host's end and the tester's end, but over time I can gather a good relative picture of my hosting providers' performance.

I'd appreciate a recommendation for a site/tool with which the recommender has experience.


Thanks...
0
 
LVL 6

Expert Comment

by:bigphuckinglizard
ID: 17929335
Microsoft offers a free Web Application Stress tool that should provide the features you need:

http://www.microsoft.com/downloads/details.aspx?FamilyID=E2C0585A-062A-439E-A67D-75A89AA36495&displaylang=en
0
 
LVL 8

Expert Comment

by:netmunky
ID: 17932545
using linux or cygwin:

wget -O /dev/null -p --limit-rate=10k http://www.yourdomain/yourpage.html

you can use --limit-rate to simulate different connection speeds (in bytes per second).
for example, to simulate downloading google.com on a 14.4kbps modem:
wget -O /dev/null -p --limit-rate=1800 http://www.google.com
or on a 1.5Mbit DSL:
wget -O /dev/null -p --limit-rate=1540k http://www.google.com
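Since --limit-rate takes bytes per second, a nominal line speed in bits per second divides by 8. A minimal sketch of that conversion (the helper name is mine, not part of wget, and it ignores protocol overhead):

```shell
# convert a nominal line speed in bits/s to a --limit-rate value in bytes/s;
# actual throughput is slightly lower because of protocol overhead
bps_to_rate() {
  echo $(( $1 / 8 ))
}

bps_to_rate 14400    # 1800, matching the 14.4kbps modem example above
bps_to_rate 56000    # 7000 for a 56k modem
```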
0
 
LVL 8

Expert Comment

by:netmunky
ID: 17932568
sorry, i forgot the most important part, time:

time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.google.com

the '%E' will report the real "wall clock" time; -q for wget keeps it quiet
0
 

Author Comment

by:Melvinivitch
ID: 17936642
Thanks for the suggestions...

Can't figure out how to get the Microsoft tool to do what I want.

The time/wget command is functioning in cygwin, but returning extremely short times (0.3 seconds or quicker), while the page I'm testing takes at least 5 seconds to load in any browser...

So I'm still in search of a solution. Upping the points.
0
 
LVL 8

Expert Comment

by:netmunky
ID: 17938177
bash may be defaulting to its built-in time.
i've found i have to put /usr/bin/ in front of time to force it to use the external one:

root@munitions:~# time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com   
bash: -f: command not found

real    0m0.152s
user    0m0.002s
sys     0m0.003s
root@munitions:~# /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=1800 http://www.cnn.com
0:56.64
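As an aside (not from the thread itself): bash's `type -a` lists every definition of a name, so you can check whether `time` resolves to the shell keyword or to an external binary before running anything:

```shell
# show everything the shell knows under the name "time"; the shell keyword
# wins unless you spell out an explicit path such as /usr/bin/time
type -a time
# on a typical bash setup this prints "time is a shell keyword", plus
# "time is /usr/bin/time" when the external GNU time is installed
```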
0
 

Author Comment

by:Melvinivitch
ID: 17942548
Using whatever the default cygwin install is, there's no "time" command located in /usr/bin... I'm not really familiar enough with Linux to investigate this much further; I just know enough to "ls" my way into /usr/bin to see if there's a file named "time" there. Of course, when I execute the statement, it tells me "bash: /usr/bin/time: No such file or directory".
0
 
LVL 8

Expert Comment

by:netmunky
ID: 17943426
time isn't installed by default in cygwin. if you run setup, you'll find time under Utils
0
 

Author Comment

by:Melvinivitch
ID: 17953681
Does wget cache images or content in any way? Will I be getting a real-world fetch-everything-from-server time value every time I run that command?
0
 

Author Comment

by:Melvinivitch
ID: 17953754
From my tests, it does indeed appear to cache.

I uploaded a 1.4MB image to my test website and created an index file that just loads that image. The first time I ran the time/wget command, I got 1.8 seconds. Each subsequent time I run the same command, I get less than 0.3 seconds.

So it must be caching. I need a tool that gives the true 100% load-everything-from-server time EVERY time I run it.
0
 

Author Comment

by:Melvinivitch
ID: 17953838
Upping the points. Simple request. Elusive solution.
0
 
LVL 8

Expert Comment

by:netmunky
ID: 17956463
wget does not cache (with -O /dev/null), but it is possible your ISP has a transparent proxy server that does cache

using cygwin on my home PC with comcast cable:
Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.12

Gavin@blackmamba ~
$ /usr/bin/time -f '%E' wget -O /dev/null -p -q --limit-rate=3600 http://www.cnn.com
0:30.26
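For the over-time provider comparison described earlier in the thread, repeated runs like the two above can be appended to a log. A hypothetical sketch (the log_row helper and the file name are mine, not standard tools):

```shell
# format one measurement as a CSV row: UTC timestamp, URL, elapsed seconds
log_row() {
  echo "$(date -u +%Y-%m-%dT%H:%M:%SZ),$1,$2"
}

# usage: capture GNU time's '%e' (elapsed seconds) from stderr, then append
# t=$( { /usr/bin/time -f '%e' wget -O /dev/null -p -q http://www.cnn.com ; } 2>&1 )
# log_row http://www.cnn.com "$t" >> loadtimes.csv
```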
0
 
LVL 8

Expert Comment

by:netmunky
ID: 18151834
the original post by TimCottee and the timed wget appear to be the only working solutions posted. contrary to the author's comment, the tool posted by TimCottee does download images, and the times shown reflect that.
0
 

Author Comment

by:Melvinivitch
ID: 18156619
Ok, the wget command does indeed seem to work for cnn.com, but it's still not handling my site correctly.

I created a simple test file for this. It's a 10.5MB jpeg which is the sole content of an html file. When I do the wget on that html file, I get 1/2 a second, which is clearly wrong as I'm on a standard residential DSL line (768k down), and it takes several seconds to load in a browser.

Here's the file/site I'm testing with:
http://www.feismarks.com/dev/speedtest/index.html
0
 

Author Comment

by:Melvinivitch
ID: 18156638
I should add that the wget time returns half a second whether that "index.html" file is loading the 10.5MB image or a 1.2MB test image...

0
 

Author Comment

by:Melvinivitch
ID: 18188928
Anyone?
0
 
LVL 8

Expert Comment

by:netmunky
ID: 18189840
-p tells wget to get page requisites, which includes images (<img>). if it finishes in 0.5 seconds then you are not limiting the speed or you are doing something else wrong.
though i believe wget may not follow object tags to get swf or java
0
 

Author Comment

by:Melvinivitch
ID: 18190252
I'm executing the command just as you have it, verbatim. Furthermore, even if I wasn't limiting the rate, I don't  have the bandwidth to download a 10MB image in anything close to half a second, as previously stated. To confirm, my test file is a normal jpg file, referenced in my index.html file via a normal <img> tag.

What happens when you execute the command on the link I provided?
0
 
LVL 8

Accepted Solution

by:
netmunky earned 500 total points
ID: 18190282
if you remove the -O /dev/null, it works properly with that file (though it does download the files, which would need to be removed afterwards).
for some reason -O fails on the image with "./" in the name, but without -O it works fine.

perhaps -p relies on the downloaded file actually being on disk so it can parse it for embedded <img> tags, etc.
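A sketch of that workaround wrapped so the downloaded files clean themselves up (run_in_scratch is my name for the helper, not a standard command):

```shell
# run a command inside a throwaway directory, then delete the directory,
# so the page and images wget saves don't accumulate between runs
run_in_scratch() {
  scratch=$(mktemp -d) || return 1
  ( cd "$scratch" && "$@" )
  status=$?
  rm -rf "$scratch"
  return $status
}

# usage with the timed fetch from this thread (note: no -O /dev/null):
# run_in_scratch /usr/bin/time -f '%E' wget -p -q http://www.feismarks.com/dev/speedtest/index.html
```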
0
 

Author Comment

by:Melvinivitch
ID: 18190501
Perfect, that did it. Exactly what I'm looking for. Thanks.
0

Question has a verified solution.
