Solved

How do I time an individual web file request and split it into its parts?

Posted on 2014-10-06
15
186 Views
Last Modified: 2014-10-07
Looking to track and monitor the time taken for web pages to download, and to identify bottlenecks.

After downloading the individual page HTML, I can identify the files that are needed to complete the request, such as the JS, CSS and image files.

I want to be able to time how long each file takes and split that time into meaningful areas to identify anything slow.

The time areas for each file would be:

Blocking
DNS lookup
Connecting
Sending
Waiting
Receiving

How would I do that please?
0
Comment
Question by:stephenwilde
15 Comments
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 40363316
Using Firefox and Firebug, all that is shown on the Net tab.  Here's an example using one of my pages.
[Screenshot: Firebug Net tab]
0
 

Author Comment

by:stephenwilde
ID: 40363338
Thanks, but I would like to have the data within .NET C# so I can write it to a database and manipulate it before it is displayed.
0
 
LVL 52

Expert Comment

by:Julian Hansen
ID: 40363399
How are you going to do that though?
The data is recorded on the client side - you don't have access to that from .NET code.

The only solution is to have a browser plugin that collects that data and passes it back to your server for storage.
Will you be monitoring this from a browser you control or do you want to monitor this for all (random) visitors to the page?

Have you had a look at ShowSlow? I have not worked with it, but it might do what you want:

http://www.showslow.com/configure.php
0
 

Author Comment

by:stephenwilde
ID: 40363432
No, I am looking to build a website audit programme.

Mostly it deals with looking at the content of the title, h1 tags, etc. for SEO purposes, but I would also like to obtain data automatically for each page of a several-hundred-page website, so it needs to be automated code.
0
 
LVL 75

Expert Comment

by:käµfm³d 👽
ID: 40363445
You could download each file via code and time those downloads with one of the various timers found in the Framework, but this may or may not be an accurate representation of what a particular browser is doing. You have no way of knowing what a particular browser does before, during, and after a file is downloaded. I'm not trying to discourage you - just keep in mind that there may be some discrepancy between your testing and a particular browser. (I suspect the timings will be mostly accurate.)

Also, I doubt you'll be able to time the DNS lookup without doing some low-level TCP work. Again, if you're up for it, then by all means, but I don't think the normal classes we use to download from the web in .NET will give you timings on that.
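A rough sketch of that whole-file timing idea (illustrative only, not from this thread - the class name is made up, and a local HttpListener stands in for a real web server so the snippet runs self-contained; in real use you would point it at the page's JS/CSS/image URLs):

```csharp
using System;
using System.Diagnostics;
using System.Net;
using System.Text;
using System.Threading.Tasks;

class DownloadTimer
{
    // Downloads a URL and returns (bytes received, elapsed milliseconds).
    // This measures the whole request end-to-end; it cannot split out
    // DNS / connect / wait phases on its own.
    public static Tuple<int, long> TimeDownload(string url)
    {
        var sw = Stopwatch.StartNew();
        byte[] data;
        using (var client = new WebClient())
        {
            data = client.DownloadData(url); // blocks until the whole body arrives
        }
        sw.Stop();
        return Tuple.Create(data.Length, sw.ElapsedMilliseconds);
    }

    static void Main()
    {
        // Local stand-in server so the example is self-contained.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://127.0.0.1:8123/");
        listener.Start();
        Task.Run(() =>
        {
            var ctx = listener.GetContext();
            var body = Encoding.UTF8.GetBytes("hello");
            ctx.Response.OutputStream.Write(body, 0, body.Length);
            ctx.Response.Close();
        });

        var result = TimeDownload("http://127.0.0.1:8123/");
        Console.WriteLine(result.Item1 + " bytes in " + result.Item2 + " ms");
        listener.Stop();
    }
}
```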
0
 
LVL 52

Accepted Solution

by:
Julian Hansen earned 500 total points
ID: 40363471
Data from such a process would be highly variable, depending on a variety of factors - the results you obtain from one location may be totally different from another.

Would a better solution not be to look at the following, per page:

a) Total page size
b) Number of objects requested
c) In the case of scripted items - time for each script to complete.

These are more easily measurable and give a better means of comparing different pages / sites. Variances in line speed, server load, etc. at the time of the test could skew timing results, whereas comparing actual objects and sizes gives a much more stable basis for comparison.
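A rough sketch of measuring (a) and (b) from the downloaded HTML (the class and regex are illustrative, not from this thread; a real audit tool should use an HTML parser such as HtmlAgilityPack rather than a regex, which this only approximates):

```csharp
using System;
using System.Text;
using System.Text.RegularExpressions;

class PageAudit
{
    // (a) Total page size in bytes for the HTML itself.
    public static int PageBytes(string html)
    {
        return Encoding.UTF8.GetByteCount(html);
    }

    // (b) Rough count of referenced js/css/image objects via src= / href=.
    public static int CountObjects(string html)
    {
        return Regex.Matches(html,
            @"(?:src|href)\s*=\s*[""'][^""']+\.(js|css|png|jpg|jpeg|gif)",
            RegexOptions.IgnoreCase).Count;
    }

    static void Main()
    {
        string html = "<script src='a.js'></script>" +
                      "<link href=\"style.css\" rel=\"stylesheet\">" +
                      "<img src='logo.png'>";
        Console.WriteLine(PageBytes(html) + " bytes, " +
                          CountObjects(html) + " objects");
    }
}
```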
0
 

Author Comment

by:stephenwilde
ID: 40363478
Thank you, I agree with your observations.

The issue is with:

c) In the case of scripted items - time for each script to complete.

Rather than just the "total time" for each item, I want to provide a breakdown to see if there is a bottleneck or problem with DNS or the server, etc.
0
 
LVL 52

Expert Comment

by:Julian Hansen
ID: 40363506
bottleneck or problem with dns or server etc..

That can be done without having to access the page. These metrics are common to all page requests, so I would implement this as a separate script that just times the DNS resolution and measures server latency. That can (and should) be done with a dedicated script, so there are no variables in the equation that have nothing to do with the metrics being measured.
0
 

Author Comment

by:stephenwilde
ID: 40363556
Thank you for your observations.

I am looking for direction on obtaining the script you describe - or classes and functions, if any, in the .NET C# framework - or any 3rd-party tool that can be used?
0
 
LVL 52

Expert Comment

by:Julian Hansen
ID: 40363651
For the DNS resolution - I would imagine putting timing around a Dns.Resolve / Dns.GetHostEntry call - fairly straightforward.

For the server timing - a simple request to a hello-world page, with timing around the call, would also give you server / network latency values.
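A minimal sketch of the DNS half (illustrative, not from this thread - "localhost" keeps the example self-contained; substitute the site's real host name). The server-latency half is the same Stopwatch pattern wrapped around a WebClient request to the hello-world page:

```csharp
using System;
using System.Diagnostics;
using System.Net;

class DnsTimer
{
    // Returns the wall-clock milliseconds taken to resolve a host name.
    // Note: the OS may answer from its local cache, so only the first
    // call typically reflects a real network lookup.
    public static long TimeDnsLookup(string host)
    {
        var sw = Stopwatch.StartNew();
        Dns.GetHostEntry(host);
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        Console.WriteLine(TimeDnsLookup("localhost") + " ms");
    }
}
```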
0
 

Author Comment

by:stephenwilde
ID: 40363687
Thanks, but is there any way to split the server time into the parts that free DNS-testing tools provide, namely:

Connecting
Sending
Waiting
Receiving
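One way to get that split in .NET is to skip WebClient and issue the HTTP request over a raw socket, timing each phase yourself. A rough sketch, not from this thread (the class name, port, and loopback stand-in server are all illustrative; in real use you would target the web server's port 80, and "Waiting" here means time to first response byte):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

class PhaseTimer
{
    // Issues a bare HTTP/1.0 GET over a raw socket and times four phases:
    // [0] connecting, [1] sending, [2] waiting (time to first byte), [3] receiving.
    public static long[] MeasurePhases(string host, int port)
    {
        var times = new long[4];
        var sw = new Stopwatch();
        using (var client = new TcpClient())
        {
            sw.Restart();
            client.Connect(host, port);                    // Connecting
            times[0] = sw.ElapsedMilliseconds;

            var stream = client.GetStream();
            var request = Encoding.ASCII.GetBytes(
                "GET / HTTP/1.0\r\nHost: " + host + "\r\n\r\n");
            sw.Restart();
            stream.Write(request, 0, request.Length);      // Sending
            times[1] = sw.ElapsedMilliseconds;

            sw.Restart();
            stream.ReadByte();                             // Waiting: first byte
            times[2] = sw.ElapsedMilliseconds;

            sw.Restart();
            while (stream.ReadByte() != -1) { }            // Receiving: the rest
            times[3] = sw.ElapsedMilliseconds;
        }
        return times;
    }

    static void Main()
    {
        // Loopback stand-in server so the example runs self-contained.
        var server = new TcpListener(IPAddress.Loopback, 8124);
        server.Start();
        Task.Run(() =>
        {
            using (var c = server.AcceptTcpClient())
            {
                var s = c.GetStream();
                var rd = new StreamReader(s);
                string line;                               // consume request headers
                while ((line = rd.ReadLine()) != null && line.Length > 0) { }
                var reply = Encoding.ASCII.GetBytes("HTTP/1.0 200 OK\r\n\r\nhello");
                s.Write(reply, 0, reply.Length);
            }
        });

        var t = MeasurePhases("127.0.0.1", 8124);
        Console.WriteLine("connect {0}ms send {1}ms wait {2}ms receive {3}ms",
            t[0], t[1], t[2], t[3]);
        server.Stop();
    }
}
```

This still cannot time "Blocking" (a browser-side queueing concept) and, as noted above, DNS lookup has to be timed separately around Dns.GetHostEntry.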
0
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 40364378
@julian is right - the data can vary quite a bit. I have two ISPs here; one connects at 10Mbps and the other at over 100Mbps. That's a factor-of-10 difference on the download part alone.
0
 
LVL 52

Expert Comment

by:Julian Hansen
ID: 40364579
I am not sure what you would achieve by micro-managing the bits. There are so many variables in the mix - network speed, congestion, routing (number of hops), resource utilisation on the DNS server, and so on - that they will obscure any results you capture on the individual parts.

One has very little control over the DNS side of things - unless you have a bunch of very tech-savvy users who go around entering custom DNS settings instead of using the ISP-served default, you are going to have very little control over the lookup process. The variables will differ on a minute-to-minute basis and from one location to the next.

I am sure there are many solutions for capturing the information you want, but if this is for SEO purposes it might be a bit of overkill.
0
 

Author Closing Comment

by:stephenwilde
ID: 40364774
Thanks, all, for your contributions in dealing with these issues.
0
 
LVL 52

Expert Comment

by:Julian Hansen
ID: 40365366
You are welcome - thanks for the points.
0
