I am using QueryPerformanceCounter to measure how long a function takes (transforming XSL templates), and applying that value to a PerformanceCounter object (AverageTimer32) using the following pseudo-code:
_perfmonAvg = New PerformanceCounter("PG", "TimeToTransform", Me.ID.ToString(), False)
_perfmonBase = New PerformanceCounter("PG", "TimeToTransformBase", Me.ID.ToString(), False)
_perfmonAvg.RawValue = 0
_perfmonBase.RawValue = 0

Private Sub StartTime()
    QueryPerformanceCounter(_startTime) ' capture the starting tick count
End Sub

Private Sub UpdatePerformance()
    Try
        Dim _end As Int64
        QueryPerformanceCounter(_end) ' capture the ending tick count
        _timetotransform = _end - _startTime
        _perfmonAvg.IncrementBy(_timetotransform) ' accumulate elapsed ticks
        _perfmonBase.Increment() ' count one completed operation
    Catch oE As Exception
        ' log and ignore timing failures
    End Try
End Sub
When I look at the performance counter, I get a value of .778 if the scale is set to 10,000.
I've looked at http://support.microsoft.com/default.aspx?scid=kb;en-us;306978, but it doesn't answer my questions:
1) What unit of time does the .778 value represent (ms, microseconds, ticks)?
2) Do I need to divide by QueryPerformanceFrequency to convert the value to seconds/ms?
3) What is the proper scale setting to make the counter read in ms?
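For question 2, here is my current understanding of the conversion, sketched in Python purely for the arithmetic (the frequency and tick values below are made-up example numbers; the real frequency has to be read once via QueryPerformanceFrequency):

```python
# Sketch: converting QueryPerformanceCounter ticks to milliseconds.
# frequency_hz is a hypothetical example; the actual value must come
# from QueryPerformanceFrequency and varies by machine.
frequency_hz = 10_000_000      # example counter frequency (ticks per second)
start_ticks = 1_000_000        # example QueryPerformanceCounter reading at start
end_ticks = 1_007_780          # example reading at end

elapsed_ticks = end_ticks - start_ticks
elapsed_seconds = elapsed_ticks / frequency_hz
elapsed_ms = elapsed_seconds * 1000.0   # 0.778 ms for these example numbers
print(elapsed_ms)
```

If that is right, then a raw counter value is only meaningful once divided by the frequency, which would explain why the displayed number depends on the scale setting.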