I have the following problem -
I am trying to measure how much time passes between one point in my code and another.
I take the time before the relevant part of the code starts with "long l = System.currentTimeMillis()", and at the end of that code I subtract l from a fresh call to System.currentTimeMillis().
This happens many times in my program. Most of the time the subtraction gives 0, and rarely it gives 10.
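For concreteness, the measurement pattern described above looks roughly like this (the class name and the placeholder task are illustrative, not my actual code):

```java
public class TimingExample {
    // Measures how long the given task takes using System.currentTimeMillis(),
    // mirroring the approach described above.
    static long timeMillis(Runnable task) {
        long l = System.currentTimeMillis();   // start timestamp
        task.run();                            // the code being measured
        return System.currentTimeMillis() - l; // often 0 for very short work
    }

    public static void main(String[] args) {
        // A tiny amount of work: this frequently reports 0 ms elapsed.
        long elapsed = timeMillis(() -> Math.sqrt(12345.0));
        System.out.println("elapsed ms: " + elapsed);
    }
}
```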
I assume that the effective resolution of "System.currentTimeMillis()" is about 10 ms on my system - that is, the clock only updates in roughly 10 ms steps, so any interval shorter than a tick comes out as 0. Is that right?
Because of this, my results are far too coarse to be useful.
Any suggestions?
(I already tried multiplying the result by 1000 and converting it to double - neither helps, since that can't recover precision the measurement never had.)