This one's got me tearing my hair out...
I have a set of classes that increment values, depending on the data type. A cut-down version of one of those classes is below:
public class IncrementDoubleValue : IIncrementValue
{
    private double startValue;
    private double stepValue;
    private double maxValue;
    private double currentValue;

    public object GetValue()
    {
        if (currentValue >= maxValue)
        {
            currentValue = startValue;
        }
        currentValue += stepValue;
        return currentValue;
    }
}
The idea is simply to increment a double by the stepValue until it reaches the maxValue, at which point it starts again.
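For reference, the same wrap-around increment can be sketched as a standalone function (Python here rather than C#, but both use IEEE 754 binary64 doubles, so the arithmetic behaves identically; the names are mine, not from the class above):

```python
def make_incrementer(start, step, maximum):
    """Return a function mimicking GetValue(): step upward, wrap at maximum."""
    current = start

    def get_value():
        nonlocal current
        if current >= maximum:
            current = start   # reached the cap: wrap back to the start
        current += step       # then advance by one step
        return current

    return get_value
```

With a step like 0.5, which is exactly representable in binary, the sequence cycles cleanly; with 0.001 it does not, which is where the trouble starts.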
I was getting some strange results and started tracing this through. My values were as follows:
Therefore I would expect to get: 1.0, 1.001, 1.002, 1.003, etc
However, instead I got:
Note how the last one also dropped by 2 in the final decimal place.
These are all exact values I've copied from the trace. Can anyone put me out of my misery and explain why this is happening?
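(A quick way to see the effect I seem to be hitting: 0.001, like 0.1, has no exact binary representation as a double, so each `+=` adds a tiny rounding error that accumulates. The snippet below uses 0.1 for a well-known illustration, plus Python's `decimal` module as one possible exact-decimal alternative; C# doubles behave the same way, since both are IEEE 754 binary64.)

```python
from decimal import Decimal

# Repeatedly adding 0.1 in binary floating point drifts away
# from the exact decimal sum.
x = 0.0
for _ in range(10):
    x += 0.1
print(x)   # not exactly 1.0

# Decimal arithmetic keeps the decimal step exact, so no drift.
d = Decimal("0.0")
for _ in range(10):
    d += Decimal("0.1")
print(d)   # exactly 1.0
```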