if (CurrentBar == 0)
    Value.Set(Input[0]);
else
{
    double last = Value[1] * Math.Min(CurrentBar, Period);
    if (CurrentBar >= Period)
        Value.Set((last + Input[0] - Input[Period]) / Math.Min(CurrentBar, Period));
    else
        Value.Set((last + Input[0]) / (Math.Min(CurrentBar, Period) + 1));
}
}
For one thing, it appears there is one unnecessary Math.Min call: inside the CurrentBar >= Period branch, Math.Min(CurrentBar, Period) always evaluates to Period anyway. Correct me if I am wrong.

My main question is: how does this accomplish an averaging? Is there a looping action implied there somehow?
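To check my own understanding, here is how I would translate the logic into plain Python (the function name and list-based series are my own stand-ins for Value/Input, and `inputs[bar - period]` is my reading of `Input[Period]`, i.e. the input from Period bars ago). If I read it right, there is no hidden loop: multiplying the previous average `Value[1]` by the window size reconstructs the previous *sum*, then one input is added (and, once the window is full, the oldest one subtracted) before dividing again.

```python
def rolling_sma(inputs, period):
    """Incremental SMA mirroring the quoted NinjaScript logic (my sketch)."""
    values = []
    for bar, x in enumerate(inputs):
        if bar == 0:
            values.append(x)
        else:
            n = min(bar, period)
            last = values[-1] * n  # previous average * count = previous sum
            if bar >= period:
                # window is full: drop the input that falls out of it
                oldest = inputs[bar - period]
                values.append((last + x - oldest) / period)
            else:
                # window still growing: divide by the new, larger count
                values.append((last + x) / (n + 1))
    return values

print(rolling_sma([1, 2, 3, 4, 5], 2))  # [1, 1.5, 2.5, 3.5, 4.5]
```

So each bar does O(1) work: the "averaging over Period bars" is carried forward implicitly in the previous value rather than recomputed with a loop. Does that match what the original code is doing?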