Value.Set(-100 * (MAX(High, Period)[0] - Close[0]) / (MAX(High, Period)[0] - MIN(Low, Period)[0] == 0 ? 1 : MAX(High, Period)[0] - MIN(Low, Period)[0]));
to
Value.Set(Close[0]);
Now when I plot the indicator on a chart it looks fine, but when I call it in NinjaScript from OnBarUpdate(), e.g. Print(MyIndicator(10)[0]), it always prints zero. If I use WilliamsR(10)[0] in the same context, I get normal values. Why is this?