I've got a series of data in a DataSeries object covering the last 17 periods, and I want to find the standard deviation of the FIRST 9 periods. When I enter the values into the statistical calculator in Windows, the Sum and Average match what I'm getting in my output, but there is a small difference in the standard deviation, and I'm at a loss to explain why.
The data in question: the DataSeries is called "MINORS" and the period values are as follows:
[0] = 24; [1] = 34; [2] = 38; [3] = 37; [4] = 37; [5] = 32; [6] = 30.2;
[7] = 48.1; [8] = 47;
The StdDev call: Ds = StdDev(MINORS, 9)[0];
The Output Window reports the StdDev as 7.2252, but the calculator gives 7.6635. Why the difference?
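For reference, here is a small standalone check in plain C# (ordinary console code, not platform script; every name in it is my own) that computes the standard deviation of the nine values both ways, dividing the sum of squared deviations by n and by n - 1:

using System;
using System.Linq;

class StdDevCheck
{
    static void Main()
    {
        // The nine MINORS values listed above.
        double[] values = { 24, 34, 38, 37, 37, 32, 30.2, 48.1, 47 };

        double mean = values.Average();                          // 36.3667, matches both tools
        double sumSq = values.Sum(x => (x - mean) * (x - mean)); // sum of squared deviations

        // Population standard deviation: divide by n.
        double population = Math.Sqrt(sumSq / values.Length);

        // Sample standard deviation: divide by n - 1 (Bessel's correction).
        double sample = Math.Sqrt(sumSq / (values.Length - 1));

        Console.WriteLine("Population (/n):   " + population.ToString("F4")); // matches the Output Window (to rounding)
        Console.WriteLine("Sample (/(n - 1)): " + sample.ToString("F4"));     // matches the calculator (to rounding)
    }
}

Dividing by n reproduces the Output Window figure and dividing by n - 1 reproduces the calculator's, so I suspect the two tools simply use different conventions (population versus sample standard deviation), but I'd appreciate confirmation.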