I have a problem with an indicator I'm working on.
Basically, my indicator formula looks like the following:
Myindicator.Set(varA*varB)
where varA and varB are double types (very basic). The thing is, when I run it, it does not plot anything. I checked the Output window to pinpoint the problem, and it appears that it comes from varA, which is always equal to 0.
I think the problem comes from the type of varA. Since the formula looks like Constant/(Period*(Period-1)*(Period+1)), it obviously returns 0.0 when Period is set to 20, for example. The correct value is 0.00000xxx.
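For what it's worth, this symptom is what integer division produces: if Constant and Period are both ints, the division truncates toward zero before the result is ever stored in the double. Here is a minimal sketch of that arithmetic in Java (the indicator itself is NinjaScript/C#, where the same truncation rule applies; the values 6 and 20 are illustrative, not from the original post):

```java
public class IntDivisionDemo {
    // Every operand is int, so the division truncates toward zero:
    // e.g. 6 / 7980 == 0, and only then is 0 widened to 0.0.
    static double truncated(int constant, int period) {
        return constant / (period * (period - 1) * (period + 1));
    }

    // Casting one operand to double forces floating-point division,
    // so the small fractional result survives.
    static double precise(int constant, int period) {
        return (double) constant / (period * (period - 1) * (period + 1));
    }

    public static void main(String[] args) {
        System.out.println(truncated(6, 20)); // 0.0
        System.out.println(precise(6, 20));   // a small nonzero value, ~7.5e-4
    }
}
```

In C# the fix is the same shape: cast the numerator, e.g. `(double) Constant / (Period * (Period - 1) * (Period + 1))`, or declare the constant as a double literal.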
So I tried changing the value type to decimal in order to get the precision I need, but when I do, I'm forced to change the type of varB to decimal as well, and of course my indicator won't compile, since DataSeries only holds double values.
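As an aside, decimal precision shouldn't be needed here: a double carries roughly 15-16 significant decimal digits, which is ample for a value on the order of 0.0000x. A small Java sketch comparing double division against BigDecimal (Java's rough analogue of C#'s decimal; the operands 6 and 7980 = 20*19*21 are illustrative):

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class DoubleVsDecimal {
    public static void main(String[] args) {
        // Plain floating-point division: the tiny result is represented fine.
        double asDouble = 6.0 / 7980.0;

        // The same quotient computed in decimal arithmetic for comparison.
        BigDecimal asDecimal =
            new BigDecimal(6).divide(new BigDecimal(7980), MathContext.DECIMAL64);

        System.out.println(asDouble);
        System.out.println(asDecimal);
        // Both agree to ~15 significant digits, so switching the
        // indicator's variables away from double buys nothing here.
    }
}
```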
I don't know what to do anymore. Could anyone help me?
By the way, why can't I print the values in the Output window when my variables are defined as decimals?