I've been backtesting and optimizing some strategies I've coded. I've run hundreds of simulations using tick charts and everything has been fine so far. For this particular strategy I wanted to look at the behaviour over several time frames (1 min, 2 min, 5 min, 10 min, etc.).
For some reason, when I try to run an optimization over the 1 min series the system just crashes: RAM usage skyrocketed and I had to force-restart my laptop.
When I tried the optimization over the 2 min series something similar happened, but this time on the last simulation, as shown in the attached picture. Ninja does not finish the simulation, nor does it generate the log in the analyzerlog folder.
I'm totally lost here, since it's the same code that has been running fine on the tick-based series.
I would appreciate any light you can shed on this issue.
Thanks in advance,