NT8 Strategy Analyzer never finishes


    Version: NT 8.0.26.1 (OS: Windows 10 x64 Enterprise, Version 21H2, Build 19044.1826)

    As of last Friday evening (U.S. Central Time), July 23rd, 2022, right around the time of the Kinetick outage (the configuration issue with DTN servers / logon APIs), I have been unable to run a successful backtest, one that I was able to run without a problem just prior to that issue.

    I've tried restoring backups of my NT8 setup (I export a full NT backup to an external server daily), cleaned the db cache, refreshed the .sdf file (no corruption issues), and verified my historical data is intact. I'm backtesting quite a bit of data (going back to 2006) and realize it typically takes a while to cache the historical data; however, the Strategy Analyzer makes no progress even after 48 hours of running. It shows "Running backtest on XYZ ..." but never completes or prints output as specified in the script, like it used to.

    There are no log or trace files showing any error or issue with the backtest whatsoever.

    What I did notice is that the db 'cache' folder is no longer populating like it used to. When a backtest ran successfully in the past (as recently as last week, on the entire 16.5+ year dataset), the db cache folder would grow to roughly the size of the historical data being processed (about 23 GB). For reference, the full backtest usually took between 1 and 4 hours, depending on whether the data was already cached.

    Now the db cache folder never grows beyond 1.68 GB and the Strategy Analyzer never completes. To reiterate, I've verified the database contains all the necessary contract data (nothing is missing or has changed there). Looking through the "smaller" db cache folder, all the subfolders for the dataset (i.e. the tick and minute contract-period folders) appear to be present, yet the folder is only 1.68 GB vs. ~23 GB. Clearly there is an issue I'm not seeing, and I need some assistance to get backtests running again.
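For anyone wanting to reproduce the size comparison above, a minimal Python sketch that walks the cache folder and prints per-subfolder totals (the `Documents\NinjaTrader 8\db\cache` path is an assumption about a default install; adjust to your setup):

```python
import os

def folder_size_bytes(root: str) -> int:
    """Total size of all files under root, in bytes."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path):
                total += os.path.getsize(path)
    return total

def report_cache(root: str) -> None:
    """Print per-subfolder sizes so an abnormally small cache stands out."""
    for entry in sorted(os.listdir(root)):
        sub = os.path.join(root, entry)
        if os.path.isdir(sub):
            print(f"{entry}: {folder_size_bytes(sub) / 1024**3:.2f} GB")
    print(f"TOTAL: {folder_size_bytes(root) / 1024**3:.2f} GB")

if __name__ == "__main__":
    # Assumed default cache location; change to match your install.
    cache = os.path.expanduser(r"~\Documents\NinjaTrader 8\db\cache")
    if os.path.isdir(cache):
        report_cache(cache)
```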

    On another note, this has no relation to performance constraints: my system runs a stable, overclocked, watercooled 64-core Threadripper with 256 GB RAM, all SSDs, and dual 2080 Ti GPUs, and NT barely uses a tenth of those resources at peak. However, NT8 used to consume more memory while a backtest ran successfully; now memory usage tops out around 10 GB RAM (again, it appears not to be caching the historical data needed to actually run the backtest; it used to take up to 40 GB RAM to finish the task, which was the expected behavior when caching the dataset).

    Need help from Dev/Eng!!! Thanks.

    #2
    NinjaTrader_PaulH -- Is someone able to assist me here? Thanks



      #3
      After a thorough evaluation with Procmon (from the Sysinternals Suite), I noticed that one of my volumetric data series added in 'State.Configure' was set to a much smaller tick interval (less than 200). Since the Strategy Analyzer has to read in each value before writing it to the cache on the filesystem (a lot of costly file I/O), the run was going to take an extremely long time despite fast M.2 SSDs. Once I commented out that data series, the Analyzer finished within the target duration. My issue is essentially resolved.
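To see why a small tick interval balloons the cache work, here is a rough, illustrative model in Python. The numbers and the linear-scaling assumption are mine, not measured from NT8's internals: the idea is simply that a volumetric bar stores one row per price level, so rows per bar scale inversely with the level size.

```python
def levels_per_bar(price_range_ticks: int, ticks_per_level: int) -> int:
    """Rough model: rows stored per volumetric bar = price range / level size."""
    return -(-price_range_ticks // ticks_per_level)  # ceiling division

# Illustrative numbers only: a bar spanning 2000 ticks of price range.
coarse = levels_per_bar(2000, 200)  # 10 rows per bar
fine = levels_per_bar(2000, 5)      # 400 rows per bar

# Multiplied across every bar in a 16.5-year dataset, the fine-grained
# series generates ~40x the rows to read and write to the cache.
print(f"coarse={coarse} fine={fine} ratio={fine / coarse:.0f}x")
```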

      To whom it may concern: from a cursory view it's difficult to make out NT8's architecture, but I would imagine there are faster ways to handle the DB operations, caching, and large-scale data management overall. Perhaps those faster approaches simply aren't feasible, or NT8 is too monolithic and would require a complete overhaul; I'm not sure. Nevertheless, it would be nice to have a more robust DB architecture and data-management service that could take advantage of processor groups, as well as more in-memory operations when RAM is determined to be sufficiently large (something like Hazelcast seems like it would benefit NT). The platform needs a significant speed boost to make the leap from retail to enterprise-grade; NT has a lot of potential but leaves a lot to be desired. I hope developments like this are on the roadmap.
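The RAM-first data service the post is wishing for can be sketched in miniature. This toy cache (names are purely illustrative, not NT8 internals) keeps hot series in a dict and only touches the filesystem on a miss or a write:

```python
import os
import pickle
from typing import Any

class SeriesCache:
    """Toy RAM-first cache: an in-memory dict backed by on-disk pickle files."""

    def __init__(self, disk_dir: str):
        self.disk_dir = disk_dir
        self.ram: dict[tuple[str, str], Any] = {}
        os.makedirs(disk_dir, exist_ok=True)

    def _path(self, key: tuple[str, str]) -> str:
        return os.path.join(self.disk_dir, f"{key[0]}_{key[1]}.pkl")

    def put(self, key: tuple[str, str], series: Any) -> None:
        self.ram[key] = series                    # hot copy stays in RAM
        with open(self._path(key), "wb") as f:    # durable copy on disk
            pickle.dump(series, f)

    def get(self, key: tuple[str, str]) -> Any:
        if key in self.ram:                       # fast path: no file I/O
            return self.ram[key]
        with open(self._path(key), "rb") as f:    # miss: load from disk
            series = pickle.load(f)
        self.ram[key] = series                    # warm it for next time
        return series
```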

      Cheers.

