

Data Storage Flexibility & Better Compression


    Hi,
    I would like to know whether NT8 will change the inflexible data storage options (alternate partitions, etc.) and the seemingly poor compression used in NT7.

    I would also like to know if there will be any way to create add-ins that handle Historical Data Management and Replay data in a better way. (I was very pleased to hear that replay of historical data will be possible).

    I know there must be commercial reasons for not exposing more of the API surface area. However, the Import/Export/Archiving integration with external systems is absurdly constraining for all but the most trivial of "retail day-trader" operations.

    Perhaps the thought is that more serious clients may gravitate to cloud-based services at some point in the distant future. But given the glacial progress on this new desktop platform, I won't be holding my breath. (Yes, I know it's not easy, I'm a professional developer myself).

    As for compression, there is absolutely no excuse for the amount of space currently used by NT7.

    I can store a million ticks (DateTime, Double) in less than one megabyte (roughly a 17x ratio). It encodes in about 100 ms and decodes in about 50 ms on an outdated 4-core i5 machine.
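For the curious, here is a minimal sketch of why tick series compress so well. This is plain Python using delta encoding plus zlib, not the DeltaCodec algorithm itself, and all names and numbers are illustrative:

```python
# Illustrative sketch only -- not the DeltaCodec implementation.
# Consecutive tick timestamps and prices change very little, so storing
# differences instead of absolute values leaves a stream of tiny integers
# that a general-purpose compressor shrinks far below 16 bytes per tick.
import random
import struct
import zlib

def delta_encode(values):
    """First value raw, then successive differences."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def pack(a, b):
    """Interleave two equal-length int lists as little-endian int64."""
    flat = [v for pair in zip(a, b) for v in pair]
    return struct.pack(f"<{len(flat)}q", *flat)

# 100,000 synthetic ticks: timestamps ~1 ms apart (in microseconds),
# prices moving by at most 2 ticks per trade.
rng = random.Random(42)
n = 100_000
ts = [0]
px = [100_000]
for _ in range(n - 1):
    ts.append(ts[-1] + rng.randint(900, 1100))
    px.append(px[-1] + rng.randint(-2, 2))

raw = zlib.compress(pack(ts, px), 9)
deltas = zlib.compress(pack(delta_encode(ts), delta_encode(px)), 9)
print(f"absolute values compressed: {len(raw):,} bytes")
print(f"delta-encoded compressed:   {len(deltas):,} bytes")
```

The delta-encoded stream compresses far below the raw 16 bytes per tick, which is the basic effect any purpose-built tick codec exploits.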

    I won't even go into the network latency incurred when downloading the data!

    If you need a little help in that area, or you just want to see the performance comparisons for yourself, help yourself to my open-source code:

    http://deltacodec.codeplex.com

    I plan on adding tests that show the compressed size of data exported from NT7 when I get a chance.

    Anyway, sorry for the rant. I think NT8 looks really great. I will definitely be converting to the lifetime license when it finally gets released, even if I have to hack together my own data management drivers (using System.Windows.Automation).

    Or maybe you should just hire me as a data architecture consultant.
    Last edited by bstabile; 07-07-2015, 03:32 PM.

    #2
    Thanks for sharing.

    You can place the NinjaTrader 8 user directory on another drive using a command-line switch (http://ninjatrader.com/support/forum...45&postcount=9).

    There are now supported "Import Types" which you could use to read from your own data files and import into the NinjaTrader local repository. You also have access to a BarsRequest class, which allows you to create a request that pulls either from the local repository or from a historical data server, and you could theoretically create your own export process using this class. But this still won't help if your goal is to change the default way the data is stored on disk.

    There are no changes to the compression in NinjaTrader 8. We use a proprietary format to store the data, and we're happy with its performance compared to our competition.
    Matthew, NinjaTrader Product Management



      #3
      Thanks Matthew,

      I am pleased to know that I'll be able to save my limited SSD space for more vital uses and point to that expansive terabyte drive that has plenty of room.

      It never made sense to me to have to shift the entire "Documents" folder (which could disrupt any number of other applications).

      On some of my machines I have half a dozen drives installed for performance reasons (SQL Server, etc.) and I sometimes have partitions defined all the way up to "X:" (great for disk imaging purposes in VM environments).

      I also mainly focus on tick data, so the amount of storage really matters when lots of active instruments are involved.

      The compression issue is something I have had to deal with in genetic programming scenarios where trading systems (genomes) are optimized by training them on clustered server nodes with constantly changing subsets of historical data (a few hundred thousand ticks at a time). Obviously, network latency makes a big difference, especially in a cloud-based ecosystem. But even when reading and writing to disk it is significant compared to encoding/decoding in memory.
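      As a back-of-envelope illustration of that trade-off (every figure below is an assumption, using the ~17x ratio and codec timings I mentioned earlier and a hypothetical 100 Mbit/s link, not a measurement of any real system):

```python
# Back-of-envelope comparison: shipping 300,000 raw ticks to a worker
# node versus shipping them compressed and paying the codec cost.
# Every figure here is an assumption, not a measurement.
ticks = 300_000
raw_bytes = ticks * 16                  # 8-byte timestamp + 8-byte double
ratio = 17                              # compression ratio cited above
compressed_bytes = raw_bytes / ratio

link_bytes_per_s = 100e6 / 8            # hypothetical 100 Mbit/s link
raw_ms = raw_bytes / link_bytes_per_s * 1000
compressed_ms = compressed_bytes / link_bytes_per_s * 1000
codec_ms = 150 * ticks / 1_000_000      # ~100 ms encode + ~50 ms decode per 1M ticks

print(f"raw transfer:                {raw_ms:.0f} ms")
print(f"compressed transfer + codec: {compressed_ms + codec_ms:.0f} ms")
```

      Even after paying for encoding and decoding, the compressed path wins comfortably on that kind of link, and the gap only widens as the link slows or the node count grows.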

      http://branecloud.codeplex.com

      Anyway, I appreciate your quick response, and I look forward to the upcoming release.

      Ben
      Last edited by bstabile; 07-08-2015, 10:50 AM.
