Should not delete data from local DB


    I am having a hard time using the best improvement in v7, the historical data management capabilities. I have found gaps in the historical data stored locally on my machine, which is built up over time from my broker's feed. I finally understand how this happens, and hope this behavior will be corrected in the future. Please bear with me as I try to explain how NT 7 behaves now.

    This becomes very critical when trying to work with tick data, since so little (about two weeks' worth) of historical data is available on the server. If you lose some of the history you have built up in your local NT database, it must be frequently rebuilt from backup archives, which is very labor-intensive; so much so that working with tick data is almost not worth the trouble. But the problem exists with minute data as well.

    Here's how NT v7 behaves now:

    1. As you run NT through the day, it builds up a local database in the Historical Data Manager.

    2. My broker's server holds about 6 months of minute data and 2 weeks of tick data. Think of it as a 6-month or 2-week moving window: as time moves on, the window advances to the right, and older data drops out of view, because minute data older than 6 months and tick data older than 2 weeks drop off the broker's server.

    3. No problem, right? You have that data stored in your local NT database, right? Not so fast. It is there, and it should always remain there, but NT will delete from the local DB any date folder, and all of its contents, if you ask it to download a date that is currently in the local DB but has dropped off the broker's server.

    NinjaTrader should never delete existing data from the local database, at least not without asking if that is what you want to do. If the requested download date is no longer on the broker's server but does exist on the local machine, it should at least preserve any data that may already be in the local DB.

    The way NT behaves now, if you download a date that already exists in the local database, NT overwrites the old folder with the new one. But if no data for that date exists any more on the broker's server, it deletes the local DB's data for that date.

    Then that data is gone forever.
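    The overwrite-then-delete behavior described above can be sketched roughly as follows (a Python illustration with hypothetical names of the behavior reported in this thread, not NinjaTrader's actual code):

```python
def sync_date(local_db, server, date):
    """Illustration of the reported NT7 behavior: the local folder for
    `date` is replaced wholesale by whatever the server returns."""
    server_data = server.get(date)   # None once the date scrolls off the server
    local_db.pop(date, None)         # the old local folder is removed first...
    if server_data is not None:
        local_db[date] = server_data # ...and replaced only if data came back
    # If the server returned nothing, the local copy is now gone for good.

local_db = {"2010-01-04": ["tick data"]}
server = {}                          # the date has dropped off the broker's server
sync_date(local_db, server, "2010-01-04")
print(local_db)                      # {} -- local data silently deleted
```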

    It gets very difficult to keep track of what date and time of day it is and what is left on the broker's server, and over time, gaps begin to appear in the local database's dataset as pieces get deleted in the above manner. At that point the local database is useless.

    Long story short: NT needs to handle its historical data in a true BACKFILL manner. The best-case scenario would be for NT to compare what data is missing from the local database and FILL IN only what is needed from the broker's server.
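    The best-case comparison described above could look something like this (a Python sketch with hypothetical names, not an actual NinjaTrader feature): compute which requested dates are missing locally, and download only those the server still has, leaving existing local data untouched:

```python
def backfill(requested_dates, local_dates, server_dates):
    """Sketch of the suggested behavior: download only the dates that are
    missing locally AND still available on the server; never touch data
    that already exists in the local database."""
    missing = set(requested_dates) - set(local_dates)
    to_download = sorted(missing & set(server_dates))
    unrecoverable = sorted(missing - set(server_dates))  # gaps nobody can fill
    return to_download, unrecoverable

# The server holds a short moving window; the local DB already has older dates.
to_dl, gaps = backfill(
    requested_dates=["d1", "d2", "d3", "d4"],
    local_dates=["d1", "d2"],   # kept as-is, never deleted
    server_dates=["d3", "d4"],  # current server window
)
print(to_dl)  # ['d3', 'd4']
print(gaps)   # []
```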

    Failing that, it would be better to have a warning window pop up and inform the user that the requested data no longer resides on the broker's server, and that continuing the operation will result in the loss of any locally stored data for the date(s) requested.

    I am still not sure how this all works with Reload All Historical Data while Get Data From Server is enabled.

    It is common to have NT off for a couple of hours, start it up, and want a BACKFILL. That is, it should work like this: first load all data from my local database, determine where that data ends and what the current time is, then go out to the broker's server and download only the data that FILLS IN the gap between where my data ends and the current time.
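    That start-up backfill idea can be sketched as follows (Python, hypothetical names; an illustration of the suggested behavior, not NinjaTrader's actual logic):

```python
from datetime import datetime, timedelta

def tail_backfill_range(last_local_bar: datetime, now: datetime):
    """Sketch of a start-up backfill: request only the span between the
    last bar already stored locally and the current time."""
    if last_local_bar >= now:
        return None                  # local data is already up to date
    return (last_local_bar, now)     # fetch exactly this gap, nothing more

last_bar = datetime(2011, 3, 1, 14, 0)
now = datetime(2011, 3, 1, 16, 30)   # NT was off for a couple of hours
start, end = tail_backfill_range(last_bar, now)
print(end - start)                   # 2:30:00 -- only the gap is downloaded
```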

    This has been confusing me for some time, and I am still not sure if the data overwriting/deleting described above happens during Reload All Historical Data while Get Data From Server is enabled.

    It certainly does delete the data when you use the Historical Data Manager, select Download, pick a date you already have, and attempt to download data that has now scrolled off the broker's server.

    #2
    Crassius, thank you for the suggestion, I will pass it along.
    Austin, NinjaTrader Customer Service



      #3
      Hello,

      Please see this information here:

      When does NinjaTrader load historical data?



      This describes our data loading rules in NinjaTrader exactly as they occur.

      Most likely you are running into case #4:

      4. Chart of 12/28/07 to 1/5/08 -> load data request for all dates

      Most likely you are pulling up a chart that goes back farther than the earliest data in your database. In this scenario, NinjaTrader loads data from the data feed provider over your database.

      This is explicit NT design currently, and due to other implications there are no changes planned on this near term, as there is no logic to grab chunk A from the server, chunk B from the database, chunk C from the server, etc.

      You have a couple of options in this environment if you need to keep this data.

      1) Stay within the data loading rules so that you do not cause an overwrite of your database data.
      2) Connect to a third-party data feed provider such as www.kinetick.com that has more historical data available off the server to fit what you need, and use this data instead.


      If you stay with 1), I would recommend exporting your data from time to time so that, in case you make a mistake, go outside the data loading rules, and cause data to load over your database, you can reimport it back in.



      Or use the File->Utilities->Backup option.

      Let me know if I can be of further assistance.



        #4
        Thanks for the replies

        Yes, Brett, that is exactly the condition (#4) that will erase data that exists in the local database if you press Reload All Historical Data while Get Data From Server is enabled.

        I understand how the behavior is happening.

        Paying attention to how the software design actually works for users in the real world, and adjusting the design based on that feedback, is good policy. I am sure you would agree.

        The way I use your software is to set up a workspace with charts that create a work environment. I save that workspace and back it up nightly, as the design anticipates a user would.

        Eventually my saved charts will contain data that goes farther back in time than the data available on the server. So the current design creates extra labor, requiring me to periodically either delete the chart I have been working on for months or adjust its data series dates.

        If I don't remember to do this, or I miscalculate when the data on the server will no longer extend back far enough to cover the date range on my chart, NinjaTrader will delete data from my local database, requiring even more labor to rebuild the local database from backups.

        I am suggesting that this is laborious and irritating to the user, and asking for at least a warning popup confirming that continuing the operation will delete data from the local database.

        I understand that means NinjaTrader would have to somehow query the server and know, before it loads the data, that the date range requested is larger than what resides on the server.

        I suggest, from a user standpoint, given the way your software is used in the real world, that its current design allowing deletion from the local database is a drawback for NinjaTrader.

        Yes, subscribing to a paid data service with a longer lookback would get around this problem. Since spot FX is not traded on a centralized exchange, however, and instead various broker-specific pools exist with pricing differences between them, anyone who wants to chart the spot FX data he is actually trading must chart his broker's liquidity pool.

        eSignal offers MB Trading's pool, which would achieve the aim of charting exactly what I'm trading. I could subscribe to eSignal and get the data I need. Kinetick does not offer MB's pool. However, if I were going to pay eSignal for its data, I would not want to pay NinjaTrader for its software, but would use eSignal's at a lower overall cost.

        Anyway, just some feedback from a user in the real world to people who design software.



          #5
          Thank you for the suggestions, Crassius, and for sharing your experience working with our platform. I will forward this to our development team.
          Ryan M., NinjaTrader Customer Service



            #6
            I just encountered this exact same problem. NT7 deleted a large amount of my historical data without any warning.

            Here are my comments:

            1. NT7 should NEVER delete anything!

            2. Deleting data must only be initiated by the user and there should be a warning before the actual deletion takes place.

            3. When downloading historical data, NT7 should only overwrite existing data. It should never delete all the data first and then replace it with the downloaded data. The downloaded data may not fully replace all the data that was deleted.
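            A minimal sketch of the three rules above (Python, hypothetical names; an illustration of the proposed behavior, not NinjaTrader's actual code): only real server data may overwrite a local entry, and any deletion requires explicit user confirmation:

```python
def safe_update(local_db, server_data, date, confirm=lambda msg: False):
    """Sketch of the rules above: existing local data is only ever
    overwritten in place by real server data, and any deletion requires
    an explicit 'yes' from the user first."""
    if not server_data:
        # Nothing came back from the server: deleting needs consent.
        if date in local_db and confirm(
            f"Server has no data for {date}; delete the local copy?"
        ):
            del local_db[date]
        return
    local_db[date] = server_data  # overwrite in place, never delete-first

local_db = {"2010-01-04": ["ticks"]}
safe_update(local_db, None, "2010-01-04")  # default confirm answers "no"
print(local_db)  # {'2010-01-04': ['ticks']} -- local data preserved
```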


            Please fix this very dangerous flaw in NT7 programming ASAP!

            Thanks,

            CS



              #7
              I agree

              I've had to come up with an elaborate set of protocols to ensure backups exist for the inevitable deletions that occur.

              A warning screen saying that proceeding can result in deletion of data, with confirmation from the user before proceeding, is the LEAST that should be added to NinjaTrader.



                #8
                It happened again: NinjaTrader makes more work than it's worth

                I just posted this on my broker's forum. It explains a routine, annoying fact of life of using NinjaTrader.


                Granted, some of this has to do with NinjaTrader's designed-in automatic deletion of data from the local database when you request more data from the server than exists on the server, even if you already have some of that data stored in the local database. But damn it, every other broker provides downloadable files to patch historical data, for just such inevitable glitches.

                Here's what happens. Lately there has been 40 days' worth of tick data for EU on MB's server. I have my chart set to load 30 days' worth. But sometimes, for whatever reason, the length of the data available on MB's servers varies. There is no way for me to know it has changed from 40 days before I attempt to load a chart. Today, at around 10 PM EST, there is only two weeks' worth of EU tick data.

                So when I try to load a chart of 30 days of EU tick data, which is a standard chart on my working desktop, NinjaTrader goes out to MB's servers and requests 30 days' worth. Since there is only 14 days' worth, it loads that, and then stupidly deletes the balance of the 30 days I already had in the local database.

                It gives me 30 days of whatever is on the server; in this case it gets the 14 days that exist and then overwrites the balance of the requested days with nothing (deleting them).

                Now I have a gap in my local database, which I can only repair from a backup file, since the data is no longer on the MB server. I know, I know, I should religiously make my own backup files every day and store them for just such occasions, but damned if I didn't this time. Why should every user have to make their own backups rather than have a central location for them?

                Maybe that data will appear later in the week on MB's server, maybe not.

                Yet once again I request that MB make available to its customers tick data (from which minute data can be built) as downloadable files, just like every other broker in this industry already does.

                If the data doesn't reappear on MB's servers later this week, I'll go over to Dukascopy, download their data, and patch my MB data with it, but I would much rather have my own broker's data.

                __________________________________________________

                The crux of the problem is not knowing what is on the server until you download it, and NT deleting anything you already have if you request more than is available.

