Data warehouse management and data analytics have always faced the challenge of deciding what data to store and how long to keep it. This is even more relevant with today's Data Lakes and the ability to store ever-increasing volumes of information in the cloud at low cost. Additionally, new data types, such as IoT and social media feeds, have emerged that generate millions of records which may or may not be important to analytics at a later stage.
Regulatory compliance is often used as the benchmark for what data should be stored and for how long. Retention periods can range from a few years to decades, or indefinitely. In the airline industry, for example, aircraft maintenance records must be held for the life of the asset plus 7 years; for most aircraft that means records need to be retained for 20-30 years. The magic 7-year mark is another benchmark that is mainly relevant to financial and tax data but has been adopted as the regulatory retention policy in many other cases.
Whilst regulatory retention compliance is fairly clear-cut and well defined, what about the other data types that make up the majority of corporate data?
We can probably all agree that the value of raw data diminishes over time to the point that it is no longer relevant. Aggregated data, on the other hand, might never lose its value entirely. A good example is trend information such as stock prices or stock index histories, which remain somewhat relevant even after 100 years.
The problem is that we don't know whether data might be useful in the future. New analytics models may need to test new patterns and formulas over longer periods of time and under different market conditions. With that in mind, many data managers are reluctant to remove data, with the result that organisations often end up with large pools of stale data that are increasingly hard to manage.
A clearly defined data governance framework must be put in place. Its cornerstone is a clear understanding and profiling of the data that is available. Secondly, a retention policy must be defined. This should not be a sweeping statement across the organisation but should be set within each individual data domain.
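As an illustration, per-domain retention rules can be captured as simple metadata that archiving or purge jobs act on. The domain names, retention periods, and the should_retain helper below are hypothetical and purely illustrative, a minimal sketch of the idea rather than a prescribed standard.

```python
from datetime import date, timedelta

# Hypothetical per-domain retention rules (values are illustrative only).
# Each domain gets its own policy rather than one sweeping organisation-wide rule.
RETENTION_POLICIES = {
    "aircraft_maintenance": {"retention_years": None},       # life of asset + 7 years, handled separately
    "financial_transactions": {"retention_years": 7},        # common financial/tax benchmark
    "iot_sensor_raw": {"retention_years": 2},                # raw feeds lose value quickly
    "market_trends_aggregated": {"retention_years": None},   # aggregates kept indefinitely
}

def should_retain(domain: str, record_date: date, today: date | None = None) -> bool:
    """Return True if a record in the given domain is still within its retention period."""
    today = today or date.today()
    policy = RETENTION_POLICIES.get(domain)
    if policy is None or policy["retention_years"] is None:
        return True  # no policy or indefinite retention: keep the record
    cutoff = today - timedelta(days=365 * policy["retention_years"])
    return record_date >= cutoff

# Example: a 10-year-old raw IoT reading falls outside its assumed 2-year window.
print(should_retain("iot_sensor_raw", date(2015, 6, 1)))  # False under these assumed rules
```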
Some hard decisions need to be made on how long raw data should be held. As hard as it sometimes is to let go of data, the cost/benefit ratio often does not warrant further retention, even when the data is held in low-cost archive storage.
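To make the cost side of that ratio concrete, a rough back-of-the-envelope calculation is often enough. The price per GB-month below is an assumption broadly in the range of cloud cold-archive tiers, not a quote, and it ignores retrieval, egress and management overhead.

```python
# Rough cost estimate for keeping raw data in a cold archive tier.
# The price per GB-month is an assumed, illustrative figure; actual pricing varies.
ARCHIVE_PRICE_PER_GB_MONTH = 0.002  # USD, assumption

def annual_archive_cost(terabytes: float) -> float:
    """Approximate yearly storage-only cost (USD) for the given data volume."""
    gigabytes = terabytes * 1024
    return gigabytes * ARCHIVE_PRICE_PER_GB_MONTH * 12

# 500 TB of stale raw data kept "just in case":
print(f"${annual_archive_cost(500):,.0f} per year")  # ~$12,288 per year under these assumptions
```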
Let Fusion Professionals work with you to develop the most appropriate Data Governance framework and strategy for your organisation.
Achim Drescher is the Managing Consultant of the Big Data and Analytics Practice at Fusion Professionals.
With 30 years in the IT industry, he is an expert in enterprise software and data architecture, data governance frameworks, and modern analytics platforms for Big Data and Data Lakes.