Data Lifecycle Management, or how to keep control over each stage in data processing


Adopting a true data governance policy is no longer optional for any organisation: the reliability of its processes is at stake, along with the performance of the entire information system (IS) and the strategic analysis of business activities.
Such governance applies equally to data collection, dissemination and deletion: it must encompass the entire data lifecycle and neglect no part of it. Data Lifecycle Management (DLM) therefore responds not only to the need to assess data, but also provides cross-functionality, in that business departments must be able to enrich data once it has been collected so that it stays relevant within the information system.

Data Lifecycle Management: automate and supervise to keep better control over what happens to data

DLM is an approach whereby data is managed throughout its entire lifecycle, from collection to deletion. By breaking the lifecycle down into processes and automating each one, data moves from one stage to the next when the right conditions are met, while any enrichment is controlled and tailored to the situation. Data Lifecycle Management is a blend of applications, methods and procedures that takes each organisation’s particular features into account.
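To make the idea concrete, here is a minimal sketch in Python of such a rule-driven transition between stages. The stage names, fields and thresholds are our illustrative assumptions, not a fixed standard or any particular product's API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# The five lifecycle stages walked through later in this article.
STAGES = ["collected", "stored", "in_use", "archived", "deleted"]

@dataclass
class DataSet:
    name: str
    stage: str = "collected"
    last_accessed: datetime = field(default_factory=datetime.utcnow)

def advance(ds: DataSet, now: datetime) -> None:
    """Move a data set to its next stage when a transition rule holds."""
    idle = now - ds.last_accessed
    if ds.stage == "in_use" and idle > timedelta(days=365):
        ds.stage = "archived"      # unused for a year: archive it
    elif ds.stage == "archived" and idle > timedelta(days=365 * 7):
        ds.stage = "deleted"       # past retention: delete it securely
```

Run periodically over every data set, rules of this kind are what let data move from one stage to another without manual intervention.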

DLM meets three objectives:

  • Ensure data integrity: data must remain accurate despite being used by various parties, circulating around the information system and being stored on different storage media (a minimal integrity check is sketched after this list).
  • Ensure data availability: data must always be accessible to those who need it, as this helps processes and routine business activities to run smoothly.
  • Keep data secure: human and technical errors alike are guarded against, so data remains accurate and trustworthy even when heavily handled and manipulated. Protected by access permissions, data is stored securely until it is deleted.
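As flagged against the first objective, one common integrity technique is content hashing: record a fingerprint at collection time and re-check it whenever the data moves. The use of SHA-256 here is an illustrative choice, not a requirement of DLM:

```python
import hashlib

def fingerprint(payload: bytes) -> str:
    """Content hash recorded when the data is collected."""
    return hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, recorded: str) -> bool:
    """Re-hash on each read or move; a mismatch flags corruption or tampering."""
    return fingerprint(payload) == recorded
```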

Automation is a key feature of the DLM approach. How each piece of data is handled – whether actively processed or simply stored – is customised according to its criticality and frequency of use, and data traffic around the IS is orchestrated to match those needs as closely as possible.
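As a sketch of what such customisation might look like, the function below picks a storage tier from the two criteria just mentioned. The tier names and thresholds are invented for illustration:

```python
def storage_tier(criticality: str, reads_per_month: int) -> str:
    """Pick a storage tier from criticality and frequency of use.
    Thresholds here are illustrative assumptions."""
    if criticality == "high" or reads_per_month > 100:
        return "hot"    # fast, replicated storage for active data
    if reads_per_month >= 1:
        return "warm"   # standard storage
    return "cold"       # cheap archival storage for dormant data
```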

With human support provided by data specialists such as Data Stewards, controlled processing of data at every stage can be assured. Responsible for data quality (checks and corrections, the input of metadata, etc.) and the data’s usefulness to the business, this key role slots naturally into any DLM approach.


The five phases of Data Lifecycle Management, from collection to deletion

The data lifecycle is typically split into five phases:

Creation and collection

The purpose of this stage is to collect data, based on its origin and its intended users. Data can also be filtered at this point to determine whether or not it should be collected.
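A minimal sketch of such a filter, assuming hypothetical source names and a dictionary-shaped record:

```python
# Hypothetical trusted origins; a real list would come from governance rules.
TRUSTED_SOURCES = {"crm", "erp", "web_forms"}

def should_collect(record: dict) -> bool:
    """Filter at the point of entry: keep only records from known origins
    that carry the identifier downstream processes rely on."""
    return record.get("source") in TRUSTED_SOURCES and record.get("id") is not None
```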

Storage and management

This stage can entail preparatory data processing (encryption, compression, de-duplication, etc.) and appropriate storage (in terms of security, accessibility, backup, etc.).
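A sketch of one possible preparatory step, combining compression with hash-based de-duplication; encryption is omitted for brevity, and the in-memory index is a deliberate simplification:

```python
import hashlib
import zlib

seen_hashes = set()  # naive in-memory de-duplication index

def prepare_for_storage(payload: bytes):
    """Compress the payload and drop exact duplicates before storage."""
    digest = hashlib.sha256(payload).hexdigest()
    if digest in seen_hashes:
        return None               # duplicate: do not store it twice
    seen_hashes.add(digest)
    return zlib.compress(payload)
```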

Use and sharing

Here is where roles and permissions are managed, so that all authorised persons have access to the data they need. At this stage, data can be used by BI and other analytical tools, and can itself generate more data.
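A minimal sketch of role-based permissions of the kind described; the roles and actions are illustrative, and a real mapping would mirror the business:

```python
# Illustrative role-to-permission mapping.
PERMISSIONS = {
    "analyst":      {"read"},
    "data_steward": {"read", "correct", "enrich"},
    "admin":        {"read", "correct", "enrich", "delete"},
}

def can(role: str, action: str) -> bool:
    """Check a requested action against the role's permissions."""
    return action in PERMISSIONS.get(role, set())

assert can("data_steward", "enrich") and not can("analyst", "delete")
```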

Archiving

Data no longer required for operational purposes must remain accessible later (for compliance or analysis purposes, etc.) and be stored securely.

Deletion

There must also be rules governing data deletion. Obsolete and duplicate data are then deleted completely securely, in accordance with the policies in force.
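A sketch of how such rules might select data for deletion, assuming a hypothetical seven-year retention policy and dictionary-shaped records:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=365 * 7)  # hypothetical seven-year policy

def to_delete(records: list, now: datetime) -> list:
    """Select records that have outlived the retention policy or that
    duplicate an identifier already seen."""
    seen_ids = set()
    doomed = []
    for r in records:
        expired = now - r["created"] > RETENTION
        duplicate = r["id"] in seen_ids
        seen_ids.add(r["id"])
        if expired or duplicate:
            doomed.append(r)
    return doomed
```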

As data is perpetually being collected and generated in any information system, several of these phases can run simultaneously. Through automated processing, each data set follows a consistent pathway from collection to deletion. These processing steps also distinguish DLM from ILM (Information Lifecycle Management), with which it is often confused: while DLM processes data on the basis of general attributes (type, size, creation date), ILM operates at a more granular level and is intended to ensure the accuracy of each individual file’s data set. The two approaches dovetail, and each takes data qualification further at the level at which it works.
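A sketch of a DLM-style rule operating on those general attributes only; the file types and thresholds are invented for illustration:

```python
from datetime import datetime, timedelta

def dlm_action(file_type: str, size_bytes: int, created: datetime,
               now: datetime) -> str:
    """Decide an action from general attributes alone (type, size,
    creation date): the level at which DLM operates."""
    age = now - created
    if file_type == "log" and age > timedelta(days=90):
        return "archive"
    if size_bytes > 10**9 and age > timedelta(days=365):
        return "cold_storage"
    return "keep"
```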


Our view at Blueway: the challenges surrounding data mean that every aspect of data flows needs to be considered

At Blueway, we firmly believe that the operations needed to maintain data quality (see https://www.blueway.fr/en/blog/data-quality-assurance) must be automated from the moment data is collected, to ensure a single version of master data is shared across the organisation. In many cases, data is constrained by technical issues or trapped in application silos; duplicate data and differing interpretations are just two of the resulting hindrances to effectiveness. It is therefore crucial to give the business a unified picture of its data, while still letting each department view it through its own specific lens.

When building a single point of truth, functional managers have a key role to play. The DLM method needs to be underpinned by data management rules that mirror business requirements, and that will process data at source efficiently. By virtue of being involved in the process, users become genuine stakeholders in the data lifecycle.

But beyond data quality alone, how data flows around the IS and how value is added to it also matter. Data accessibility and enrichment can only truly be optimised by considering every dimension of data traffic.

As well as the Data Governance module and the single master data repository (Master Data Management), which enable the DLM phases to be implemented most efficiently, the Blueway platform includes data transport (application bus, iPaaS) and business accessibility (BPM) modules. Combining these three dovetailing views of data means it can be truly put to best use while making sure it is processed correctly throughout its lifecycle.


Alexis de Saint Jean
Fascinated by the impact of new technologies on our organizations and our environment, Alexis is a well-rounded mix of data, cloud, curiosity and good humor. With almost 20 years’ experience, he can provide a global vision of the market and assess key trends. If need be, he can also whip up a few wood-fired pizzas for you…