While everyone knows (or should know) what the GDPR stands for, other lesser-known regulations and standards apply to the banking sector: the Digital Operational Resilience Act (DORA), the Payment Card Industry Data Security Standard (PCI DSS), the second European Payment Services Directive (PSD2), the directive strengthening cybersecurity within the European Union (NIS2), the processing of military personal data (DPCM)… A real headache for data management, especially considering that the banking penetration rate in France is 99% (according to the Fédération Bancaire Française) and that 75 million payment cards are in circulation in the country (according to the ECB).
Given the sheer volume of data to be analyzed and its criticality, it is easy to understand why this sector is under particular scrutiny from the authorities. Frédéric Toumelin, Head of the Banking and Insurance sector at Blueway, explains that while security and compliance are the banking sector's primary data challenge, they are not the only one.
When it comes to data governance, what are the current challenges facing banks?
Historically, banks have always been more vigilant than other professional sectors in the way they handle their data. This is logical, given that the very activity of a bank requires handling sensitive personal data. The challenges of data governance have therefore long been identified, but meeting them is becoming increasingly complex, notably as a result of two phenomena. The first is the tightening of compliance requirements, with banks subject to specific obligations introduced by regulators (PCI DSS, BCBS 239, DORA, PSD2, various reporting obligations such as AnaCredit…). In the event of non-compliance, the impact can be considerable, both in terms of financial penalties and reputational damage.
The second is the ever-increasing level of risk. First, in terms of security, with the proliferation of cyber-attacks targeting financial organizations in particular. But there is another side to risk: the difficulty of consolidating a global view of an institution's exposure to a given activity, across its various geographical and thematic entities. The 2008 subprime crisis in the United States demonstrated the kind of domino effect that fragmented visibility can induce, even leading to the bankruptcy of banks that were reputed to be solid.
This is why, over the past 15 years, regulators have considerably strengthened reporting requirements for activities whose risk profiles are most sensitive to volatility or default (witness the AnaCredit collection mechanism, introduced to better regulate bank lending activities). In Europe, the situation can be made even more complex by the stacking of domestic and supranational rules, a real headache for institutions operating in several countries through entities whose legal status may fall under bodies of rules that are sometimes imperfectly aligned.
How do these many challenges impact data governance in this sector?
To begin with, given the many compliance and risk control issues at stake, the first step is always to identify where the data requiring special attention is located. This is particularly difficult in the banking sector, due to both the size of the organizations concerned and the weight of legacy systems. It is important to bear in mind that most of France's major banks existed long before the advent of IT (1818 for Caisse d'Epargne, 1882 for BNP Paribas, 1885 for Crédit Agricole, etc.). It is therefore a real challenge for these organizations to work effectively with data within an IT legacy that has been evolving for decades, shaped by the increasing digitization of processes as well as by mergers and acquisitions.
So it is easy to imagine how difficult it is to know exactly where data is located, so as to be able to exploit it effectively, not only for control purposes but also to create value. Indeed, most of today's major banks have hundreds of subsidiaries around the world, each with its own patchwork of heterogeneous IT systems that may or may not be interfaced with those of other group entities. This is where an automatic data mapping tool comes into its own. Without it, the data discovery process is not only far longer and far less exhaustive; in a number of cases it proves impossible to carry out at all.
How does Blueway meet these challenges?
MyDataCatalogue is the module of Blueway's Phoenix platform dedicated to mapping and cataloguing your data assets. The key word here is "automation", and it applies to every stage of cataloging: discovery, standardization and classification. With Phoenix, there is no need for time-consuming workshops to fill in the catalog by hand; such a manual, declarative process can never achieve full completeness anyway. What's more, unlike an approach based on the manual description of attributes, MyDataCatalogue keeps the catalog scalable over time, thanks to automated, regular and programmable updates.
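To make the idea of automated discovery and classification concrete, here is a minimal, purely illustrative sketch, not Blueway's actual implementation: it introspects a database's schema and tags columns as potentially sensitive based on hypothetical naming rules. The rule patterns and tags are assumptions chosen for the example.

```python
import re
import sqlite3

# Hypothetical classification rules: column-name pattern -> sensitivity tag.
# A real catalog would use far richer heuristics than name matching alone.
RULES = [
    (re.compile(r"iban|card|pan", re.I), "payment-data"),
    (re.compile(r"name|email|phone|address", re.I), "personal-data"),
]

def discover_and_classify(conn):
    """Scan every table's columns and tag those matching a rule."""
    catalog = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        # PRAGMA table_info returns one row per column; index 1 is the name.
        for col in conn.execute(f"PRAGMA table_info({table})"):
            col_name = col[1]
            tags = [tag for pat, tag in RULES if pat.search(col_name)]
            catalog.append((table, col_name, tags))
    return catalog

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, full_name TEXT, iban TEXT)")
for entry in discover_and_classify(conn):
    print(entry)
# -> ('customers', 'id', [])
#    ('customers', 'full_name', ['personal-data'])
#    ('customers', 'iban', ['payment-data'])
```

Because the scan is a function of the live schema rather than a hand-filled inventory, re-running it after the schema changes is what makes regular, programmable catalog updates possible.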
And there's more: thanks to the technology embedded in its probes, MyDataCatalogue can scan IT sources not only at the metadata level, but also at the level of individual data values. This granularity makes MyDataCatalogue a virtually unique solution on the market, and opens up otherwise inaccessible use cases, notably around data quality in applications and the ability to purge ever-expanding unstructured data environments.
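The difference between metadata-level and value-level scanning can be sketched as follows. This toy example, an assumption-laden illustration rather than how MyDataCatalogue actually works, looks for candidate payment card numbers (PANs) inside free-text values and validates them with the standard Luhn checksum; a metadata-only scan of a column blandly named "comment" would reveal nothing.

```python
import re

# Candidate PANs: runs of 13 to 19 digits (typical card-number lengths).
PAN_CANDIDATE = re.compile(r"\b\d{13,19}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum used by payment card numbers."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan_value(text: str) -> list:
    """Return digit runs in `text` that pass the Luhn check."""
    return [m for m in PAN_CANDIDATE.findall(text) if luhn_valid(m)]

# The ticket number is the right length but fails the checksum;
# only the (well-known test) card number is flagged.
print(scan_value("refund for card 4111111111111111, ticket 12345678901234"))
# -> ['4111111111111111']
```

Scanning actual values in this way is also what makes targeted purging of unstructured stores feasible: files or records are selected for deletion based on what they contain, not just on what their metadata claims.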
Finally, the functions of MyDataCatalogue can be combined with the other modules of the Phoenix platform to cover the entire data lifecycle, from identification to architecture management, governance and movement through processes. Blueway thus helps banks meet today's data governance challenges in a regulatory context of ever-increasing requirements, and better exploit the potential of their decision-making assets.
Want to discuss your Data Catalog challenges with an expert?