How to upgrade legacy systems using standardised data interchanges?


Introduction: the new challenges created by legacy information systems

Information systems have undergone radical transformations in just a few decades. With the constant deployment of new digital solutions in businesses, many older systems are becoming obsolete, and managing these legacy systems has become a fundamental issue.

With the rapid pace of technological innovation over the last twenty years, the installed base of applications has grown considerably and is difficult to control. Applications were often developed ad hoc, with no longer-term strategy in view. Yet this ageing installed base still supports some mission-critical business processes.

At the same time, the challenges are growing: mergers and acquisitions mean information systems and business processes need to converge, and increasing communication with commercial partners makes rationalisation a necessity. Mobility and the use of SaaS solutions are rapidly changing the face of the modern IS. Meanwhile, the speed of technological change and the diversification of infrastructure make migrating data from old applications even more complicated.

Another vital component can be added to this need for upgradeability, namely the business-driven change in IS governance. Functional departments are more involved than ever before in strategy decisions over software. Business solutions need to closely match business requirements. So the technical challenge of transmitting data comes on top of this need to solve grassroots problems. Developments to application components, whether new or legacy systems, must occur with minimal impact on staff performance, and therefore on the processes in place.

In response to these performance factors, businesses now need to not only consider upgrading their old applications, but also how to structure their entire IS so as to best anticipate the future.


The organisational impact of legacy systems

The majority of businesses have upgraded their IS, in particular by adopting SaaS solutions. This new application layer, sitting on top of legacy applications, rarely offers genuine synergies with the old systems. This re-engineering shortfall causes real problems for business management:

  • The complexity of adding new applications: implementing new applications is complicated by the existence of a disparate legacy installed base. One-to-one communication, which is widespread in information systems, entails a great many individual data flows, which are difficult to map and manage.
  • Incomplete information: it is difficult to process an untidy jumble of data flows correctly. The data arriving from business processes is therefore often incomplete or poorly structured, and cannot be fully exploited.
  • Management overhead: managing a disparate installed base of applications results in a great many tasks of little or no added value, taking up employees’ time and effort. A unified IS architecture can, in contrast, make a degree of automation feasible and boost employees’ productivity.
  • Maintenance costs: the existence of legacy systems often means maintaining more systems, which are also sometimes redundant. Time and money are spent to no purpose, adversely affecting performance.
  • No overview of processes: business processes form part of a global strategy, and the same applies to data. Data traffic requires a proper architecture to provide continuity at each stage of a project.

Developments in processes and software perpetually widen the gap between old and new applications. It is wise to consider a re-engineering and rationalisation strategy at the earliest possible stage. Following the right procedures and using the right solutions will meet technical requirements and make it possible to meet business requirements more quickly.
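The scale of the point-to-point problem described above can be shown with a quick back-of-the-envelope calculation (an illustrative sketch, not from the article): with full one-to-one integration, the number of interfaces grows quadratically with the number of applications, whereas a hub-and-spoke architecture grows only linearly.

```python
# Illustrative arithmetic: interfaces needed for full point-to-point
# integration versus a central bus (hub-and-spoke).

def point_to_point_interfaces(n_apps: int) -> int:
    """Each pair of applications needs its own interface
    (counting undirected links; one per direction would double this)."""
    return n_apps * (n_apps - 1) // 2

def bus_interfaces(n_apps: int) -> int:
    """With a central bus, each application needs a single connector."""
    return n_apps

for n in (5, 10, 50):
    print(n, point_to_point_interfaces(n), bus_interfaces(n))
    # 5 apps:  10 links vs 5 connectors
    # 10 apps: 45 links vs 10 connectors
    # 50 apps: 1225 links vs 50 connectors
```

The quadratic growth is why mapping and managing individual data flows quickly becomes impractical as the installed base expands.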

What is the right approach to re-engineering legacy systems?

In a context where agility reigns supreme, it is crucial to put an information system re-engineering process in place.

A number of key principles underpin such a structuring exercise:

  • Foster the de-siloing of information: as ageing applications multiply, it becomes essential to establish systematic, controlled communication between them. IS urbanisation aims to prevent information from being locked away, and therefore any loss of performance.
  • Map the business processes: business processes must not find themselves constrained by applications’ limitations. Using actual business requirements, IT and functional departments work together to model all processes to produce a more effective IS.
  • Organise an integration architecture: good communication between applications should be combined with a more comprehensive and upgradeable architecture. Gaining this kind of perspective makes it possible to keep control over the IS and ensure inter-application dependencies are minimised.

Such principles should enable communication between applications to be unified, delivering greater working flexibility, even if the structure changes fundamentally.

Meeting the challenges of legacy systems through re-engineering

IS re-engineering delivers a suitable response to the challenges brought about by legacy IS, by facilitating:

  • Application decommissioning: the removal of obsolete components makes it possible to retain only those serving a purpose. Decommissioning reduces risks and costs, and streamlines the information system.
  • Cutting dependencies: data traffic is no longer constrained by old applications and the difficulties arising from having to maintain them. Standardised data interchanges become useable in any company ecosystem.
  • Simpler addition of components: in a unified environment with clear flows of data, software components can be added and removed much more easily, with the functional impact fully controlled.
  • Identification of application bottlenecks: understanding the busy spots for data traffic makes it possible to improve how that traffic is organised and anticipate risks.
  • Gaining perspective on data flow organisation: how data circulates throughout its lifecycle has a significant impact on the implementation of corporate strategy. Re-engineering also boosts the continuous improvement of flows and processes.

Practical software solutions to help with re-engineering

Two essential solutions lie at the heart of any re-engineering exercise, serving to blend data traffic management and process optimisation:

The ESB

Application bus technology carries data between applications, without needing to develop an individual interface for each one. Every component in the IS can consequently retrieve the data needed from this Enterprise Service Bus.

It is a solution that provides straightforward, structured data interchanges between recent and legacy applications. It also contributes to infrastructure stability: application components can be added or amended while maintaining constant and reliable data flows.

The application bus can also be used to deal with common issues for all businesses, e.g. the distribution of large volumes of data, populating the master data repository, synchronous and asynchronous data interchanges, etc.
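The hub-and-spoke principle behind an ESB can be sketched in a few lines. This is a deliberately simplified, in-memory illustration (a real ESB adds routing, transformation, queuing and transport adapters); the topic name and payload are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class MiniBus:
    """A minimal, in-memory sketch of the hub-and-spoke idea behind an ESB."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """An application registers interest in a topic; it never needs to
        know which application produced the data."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """A producer posts a message once; the bus fans it out to every
        subscriber, so no point-to-point interface is needed."""
        for handler in self._subscribers[topic]:
            handler(message)

bus = MiniBus()
received: list[dict] = []
bus.subscribe("customer.updated", received.append)   # e.g. the CRM listens
bus.subscribe("customer.updated", lambda m: None)    # e.g. the ERP listens too
bus.publish("customer.updated", {"id": 42, "name": "ACME"})
```

Because producers and consumers only know the bus, components can be added or amended without touching the others, which is the stability property described above.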


BPM

Re-engineering data interchanges serves no purpose unless the data flows fully match business requirements. Processes structure what the business does: they cause data to circulate and expand, but they also feed off the data gathered. This is where Business Process Management enters the picture, a solution used not only to digitalise business processes but also to boost their agility and make them truly part of the IS. By automating low added-value tasks, scheduling, monitoring and functional rules, the interoperability between ESB and BPM ensures that processes use data that is always reliable and always available, and that those processes are fully part of the information system.
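To make process orchestration driven by functional rules concrete, here is a minimal sketch. The step names, approval threshold and order data are invented for illustration; a real BPM engine adds persistence, human tasks, timers and monitoring.

```python
from typing import Callable

# Hypothetical process steps for an order-handling process.

def validate_order(ctx: dict) -> dict:
    """Reject malformed business data before it circulates further."""
    if ctx["amount"] <= 0:
        raise ValueError("invalid amount")
    return {**ctx, "validated": True}

def require_approval(ctx: dict) -> dict:
    """A functional rule: orders above a threshold need manager approval."""
    return {**ctx, "needs_approval": ctx["amount"] > 1000}

def notify(ctx: dict) -> dict:
    """A low added-value task that the process automates."""
    return {**ctx, "notified": True}

def run_process(ctx: dict, steps: list[Callable[[dict], dict]]) -> dict:
    """Execute the modelled steps in order, passing the business data along."""
    for step in steps:
        ctx = step(ctx)
    return ctx

result = run_process({"amount": 1500}, [validate_order, require_approval, notify])
```

Each step consumes and enriches the same business data, which is the sense in which processes both drive and feed off the data flows.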

Taking advantage of other aspects of data interchange

To deliver optimal performance, not just internally but also to external business partners, the use of such solutions should form part of a Service-Oriented Architecture (SOA) approach.

Structuring data interchanges also helps to build an ecosystem consistent with how the business operates internally, and provides more flexibility in services to partners.

Solutions such as API Management form part of this approach, and provide governance for API exposure. They enable data consumption to be fine-tuned, and indeed the governance, implementation and removal of APIs themselves. In this way, data interchange between partners is facilitated, scalable and structured, and data kept secure throughout its lifecycle.
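The governance role of API Management can be sketched as the checks a gateway enforces before exposing data to a partner. The API key, quota and handler below are invented for illustration; real APIM products add analytics, lifecycle management and richer security policies.

```python
from collections import Counter

# Hypothetical partner plans: each API key carries a call quota.
API_KEYS = {"partner-123": {"quota": 2}}
_usage: Counter = Counter()

def gateway(api_key: str, handler):
    """Authenticate the caller and enforce its quota before serving data."""
    plan = API_KEYS.get(api_key)
    if plan is None:
        return {"status": 401, "body": "unknown API key"}
    if _usage[api_key] >= plan["quota"]:
        return {"status": 429, "body": "quota exceeded"}
    _usage[api_key] += 1
    return {"status": 200, "body": handler()}

ok = gateway("partner-123", lambda: {"customers": []})
bad = gateway("nobody", lambda: {})
```

Centralising these checks is what allows data consumption to be fine-tuned per partner while keeping the underlying services unchanged.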

At Blueway, we firmly believe that your business challenges and the development of your information system should not be constrained by legacy systems and technical factors. That is why we offer a single platform unifying all these aspects of data interchange: ESB, BPM, MDM and APIM.

Edouard Cante
Executive Vice President Product. A technical and functional expert, Edouard has specialised in IS urbanisation and data governance for nearly 20 years. A man of the field, he and his teams support customers in their projects, and use this feedback to shape the product roadmap and gain agility.
In the same category:
Data Integration, Interoperability