Supporting Paredes’ transformation objectives while delivering interoperability within the IS

IS interoperability: a testimonial built around the Paredes project

Paredes revolutionised the hygiene sector in France by creating the single-use market. For more than seven decades, this 650-strong company has been manufacturing and distributing innovative hygiene products and services, and personal protective equipment for professional use. Their approach, based on their values, is resolutely customer-oriented.

Managing data streams within the information system is an issue found in every sector of the economy, exacerbated by the size of the organisation, the diversity of functional applications used, and requirements in terms of digitalisation and integration with the wider ecosystem.

In some organisations, orchestrating data streams is becoming a key factor in their transformation objectives. This applies to Paredes which, in 2018, set itself the target of being the recognised market leader and a specialist in multiple areas such as healthcare, local authorities and manufacturing, and offering an end-to-end service (after-sales, equipment, maintenance, etc.).

In this interview, Emmanuel Oberthur, Paredes CIO, reviews this transformation, the implementation of the first key components of the IS, and the challenges posed in terms of data stream management.

What business developments have affected the direction taken by information systems over recent years?

First and foremost, to set the background, we are only at the beginning of our transformation trajectory, which will be running for some years. The IT department lies at the heart of Paredes’ transformation and our objective is to give the company the impetus and capacity for perpetual development.

There are a number of aspects to this transformation programme, in particular a refocusing of the business on market sectors where Paredes brings real added value to customers. This applies for example to healthcare, local authorities, cleaning service providers and manufacturing.

The second key aspect is a faster move to digital. Previously, 90% of Paredes’ processes were based on its ERP system. The transformation programme aims to open up digital solutions for our customers, with a customer portal, an e-commerce platform and so on, i.e. solutions that will point Paredes towards the outside world; solutions that also need to communicate with each other.

If we go back in time, this also echoes the growing role of business users, who were increasingly pointing out to us that “ERP is fine, but this bit of the process wasn’t built to last”, or that “the IS needs to be connected to this or that best-of-breed application, or something built in-house”. The pace of these demands was quickening.

Our objective is to share information in a smooth, efficient, orchestrated and supervised way, firstly internally and then with the outside world.

What issues convinced you that an ESB platform was vital to support this transformation?

Interoperability between the component applications making up the Paredes information system had become a real issue when we started to hybridise the IS and push data out of the ERP to circulate more widely. While everything was still running on the ERP system, we were running highly conventional, semi-automated interfaces. Orchestration and supervision at the time were a manual, seat-of-the-pants job.

Historically, there had been a number of projects (Power BI, EDI, etc.) resulting in one-way or multi-directional data interchanges being implemented, but without putting a framework in place for supervising data traffic. Data flows converged on the ERP: business-side applications uploaded files into a folder that the ERP scanned at a given frequency. If the timing was off, interchanges could overlap and collide.
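To make the fragility of that pattern concrete, here is a minimal sketch of such a file-drop exchange. The folder, file format and polling interval are hypothetical illustrations, not Paredes’ actual configuration:

```python
import time
from pathlib import Path

# Hypothetical names: neither the folder nor the format is Paredes' real setup.
DROP_FOLDER = Path("/data/erp_inbox")
POLL_INTERVAL_S = 300  # the ERP scans the folder at a fixed frequency

def poll_once() -> None:
    """Ingest every file currently sitting in the drop folder."""
    for f in sorted(DROP_FOLDER.glob("*.csv")):
        # Fragility: if a business application is still writing this file
        # when the scan fires, a partial file is ingested; two producers
        # reusing the same file name can silently overwrite each other.
        payload = f.read_text(encoding="utf-8")
        print(f"ingesting {f.name} ({len(payload)} bytes)")
        f.unlink()  # delete once consumed

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(POLL_INTERVAL_S)
```

Nothing in such a loop knows whether a file is complete, who produced it, or whether the same data has already been consumed, which is exactly the supervision gap described here.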

A lot changed between late 2018 and 2021. We had accumulated a fair few projects, and interchange formats were increasingly varied. The more they grew, the clearer it became that we would be confronted with interoperability mechanisms suffering from frequency and quality issues. Yesteryear’s technology was no longer up to the job. And how could we fulfil our wish to interact with the rest of the world, extracting and pooling data and distributing it to a number of applications?

Recently, the number of data streams has doubled almost every year. Once you reach a certain volume and frequency, you just have to orchestrate data streams.

The start of the e-commerce project was the first major building block in the transformation programme to highlight the pressing need for interoperability. Many other projects were also affected, such as CRM (Microsoft Dynamics), which had to communicate using an API. This was impossible with the version of the ERP system we were running at the time.

By the end of the summer of 2020, we were consequently convinced of the need to set up an interchange hub that would allow us to collect data and redistribute it, regardless of the technology used. Especially since we had already had some trial runs with Azure Data Factory. That approach was very interesting technologically, but far from suitable for our actual situation. We did not want to add yet more publishers, and we are “pro Microsoft”, but when you cannot find an appropriate solution, you have to look elsewhere.

This is why, driven by the need to develop smoother and more shareable data interchanges, we wanted to implement a mechanism with a data warehouse populated via ESB or ETL.

What were the main stages in selecting and implementing the ESB platform?

When the CRM project arrived, with its wide variety of interchanges, we knew that ETL would not meet the need. All the services within Azure Data Factory are fragmented, and not pulled together in what we would see as a coherent toolkit. The cost was also an obstacle.

Our first work on the e-commerce side with SSIS (SQL Server Integration Services) meant we had already met the Data-Major team. We started to work together on other matters, then we sat down to think about this subject of data flows between applications, and found we shared the same basic outlook.

The decision was taken towards the end of summer 2020. In the time it took to write the statement of requirements and agree with the integrator on a “Why not?” basis, the CRM project started and the first data streams were designed. We then installed the platform in January 2021, trained in February, and launched the first phase on CRM with Data-Major aiming to go live in Q4 2021.

I wanted a straightforward, up-to-date, affordable approach, to move fast with a “why not?” mindset and with an integrator partner willing to buy into the process. I also wanted the contract to be for on-premise hosting.

We looked at a number of solutions when choosing the ESB platform. The majority of them were highly technical, which meant we found Blueway’s low-code approach extremely interesting, and likewise the way we can subsequently add modules such as MDM (Data Governance) and BPM to meet additional requirements, without constraints. This openness is worth keeping in mind for a future MDM project: when we do start one, we will have the option of a native connection to the ESB. And that is important. These were some of the factors that confirmed our thinking on the subject.

All data interchanges from new applications will be coordinated through the ESB platform. The other large project, running simultaneously, is the migration of existing interfaces. The first phase will cover the interfaces run with SSIS for e-commerce, because they have much in common with CRM in terms of type and the objects used. Our approach is pragmatic: for example, we started the data streams exchanged between e-commerce and the ERP with SSIS because we knew this would be functional pending the ESB project. It was our first experience of a change in technology.

The priorities set in the plan to migrate existing interfaces will then depend on the technological and technical issues. For some applications under the old interoperability method, those “in the know” who built the interfaces are no longer working here. Priorities will be set based on the risk for operations and mission criticality.

What are your next projects, and how will the ESB platform support these developments?

We are still broadening Paredes’ application scope and that will never stop. Our strategy dictates that the group will grow further in future.

We will be integrating new software. We make APIs available to our customers so they can automatically retrieve and transpose data for their applications, or operate certain business processes. One of our aims is to resolve a number of customer issues as we integrate this approach into Paredes’ service offering. These highly business-oriented applications will need to call one platform, send information to another, communicate with customers’ IS, etc. Getting information systems to communicate with each other is part of the services we plan to develop.

The ESB platform will speed up all these new projects. We experienced this recently with the overhaul of our HR information system. Being able to implement this modern approach to data interchanges directly via the ESB made our integration work easier: retrieve the data, format it, then transport it within the IS.
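Schematically, and only as an illustration (the endpoints and field names below are invented, and Blueway models such flows in its low-code designer rather than in hand-written code), the retrieve-format-transport sequence looks like this:

```python
import requests  # third-party HTTP client (pip install requests)

# All URLs and field names are hypothetical placeholders.
HR_API = "https://hr.example.internal/api/employees"
ERP_API = "https://erp.example.internal/api/staff"

def sync_employees() -> None:
    # 1. Retrieve: pull the source records from the HR system.
    records = requests.get(HR_API, timeout=30).json()

    # 2. Format: map the source schema onto the target schema.
    payload = [
        {"staff_id": r["employeeNumber"], "full_name": r["displayName"]}
        for r in records
    ]

    # 3. Transport: deliver the reshaped data to the ERP endpoint.
    requests.post(ERP_API, json=payload, timeout=30).raise_for_status()

if __name__ == "__main__":
    sync_employees()
```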

In addition, while this platform was first used by the French company’s information system, the subsidiary in Italy also has a composite IS. We are a thriving company, and collaboration between France and Italy, both in knowledge and in shared applications, is only going to intensify. Growth by acquisition is also part of our strategy, so we have to be ready to assemble information systems or get them to communicate with each other.

Master Data Management (MDM) will be another logical consequence of our IS structuring. Finally, as we are speeding up digital, there will be more and more services, and therefore more and more events, more real-time processes, and so on. We also needed an ESB to process event triggers.

It is no longer like two people who don’t speak the same language trying to communicate; instead both sides are talking and there is an interpreter between them.

With the benefit of your experience, what advice would you give to an IT Department starting a similar project?

My first tip would be don’t rush into anything. The technological issues are significant. My view is that it is vital to take some time to be clear about your requirements, spell out the facts, and then select a truly suitable solution.

Other questions have to be asked, such as whether the solution will be useable by trained business function staff, or reserved solely for IT people. The system needs to offer a great deal, and be simple to implement.

I also recommend being pragmatic about the projects to be carried out. Start with some quick wins and gradually increase project complexity to become more proficient.

The solution also needs to anticipate future requirements and be upgradeable. For example, our customers are increasingly asking us to give them access to data. We didn’t want the customer portal, as their entry point, to be massively complicated, so it had to be based on APIs and web services. The ESB needs to take the full situation into consideration from the outset, so that it helps to provide data to customers when triggered by events.
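Event-triggered provision of data might look roughly like the sketch below. The subscriber registry, event shape and URLs are all hypothetical; the point is that data is pushed to the customer the moment something happens, rather than pulled on a schedule:

```python
import requests  # pip install requests

# Hypothetical registry of customer systems subscribed to order events.
SUBSCRIBERS = {
    "customer-42": "https://client.example.com/webhooks/orders",
}

def on_order_event(event: dict) -> None:
    """Invoked when an application publishes an order event on the bus."""
    url = SUBSCRIBERS.get(event["customer_id"])
    if url is None:
        return  # this customer has not asked to be notified
    # Push the data to the customer's IS as soon as the event fires.
    requests.post(url, json=event, timeout=10).raise_for_status()

# Example trigger: the ERP reports that an order has shipped.
on_order_event({"customer_id": "customer-42",
                "order_id": "SO-1001", "status": "shipped"})
```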

It’s not a matter of ETL or ESB, but ETL and ESB. Do not think that acquiring an ESB means throwing ETL out. ETL is still the more efficient option for very large volumes, whereas an ESB is more efficient at distributing data on request at any time, if an application has triggered a service. They are two different philosophies.
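One deliberately schematic way to picture the two philosophies (this is not either tool’s real API, just the contrast in shape; all names are invented):

```python
def transform(record: dict) -> dict:
    # Stand-in for whatever mapping the interchange really needs.
    return {k.lower(): v for k, v in record.items()}

def bulk_load(records: list[dict]) -> None:
    print(f"loading {len(records)} records in a single batch")

# ETL philosophy: move very large volumes on a schedule, as one batch.
def nightly_etl_job(rows: list[dict]) -> None:
    bulk_load([transform(r) for r in rows])

# ESB philosophy: answer each application the moment it triggers a service.
def on_service_request(message: dict) -> dict:
    return transform(message)  # one message in, one response out, on demand

nightly_etl_job([{"SKU": "A1"}, {"SKU": "B2"}])  # batch, high volume
print(on_service_request({"SKU": "A1"}))         # event-driven, per request
```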

Looking beyond the solution, thinking how your ecosystem will be able to help is also essential. Choosing the right partners is one of the first steps towards success. With these systems and this technology, the partner chosen as the integrator is as important as the choice of software publisher. We have adopted a “why not?” stance because we trust our partner, Data-Major.

Taking our experience as an example, we decided to choose Blueway with the risk shared between us and Data-Major. We were both certain that the philosophy behind Blueway’s system was right. Data-Major joined us and also made the effort to acquire skills in Blueway’s technology. This was part of the deal and made things much easier! We did not really have enough time to bring those “in the know” into our teams. This approach and division of labour meant we could move quickly while limiting risk.

As regards Blueway, the publisher, a point I particularly appreciate, and one I have reported during steering committee meetings, is that Blueway did not just make do with selling us the licences. Although Data-Major is sharing the project risk with us, Blueway stands shoulder-to-shoulder with both of us, offering support. Whenever the Data-Major team has needed to know more, improve its expertise, or call on an expert, Blueway has always come through.

All three of us are firms of a size able to keep the personal touch. We work well as a trio, and so finding that synergy is actually my most important piece of advice!