
Getting ERP and WMS systems to communicate

ERP (Enterprise Resource Planning) and WMS (Warehouse Management System) both meet crucial requirements, and each fits neatly with the other. Data often needs to be consolidated between the grassroots level and overviews at a higher level to speed up order processing and improve stock planning. Communication between ERP and WMS has an impact not only on how the business is organised, but also on the information system. For this link between applications to be efficient and not deteriorate as time passes, a robust and upgradeable solution should be implemented.

ERP, the central access point to business data

ERP lies at the heart of manufacturing organisations, and many other industries besides. This potent management and coordination system centralises functional data and monitors activities in real time. Its company-wide reach and modular dashboard make it the go-to solution for decision-makers and functional business managers.

ERP meets the need for data consolidation and traceability, with financial information, purchasing, production, schedules, etc. Manufactured output is consequently tracked from design to shipping, and product compliance is more certain.

This integrated management package helps rationalise costs by reducing errors and providing decision support; businesses keep better control over their projects and their production processes.

ERP has a profound influence over the information system. Forming the core of the IS, it offers an access hub to a major proportion of the business’ data, which fundamentally simplifies the use of multiple business data sources. It therefore makes sense for businesses to want to connect it to other applications in their information systems.

WMS, the grassroots solution that transforms warehouse management

Those other applications include WMS, or Warehouse Management System. This package structures and supports order preparation, optimising the available space and improving the quality of the packages shipped. It introduces invaluable smart management to warehouses for businesses with sustained levels of activity.

Sometimes wrongly confused with ERP, a WMS is actually a dedicated application offering advanced functionalities concerning supplies, storage, inventory and order picking. The crucial role played by a WMS is to properly match resources (available space, order pickers, hauliers, etc.) with business requirements and strategy.

It is therefore a system with a high impact on the effectiveness of stock management. By streamlining warehouse management, a WMS means improved anticipation of future demand and requirements, and better use made of resources.

The much-needed communication between two crucial systems: ERP and WMS

These two systems, occupying crucial roles in the IS, therefore absolutely must communicate to deliver the best possible performance. Integrating these applications is all the more important given that warehouse management depends on data from ERP, and vice versa:

  • For the business, warehouse management affects the human resources to be used, finance requirements to be anticipated, and production processes and storage to be orchestrated to optimise the space available. WMS provides accurate data (stocks, goods inwards and outwards) that helps to reduce the levels of costly, unused stocks held, and contributes to business management more generally.
  • For the warehouse, processes are dependent on what the business is able to provide and the level of stock operations (orders, additional products) from ERP. ERP is where the decisions are made, and relevant data is then provided so that warehouse processes are run accordingly. The aim is to improve storage capacity, eliminate tasks serving little or no purpose, and therefore reduce costs.

Communication between ERP and WMS thus pools two business views, one more strategic and high-level, the other more granular and operational. But both must communicate on the same terms and get data circulating in real time.

For this, it will be necessary to:


Harmonise the “language” used by the two systems:

The WMS needs to have ERP codes and practices available so it can harmonise with the processes used.


Avoid duplicate data:

Entering a piece of information in both ERP and the WMS can potentially result in duplicates. Reconciliation should be automated, and orchestrating data flows between the systems should mean errors are avoided.

Instantly inform the other system when major changes occur:

Special orders, stockouts and other events with a serious business impact must be swiftly reported between the systems.

Provide a complete product database:

Both systems must work in synergy so that all products are listed and the WMS can make full use of product characteristics. This requires a logistics profile of products to be made available (properties, batches, packaging, etc.), as sketched below.
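To make this concrete, here is a minimal sketch of the kind of logistics profile an ERP might publish for the WMS to consume. It is illustrative only: the field names and message shape are assumptions for the example, not a Blueway or vendor format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class LogisticsProfile:
    """Hypothetical product record shared between ERP and WMS."""
    sku: str               # shared identifier: the common "language"
    description: str
    batch_tracked: bool    # does the WMS need batch/lot numbers?
    units_per_carton: int
    storage_class: str     # e.g. "ambient", "cold", "hazardous"

def to_wms_message(profile: LogisticsProfile) -> str:
    # One authoritative record, serialised once: the SKU acts as the
    # reconciliation key on both sides, which helps avoid duplicates.
    return json.dumps({"type": "product.upsert", "payload": asdict(profile)})

print(to_wms_message(LogisticsProfile("SKU-1042", "10L drum", True, 4, "hazardous")))
```

Keeping a single shared key per product is what makes automated reconciliation possible, whichever integration tool carries the message.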

Re-engineer the IS to coordinate interaction between ERP and the WMS

The objective of linking ERP and WMS is to introduce instant data interchange and not restrict functional processes to the scope of a single application. This will contribute to better choices in terms of product storage, and flawless orchestration for order preparation, including automated re-stocking, systematic quality controls, thresholds for triggering notifications, etc. Once added into ERP, the data and warnings form a basis for improving product supply and working methods.
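As an illustration of the threshold idea, a re-stocking trigger can be reduced to a simple rule evaluated on each stock movement. The names below are invented for the example; they are not taken from any particular product.

```python
REORDER_POINT = {"SKU-1042": 50}  # hypothetical threshold per product

def on_stock_movement(sku: str, quantity_on_hand: int, outbox: list) -> None:
    # When stock falls below its threshold, raise an event that the
    # ERP can turn into a purchase or production order.
    threshold = REORDER_POINT.get(sku)
    if threshold is not None and quantity_on_hand < threshold:
        outbox.append({"type": "stock.reorder", "sku": sku,
                       "on_hand": quantity_on_hand, "threshold": threshold})

events: list = []
on_stock_movement("SKU-1042", 42, events)
print(events)  # one "stock.reorder" event ready to be sent to ERP
```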

But while it acts as a unifier and is undoubtedly potent, ERP is not itself intended to deliver this communication between applications. Often wrongly viewed by businesses as an integration platform, ERP is sometimes linked to other business applications via standard or custom connectors, causing an increase in the number of point-to-point interfaces. However, while ERP is a lynchpin of the IS and supports transactional processing, it is not designed to unify and transport data.

Changes in business practices and data formats, or changes in application components, should not prevent interaction between ERP and a WMS.

Consequently, to ensure the solution chosen will last the course, the key point is to choose flexibility. Application components must be able to be removed or changed as the business develops. ERP and WMS can accordingly evolve with the business, with changing or obsolete applications posing no obstacle.

With its comprehensive approach to data interchange, in particular through an application bus (ESB), Blueway provides a flexible, upgradeable, supervised relationship between the ERP and WMS applications. ERP can exchange full, up-to-date data with the WMS or any other functional business application at any time. Doing so avoids the use of somewhat rigid point-to-point interfaces: the IS is re-engineered around a central system and can be enhanced as and when requirements arise.

This re-engineering ensures data is supplied to all business processes. Far from being limited to a functional business scope, data can be fully exploited, and not restricted to one application alone. Modelling processes using BPM can then help with the strategic distribution of business data to all those who need it, regardless of the constraints of any software component.

In this way, it is not only the information system itself that is rationalised, but also the organisation and its processes. The application bus ensures that the information system genuinely serves the business, and each component application, led by ERP and WMS. A key issue in today’s business performance.



The data steward, a lynchpin in data governance

The amount of data collected by businesses is constantly expanding, and the need to process it likewise. Data-related professions are therefore of unprecedented importance, all the more so given the backdrop of rapid digitalisation. Driven by the wish to make best use of their data, businesses are increasingly making use of specialists in the field. From data architects to data analysts, these roles are genuinely useful and each has its own challenges and key features. A data steward is responsible for documenting data, and is thus central to the data leveraging process, forming a crucial link in data processing.

What is a data steward?

The role and work of a data steward

While functional specialists such as the data owner ensure the final quality of data, the data steward works on data documentation. The role entails tracking and gathering information from employees. The ultimate objective is to make it easier for functional business departments to access data and, in so doing, maximise the use made of the data that the business collects.

The data steward’s role is both practical and educational. They need to have in-depth knowledge of processes and an overview of data flows, and be able to brief staff about the benefits of their work.

Having the ability to combine business data, data stewards make it easier for business users to understand these data flows, and therefore leverage the value of data in the business.

In practical terms, data stewards:

  • Centralise data about data: they collect valuable information about how data is used, its potential obsolescence, the changes made to it, and the possible errors it might contain.
  • Check data quality: in parallel with the usual verifications, they put systematic controls in place and apply remedial measures if necessary.
  • Provide metadata: metadata describes data sets in the form of an object or label. Once entered, metadata makes it easier for the business to access the data (see the sketch after this list).
  • Suggest appropriate formats for technical and business documentation: the aim is easier consultation by anyone interested, and to facilitate data maintenance.
  • Protect data: they determine protocols for data access, distribution, archiving and deletion, ideally ensuring that data remains both inviolable and confidential.
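By way of illustration, a catalogue entry of the kind a data steward maintains can be pictured as a small structured record. This is a hedged sketch; the fields are assumptions for the example, not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetMetadata:
    """Hypothetical catalogue entry maintained by a data steward."""
    name: str
    owner: str                  # accountable functional owner
    source_system: str
    refreshed_on: date          # helps spot potential obsolescence
    tags: list = field(default_factory=list)
    known_issues: list = field(default_factory=list)

entry = DatasetMetadata(
    name="customer_master", owner="Sales Ops", source_system="CRM",
    refreshed_on=date(2024, 1, 15), tags=["pii", "reference"],
    known_issues=["~2% missing postcodes"],
)
print(entry.name, entry.tags, entry.known_issues)
```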

The purpose of data stewards’ work for the business and its data

For the business, employing a data specialist is primarily a way to orchestrate its data governance. Imposing some kind of structure is increasingly essential at a time when the amount of data collected could now truthfully be said to constitute “data lakes”, making quick and meaningful sorting absolutely necessary. Data stewards bring order to data that is difficult to control – and sometimes under-used – thereby giving their businesses a competitive advantage.

Data quality is also a major issue. The speed of data flows and the exponential rise in applications are creating a substantial proportion of errors and redundancy. Highly-regulated industries must without fail be able to ensure their data is compliant. Labelling data sets in the way the data steward does is one way to ensure quality and currency.

Data stewards also help to increase the value of data by using labelling consistent with business codes. Adding metadata encourages the use of the data collected and makes it easier to consult.

Another central issue in current data strategies is data unification, helping to de-silo business departments and boost collaboration. The use of current, practical document formats creates a gateway for business departments, accelerating data processing.

Lastly, unification generates synergies, with instantaneous use of data by the business creating real-time collaboration, as data is no longer forced to shuttle tediously back-and-forth.

Methods, tools and solutions used by data stewards

Data stewards have a number of tools at their disposal to help them successfully complete data set referencing work:

The data stewardship platform

Vital to a data steward’s work, such a platform, which takes the form of a dashboard, serves to facilitate orchestration of data projects.

This data stewardship system enables data stewards to coordinate their work, certify data and monitor the progress of tasks within each project. The highly collaborative nature of data stewardship systems allows functional management to get involved in enhancing master data, guided by the data steward.

A data stewardship application accelerates reliable and participative data documentation for the data steward. The data stewardship system also provides some very useful automation features, such as data validation rules, and sometimes even AI-assisted knowledge gathering. By making routine management easier, it helps to create a complete and error-free master data repository.
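Here is a minimal sketch of the kind of validation rule such a platform might automate. The record shape and rules are invented for the example; real platforms let stewards declare rules rather than code them.

```python
# Each rule returns an error message, or None when the record passes.
RULES = [
    lambda r: "missing SKU" if not r.get("sku") else None,
    lambda r: "negative weight" if r.get("weight_kg", 0) < 0 else None,
]

def validate(record: dict) -> list:
    """Run every rule and collect the failures for steward review."""
    return [msg for rule in RULES if (msg := rule(record)) is not None]

print(validate({"sku": "", "weight_kg": -1}))
# ['missing SKU', 'negative weight']
```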

Master data management systems

Using an MDM solution to manage the business’ master data, data stewards consolidate data into a single master data repository. Up-to-date data can then be unified despite coming from a variety of source applications.

Master data management provides:

  • Consolidation of application data: data collected from different business applications is centralised and optimised within the MDM system (a consolidation sketch follows this list)
  • Data controls: rules for entering and managing data unify the consolidation process
  • Traceability: keeping track of changes from one process to another ensures project integrity
  • Data quality: current standards are followed when giving data a confidence index and for data consolidation 
  • Predictive capability: application mapping, relationships between data, and scenario analyses improve understanding of the data lifecycle.
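To picture the consolidation step, here is a deliberately naive sketch of a golden-record merge with lineage tracking. The merge policy (most recent value wins) is an assumption for the example; real MDM systems apply configurable survivorship rules and confidence indexes.

```python
def consolidate(records: list) -> dict:
    """Merge source records into one golden record, keeping lineage."""
    golden: dict = {}
    lineage: list = []
    # Sort oldest first so that newer values overwrite older ones.
    for rec in sorted(records, key=lambda r: r["updated"]):
        for key, value in rec.items():
            if key in ("source", "updated") or value in (None, ""):
                continue
            golden[key] = value
            lineage.append((key, rec["source"]))
    return {"golden": golden, "lineage": lineage}

crm = {"source": "CRM", "updated": "2024-01-10", "name": "ACME", "phone": ""}
erp = {"source": "ERP", "updated": "2024-02-01", "name": "ACME Ltd",
       "phone": "+33 1 00 00 00 00"}
print(consolidate([crm, erp])["golden"])
# {'name': 'ACME Ltd', 'phone': '+33 1 00 00 00 00'}
```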

Metadata management and data cataloguing

Metadata is a flourishing field, and rightly so. Whether data is technical, functional or operational, metadata management allows more value to be extracted from data sets.

Metadata management systems are useful to both business employees and data stewards. The latter use them to improve data administration, as managing access and data sharing, integration, analysis and maintenance all become easier to control.

Metadata management enables data stewards to vouch for data consistency more confidently. More lasting terminology extends data’s lifespan, and therefore improves service performance.

Metadata management systems also provide reports and analyses based on a data catalogue.

This data catalogue constitutes a smart database, accessible by stakeholders to find and potentially share data sets. Data quality and completeness depend largely on the data steward’s performance.

Data roles and data’s many dimensions

Data is now a central element, used in terms of both strategy and grassroots operations. A number of other key roles process data from technical and functional viewpoints. Mention can be made of data architects, responsible for data infrastructure, data owners whose role is to map data and protect it, data analysts for extraction and strategic interpretation, and data scientists who build predictive models on the basis of the data extracted.

Data use cases are also very numerous, and call for a range of software solutions:


Centralisation and unification – MDM
The single master data repository makes data consultation and the implementation of data projects a simple matter


ESB
The Enterprise Service Bus contributes to the re-engineering of data flows between all generations of applications


Data exploitation by processes – BPM
Data is highly valuable to business processes, which must be truly part of the IS and make use of process-oriented data.


Openness to partner companies – API Management
The rising number of data interchanges with outside companies requires finely-tuned API administration and control over data consumption.

It can be seen that data is multi-faceted. Data collection and management must therefore take into consideration all dimensions and usages throughout its lifecycle.

Data stewards are part of any worthy data processing strategy, but the same ought to apply to business applications. It is only by taking a step back from data that a business can fully leverage its value.

At Blueway, we firmly believe that the success of a business’ data strategy is based on a comprehensive, shared vision of processes and data, which is why we offer one platform that unifies all these aspects of data interchange, with ESB, BPM, MDM and APIM.


The concept of the extended enterprise is based on the importance of cooperation between all parties involved in a value chain (e.g. co-design of products, assembly of complex systems, supply chain, suppliers/distributors, etc.).

Other concepts also lie behind this term that first emerged in the 1990s, including the supplier ecosystem, co-creation, networked systems, etc.

However, in practical terms, it entails networking and integrating all those concerned in a given chain, be they customers, suppliers or partners, making for a direct link to the issues surrounding information system openness. In fact, applications have shifted from being a means of accessing content to being the main channel of interaction between a business and its customers, employees, suppliers and partners.

However, while businesses fully appreciate the need for openness, determining a strategy for effective integration with their ecosystems is far from obvious. It nevertheless underpins the extended enterprise. Spanning ideas of stability, real time, scalability, standardisation, security, governance, monitoring and more, this strategy of openness is directly linked to business expansion and business model development.

ESB vs APIM – two aspects of the same integration strategy

The strategy of integrating with the ecosystem is based on two dovetailing components:

  • IS service orchestration, which is the role played by the ESB (Enterprise Service Bus);
  • Governance over interchanges with the outside world, which is the role of API management.

While similarities can be seen in the software solutions built for ESB and API management, that does not mean that each can play the other’s role, but rather that they are two sides of the same coin.

From the viewpoint of solutions to implement, they are definitely two different, if complementary, steps, each with a specific objective:


ESB

An ESB or Enterprise Service Bus can be defined as a toolkit ensuring the security of data interchanges between sources and targets in an information system. The central component of an ESB can be described as a channel used to convey data between applications (called an “application bus”).


APIM

The purpose of API Management is to manage and standardise API (Application Programming Interface) exposure, i.e. the publication, promotion and supervision of interchanges between a supplier service and a client service. API Management provides portals that can be used to monitor the use made of exposed APIs.
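By way of illustration, the governance an API management layer adds in front of an internal service can be reduced to a declarative route plus an admission check. The structure below is invented for the example and does not reflect any particular vendor's configuration.

```python
# A sketch of what APIM adds in front of an ESB-exposed service:
# a public route, authentication, quotas and usage logging.
route = {
    "public_path": "/v1/products",                 # what partners see
    "internal_service": "esb://catalog/products",  # hypothetical internal address
    "auth": "api-key",
    "quota_per_day": 10_000,                       # consumption cap per consumer
    "log_usage": True,                             # feeds the supervision portal
}

def authorise(consumer: dict, route: dict) -> bool:
    """Admit a call only if the consumer holds a key and has quota left."""
    return bool(consumer.get("api_key")) and \
        consumer["calls_today"] < route["quota_per_day"]

print(authorise({"api_key": "k-123", "calls_today": 42}, route))  # True
```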

The key questions for structuring an ESB/APIM integration strategy

Once the two software solutions have been chosen and their respective roles determined, the issue becomes deciding upon the strategy to implement to integrate with the wider ecosystem.

There is no single perfect integration strategy that will suit every business. It differs in fact depending on the importance placed on sharing data in the business model and likely developments in data interchange and services with the ecosystem concerned. Business requirements must take precedence over technical constraints.

The answers to the following questions can help set an initial framework for the ESB/APIM integration strategy:

  1. What do you want to expose, to whom and with what objectives?
  2. What business requirements emerge from your business model as regards inbound and outbound data traffic? This entails examining data flows and the business model to determine which services are needed and the target consumption.
  3. What degree of security and control is required?
  4. How quickly should the integration be extended? What is the expected scalability of each service?
  5. What level of granularity is sought when opening up services? Are there any differences in access at start-up (e.g. for the top 10 premium customers versus the others)?
  6. How mature is your ecosystem? Will suppliers have to make an effort to standardise how they provide their data? The development of your integration strategy and its scalability are also in fact dependent on your ecosystem partners’ ability to get involved.
  7. Do processes need to be defined with partners beforehand? In some cases, such as data exposure (for example, providing product catalogue data), it is not necessary; in other, more complex situations, it is crucial.
  8. What type of data sharing is involved? Is the approach akin to a partnership? Are services paid, or free of charge?

Conclusion: your business objectives should guide your ESB/APIM integration strategy

While, from a business viewpoint, these integration possibilities represent a fundamental transformation opening many doors, from an IT viewpoint, it is mainly a matter of extending a service-oriented strategy from the internal IS to the whole business ecosystem. The principle behind the approach is therefore fairly clear: to avoid increasing numbers of point-to-point connections between applications.

There are therefore admittedly some technical aspects (API versioning, security tokens, DMZ, etc.), but it is first and foremost your business transformation that should determine your integration and openness strategy.

If you would like to share feedback around data interchange governance with an ecosystem or extended enterprise, contact our experts.


The benefits of process optimisation for administrative staff

The administration departments in an organisation work on large numbers of management processes in conjunction with all the other departments in the business. It is in these other departments that the potential gains from process optimisation are often the most significant.

What does optimisation of administrative processes mean?

When discussing automation of low added-value tasks, the inference is very often administrative tasks, manual data entry and tiresome checking work. However, administrative processes play a central role at the heart of businesses. These processes form the organisational backbone: audit, supplies, the mail room, meeting room bookings, cleaning & maintenance, secretarial work, filing, working conditions, etc. This generic term covers an extremely diverse range of work and processes.

It is precisely because administrative processes are essential to the smooth running of the business that a distinction needs to be made between time-consuming tasks of no added value and the work that should continue to be done by people. Optimising processes therefore means affirming the value of administrative staff while increasing quality and productivity.

More generally for the company, it is also an opportunity to eliminate the bottlenecks that some of these processes can constitute. Such bottlenecks are an issue not just for the administration department concerned, but for the whole business.

Behind the general idea of automating data entry, structuring workflows and concentrating human effort on adding value, the major areas for administrative process optimisation are:

  • Gain a different perspective and improve your working methods by modelling processes at a functional level
  • Eliminate manual data entry and any unnecessary tasks in general
  • Go paperless – digitalise information and data flows for ease of use and greater consolidation possibilities
  • Improve access security, ensure the long-term safety of data, provide an audit trail
  • Automate and accelerate data flows, the circulation of information and alerts
  • Include administrative processes within the overall information system

Order book management: a common example of administrative process optimisation

Taking the example of processes linked to the consolidation of order books within wholesale companies or manufacturing subcontractors, administrative staff have to regularly go to the websites of various clients and partners and consolidate new data with any data previously collected, taking into account any changes there might be to information already stored.

This whole process entails a great deal of manual work, including repetitive double-checking and making minor changes: retrieving source data in different formats (Excel, screen shots, manual re-entry of data) before comparing them with the current order book (or the previous version) and then making adjustments in the ERP system, taking each open order one at a time. The majority of these and many other operations can be automated with the Blueway platform which combines BPM (Business Process Management) with application bus solutions, MDM (Master Data Management) and API Management.

In the same way, checks (availability of stock or goods inwards) and special cases (“exceptions”) can be processed using the platform’s complex business rules engine and displayed to users.

Data flow orchestration is particularly useful for admin staff who can quickly see reductions in the time they spend on tasks of little or no added value, and who are able to spend more time dealing with more complicated matters of greater benefit to the performance of the business.
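The core of the automation described above is a comparison between the stored order book and freshly collected data. A hedged sketch, with an invented record shape:

```python
def order_book_diff(current: dict, incoming: dict) -> dict:
    """Classify each order line as new, changed or closed by comparing
    the stored order book with the latest collected quantities."""
    diff = {"new": [], "changed": [], "closed": []}
    for ref, qty in incoming.items():
        if ref not in current:
            diff["new"].append(ref)
        elif current[ref] != qty:
            diff["changed"].append((ref, current[ref], qty))
    diff["closed"] = [ref for ref in current if ref not in incoming]
    return diff

current = {"PO-1": 100, "PO-2": 40}    # order book held in the ERP
incoming = {"PO-1": 120, "PO-3": 10}   # data collected from a client portal
print(order_book_diff(current, incoming))
# {'new': ['PO-3'], 'changed': [('PO-1', 100, 120)], 'closed': ['PO-2']}
```

Only the classified differences then need human review, which is where the time savings come from.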

ROI (return on investment) from the Blueway platform through administrative process optimisation

With a platform like Blueway, data collection, formatting, comparison with existing ERP data and presentation on clear and practical screens for users all mean that data can be amended and edited much more quickly by admin staff. They can then also take whatever steps are needed to update open orders in the ERP system or any other relevant applications.

Return on investment from the Blueway platform is typically achieved in less than one year.

Beyond this quantifiable benefit, quality of service, traceability of data exchanged and statistics on changes to orders for each individual supplier enable the business to more readily adjust to changes in the portfolio and be more proactive.

Lastly, the Blueway platform incorporates some preconfigured administrative processes such as automated order building and the initiating of batched invoicing runs.


BPMN 2.0: Business Process Model and Notation explained

A large number of people in various roles are involved in each process in your organisation. Technical and functional staff, the IT department… each is inclined to see a process from their own perspective. However, if we want to take a step back with the aim of optimising a process, or designing a new one, we have to speak the same language and use the same descriptions.

Which is where the BPMN 2.0 standard comes in.

The BPMN 2.0 standard in brief

BPMN stands for Business Process Model and Notation. Obviously the idea of “BPM”, business process management, is very much included, meaning the analysis, improvement, modelling and automation of the organisation’s processes, and the monitoring of them over the course of time.
With the ‘N’ for Notation, BPMN is a business process modelling method in which graphical representation plays a key role.  The objective is to determine a common framework for representing business processes.

It is maintained by the OMG (Object Management Group), an American consortium that aims to standardise and promote the object model. Since it was updated in 2011, we now refer to BPMN 2.0 and this standard has become the benchmark in terms of process modelling.

The objectives of the BPMN 2.0 standard

It should be obvious that ensuring the whole organisation shares the same picture of any given process is essential. However, this synchronisation of views requires all involved – technical, functional and end users – to easily understand the processes involved. With no common language understandable to all, things can be unclear…

The Business Process Model and Notation method works at exactly this level to determine this common framework for graphical representation of processes. The method is independent of the application or BPM software used. This is an important point that chimes with our values: at Blueway, we firmly believe that meeting business challenges should not be constrained by systems and technical factors.

Beyond the graphical aspect, BPMN2 also standardises the descriptions of objects, their attributes and file formats.

Are UML and BPMN complementary or competitors?

UML and BPMN dovetail together neatly because they meet two different needs:


BPMN

The BPMN standard concentrates on business process analysis and design. Within these processes, systems run and interact with each other.


UML

UML is a standardised, visual modelling language used to represent system design graphically.

It focuses on analysis and design of the IS. In conjunction with BPMN, it can be used to model use-case diagrams, for example.

The principles behind BPMN 2.0: simple without being simplistic

The BPMN 2.0 standard comprises a set of various graphical elements that can be used to produce diagrams anyone can read and understand.

There are three main basic elements in the BPMN method:


Activities

Activities describe the action taken in an instance of the process (task, transaction, sub-process or call).


Gateways and connectors

Gateways either merge or fork paths (or flows). Connectors (connecting objects) link BPMN flow objects.


Events

These represent an event within a process (start, intermediate and end events).
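Putting the three element families together, here is a toy example of the BPMN 2.0 XML interchange format the standard defines (trimmed of the diagram-layout section real files also carry), parsed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# A toy process: start event -> task -> end event, linked by sequence flows.
BPMN = """<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             targetNamespace="http://example.com/demo">
  <process id="orderProcess" isExecutable="false">
    <startEvent id="start"/>
    <task id="prepareOrder" name="Prepare order"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="prepareOrder"/>
    <sequenceFlow id="f2" sourceRef="prepareOrder" targetRef="end"/>
  </process>
</definitions>"""

root = ET.fromstring(BPMN)
ns = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}
for task in root.iterfind(".//bpmn:task", ns):
    print(task.get("id"), "->", task.get("name"))  # prepareOrder -> Prepare order
```

Because the format is standardised, the same file can move between any BPMN2-compliant modelling tools.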

Conclusion: BPMN 2.0 & Process Governance

Blueway’s Process Governance module makes full use of the BPMN 2.0 standard. We are firm believers in putting people and business needs back at the heart of processes. A shared representation of the processes within a business helps to achieve that.

Always looking to democratise the view of processes within an organisation, our BPM modelling module is designed to be readily understood by business users. The definition of roles, participants, stages, actions and so on all occurs within an intuitive and visual interface. Process Governance allows you to automatically create a BPMN2-compliant specification from your process models. Documents linked to the model can also be reproduced in the specification.


EAI: ETL versus ESB systems

There are many abbreviations – ETL, ESB, EAI, EDI, SOA and APIM for instance – relating to data traffic, and it is easy to get lost in them. Some of them refer to similar things, others dovetail neatly together, and yet others overlap in some aspects.

In this interview, Edouard Cante, General Product Manager at Blueway, shares his view of the difference between ETL and ESB, and what he believes the consequences are for businesses. Is this even the right discussion to have, in view of business requirements?

In a previous article, we reached the conclusion that ESB and API management were two sides of the same coin. Does the same apply to ESB and ETL?

No, but their respective scopes do partly overlap: they both relate to data transport and transformation within an information system. Given their nature, it could be decided there is a choice to be made, depending on the data flow type. This was not the case between ESB and APIM, which complement each other.

The historical difference is primarily based on the architectural dimension. The specific features of ETL and ESB are not easy to see for those who are not experts in data management. In addition, the landscape has changed over recent years.

Historically, and perhaps somewhat simplistically, ETL is effective in handling huge volumes where performance is important, but the number of interchanges is not that great. It processes data as a set, wholesale. For example, if you want total turnover broken down by customer, ETL will aggregate all the rows and apply a process to all of them. This set-based approach is adopted in particular by Business Intelligence and Data Warehousing. In contrast, ESB is effective when processing large numbers of high-frequency interchanges, each with a limited volume of data, combined with an algorithmic aspect to the processing. It certainly plays a role in de-siloing data and supporting a Service-Oriented Architecture by acting as a secure exchange bus across the entire IS.

To summarise using extreme examples, ETL is used to build data warehouses from ERP and CRM systems. ESB is for a business that wants to expose estimates, orders and customers from its CRM, for instance, and allow other applications to fetch this data by connecting to the application bus.

Edouard CANTE

However, I firmly believe these historical differences now verge on the simplistic. Can anyone really say these days that they are buying an ETL system solely for BI?
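The historical contrast Edouard describes can be sketched in a few lines. This is purely illustrative: a set-based pass over a whole batch on one side, many small messages handled as they arrive on the other.

```python
from collections import defaultdict

orders = [{"customer": "A", "amount": 120}, {"customer": "B", "amount": 80},
          {"customer": "A", "amount": 50}]

# ETL style: one set-based pass over the whole batch (e.g. for BI).
def turnover_by_customer(rows: list) -> dict:
    totals = defaultdict(float)
    for row in rows:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

# ESB style: each message is transformed and re-published as it arrives,
# so any subscriber on the bus can pick it up.
def on_order_created(message: dict, bus: list) -> None:
    bus.append({"topic": "order.created", "body": message})

print(turnover_by_customer(orders))  # {'A': 170.0, 'B': 80.0}
```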

Why do you believe that the distinction between ESB and ETL no longer holds true?

Another difference often put forward is that ETL is a “pull” technology, that works on demand, whereas ESB is a “push” technology, that produces messages. But from the customer’s standpoint, is it feasible to only pull or only push? When a business has implemented an ETL system, and then needs to “push”, it can’t be built into each separate application individually. That requires specific developments and highlights the problem at the heart of this distinction.

While compartmentalising ESB and ETL was reasonable ten years ago, I firmly believe it no longer holds true from a business point of view.  Different concepts do not need to entail different solutions. Asking the IT department to choose between two different tools depending on whether it wants to expose or interact is approaching the problem from the wrong angle. The business requirement cannot be constrained by technical factors.

For the majority of businesses, there is no ROI from installing a pure ETL on the one hand and a pure ESB on the other.

Edouard CANTE

How has the market developed since the appearance of the ETL and ESB concepts?

Originally, there were some genuine technological differences. The markets split and have developed separately. Some ETL publishers have encroached on ESB’s patch to win market share, and vice versa. At the same time, marketing pitches have compartmentalised requirements based on technology, to confirm their relative positioning. ESB software publishers have also sought to define themselves as purists.

The battle has mainly been waged on a technological front, and not in relation to the actual requirements. This is a mistake!

Edouard CANTE

The boundaries consequently became blurred from the customer’s point of view, and simplistic stances have been the result. There is widespread misunderstanding about what an ESB is, and its potential. An ESB was sometimes summarised as a data transporter, completely neglecting its role in organising data traffic in its entirety. The result is that many projects have failed because of these dogmatic approaches.

I saw one example with a retailer, where the project started with a theoretical mapping of data traffic on a magnificent diagram, to argue the case for 100% ESB. Ultimately, when the project kicked off, this extreme position came up against the cold reality that some applications could not use it in real time. The project was a complete failure.

So if making ETL and ESB mutually exclusive is a mistake, what is the real issue?

The real issue is to see things the other way round. It is because of marketing pushing “either ESB or ETL” that we have these failures! This dichotomy no longer makes any sense in most cases.

The IT department finds itself forced to choose between two tools when its actual requirement is for both: in most cases, it wants to de-silo data and circulate it around the various applications in the IS, produce some BI and centralise data within an MDM system. Matching each concept to a different solution forces it to deviate from the business requirement with no benefit to itself.

They should instead be joined. The solution should meet the business requirement, and not some technological dogma.

Edouard CANTE

This challenge arises even in user communities that have some proficiency in the tools rather than the concepts. Users then have a reduced perspective on data flow management as such. It is by understanding the concepts that users can become more proficient, and more readily adopt different tools.

If the question is not ETL versus ESB, what questions should be asked to re-engineer systems successfully?

First, start by accepting reality, and dealing with it. While it is important to map processes and take a step back to gain some perspective, a theoretical outlook of “everything will be SOA” or “everything will communicate using APIs” will not pass muster. A vision of the future is fine, but in reality, applications need to be able to communicate now!

The IT tools need to converge to adapt to what the business needs, not the other way round. The objective is to unify collaboration in the organisation.

The battle between ETL and ESB is pointless. If IT department needs are to be met, they cannot continue to be separate. The distinction now is between data transport and transformation on the one hand, and data exploitation on the other.

Edouard CANTE

If there has to be a difference, I would place it at another level. We can differentiate between data transformation solutions, which provide secure data traffic and make data available, and highly functional data preparation tools designed for specific roles, such as data scientists or BI analysts.

Business areas consequently have access to highly functional tools, designed for them, and the data manager keeps the central role of ensuring the quality, transformation and availability of data. It is these data preparation tools that are changing the market. Independent business departments make perfect sense, without requiring them to get involved in how data circulates. Data traffic has to meet the crucial challenges of performance and legality.

Data traffic should be understood as a whole, regardless of the method used to convey data for a particular business need. The real difference in terms of technical tools is now between data transformation and data preparation.

Edouard CANTE

I therefore firmly believe that it is necessary both to meet current standards and to handle the famous legacy systems that any information system is still running. Customers using ETL/ESB/EAI solutions have to use them with these applications; they have no choice. As software publishers, we therefore also have to do so. We have to be the toolbox that enables customers to move data around.

At Blueway, we have never set much store in the difference between ESB and ETL. Our wish is to provide a comprehensive response to IS re-engineering challenges with a modular platform that unifies various concepts in terms of business issues and people. There are some extreme cases where an ultra-ETL system is necessary, but in reality they are few and far between.


Process Mining software and BPM

The business view of a process encompasses data together with various stages, processing and so on. Sometimes a gap between the business view and its technical translation only shows once a process is implemented in an application. However, it is a mistake to compartmentalise data and processes. The business view must prevail, and it cannot be constrained by technical or system limitations.

It is this connection, the interoperability between data and processes that enables a process to truly be part of the information system, prevents a proliferation of applications and therefore ultimately improves the agility of the business and upgradeability of the information system. Blueway is a strong believer in this business-wide, shared view of processes and data.

Process mining in brief

Process mining is found where two fields overlap:

  • Data science: data and data structures, centralisation, representation and display;
  • Process management: modelling, automation and the linked execution of sequences of events.

Process mining consequently analyses data relating to process execution, with the aim of finding areas for improvement. It thus lies at the crossroads of BPM (Business Process Management) and data mining.

In summary, process mining is an examination of processes undertaken using the data that circulates in those processes. It therefore focuses on “facts”. Process mining and process intelligence are very similar ideas: business process intelligence (BPI) refers to the application of data mining techniques and process mining in the area of business process management (BPM).

A number of initialisms have emerged around these topics in recent years, such as BAM (Business Activity Monitoring), CPM (Corporate Performance Management) and CPI (Continuous Process Improvement).

But what practical purpose does process mining serve?

In organisations, processes are everything. Each process will generate and change significant amounts of data over its lifetime. Multiply that effect by the number of processes in a typical organisation, and we get vast volumes of data, most of which is under-used.

Each application in the IS might, admittedly, analyse its own data. However, processes span the entire IS and data analysis should not be compartmentalised within individual applications. Process analysis therefore needs to occur at the functional business level if it is to be effective. It should also include event logs which are usually more or less disregarded in other analyses.

Process mining meets that need. It examines operational process data across the entire information system, irrespective of individual software components. It provides understanding of processes as they are actually run in the organisation.

It consequently delivers a number of benefits: 

Improve process modelling

Identify areas for improvement by process modelling (theory in the model versus practice in real life)

Improve processes operationally

Uncover operational improvements (bottlenecks, time taken, unnecessary steps, etc.)

Process simulation

Forecast using test scenarios and potential optimisations.

It therefore entails not only “taking the pulse” of processes but also carrying out projections and simulations. Process mining is the link between process management and improving performance in organisations.

As process mining is largely automated (using algorithms, etc.) and based on actual IS data, the resources required from the business are limited. It is also an objective approach, based as it is on algorithms and real data.

How process mining works

Process mining examines the actual use made of active processes using the information system’s transaction and event data.

This analysis takes place over a number of stages:  


Discovery stage

A picture of the real process, as it is actually run, using data from IS event logs. The complexity of the representation can vary depending on the objectives.


Conformance checking stage

These processes reconstructed from real data are then compared with the corresponding theoretical model. This serves to show whether the organisation’s procedures and rules are being followed.


Performance improvement stage

Normalised and consolidated data are shown in easy-to-use graphical formats, so that avenues for improvement can easily be seen.


Simulation

Based on real data and how processes actually run, process mining can simulate processes and project them into future scenarios in order to examine how they work and spot any weak points.
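As a simplified illustration of the discovery stage (toy data, not a description of any specific product's algorithm), the core of many discovery techniques is counting which activity directly follows which in the event log:

```python
from collections import Counter

# Toy event log: one ordered list of activities per case.
log = {
    "order-1": ["received", "checked", "shipped"],
    "order-2": ["received", "shipped"],             # skips the check!
    "order-3": ["received", "checked", "shipped"],
}

# Discovery: count directly-follows relations across all cases.
dfg = Counter()
for trace in log.values():
    dfg.update(zip(trace, trace[1:]))

for (a, b), n in dfg.most_common():
    print(f"{a} -> {b}: {n} case(s)")
# The rare "received -> shipped" path is exactly the kind of deviation
# the conformance checking stage would flag against the model.
```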

Blueway and process mining

Our objective at Blueway is to bring the various dimensions of data interchange back together, and the process and data aspects in particular. We consequently aim to get people working on adding value and we use data to support that principle.

This is why the Blueway solution is the only platform blending BPM, MDM, ESB and API Management. It is this all-encompassing approach to data flows in business that led to Blueway incorporating this process mining component.


Legacy information systems

Introduction: the new challenges created by legacy information systems

Information systems have undergone radical transformations in just a few decades. With the constant deployment of new digital solutions in businesses, many older systems are becoming obsolete, and legacy systems are becoming a fundamental issue.

With the rapid pace of technological innovation over the last twenty years, the installed base of applications is of considerable size, and difficult to control. Information system applications were often developed as seat-of-the-pants jobs, with no view to a longer-term strategy. This aging installed base of applications still includes some mission-critical business processes.

At the same time, the challenges are growing: mergers and acquisitions mean IS and business processes need to converge, and increased levels of communication with commercial partners make rationalisation a necessity. Mobility and the use of SaaS solutions are quickly changing the face of the modern IS. Meanwhile, the speed of technological change and the diversification of infrastructure just make migrating data from old applications even more complicated.

Another vital component can be added to this need for upgradeability, namely the business-driven change in IS governance. Functional departments are more involved than ever before in strategy decisions over software. Business solutions need to closely match business requirements. So the technical challenge of transmitting data comes on top of this need to solve grassroots problems. Developments to application components, whether new or legacy systems, must occur with minimal impact on staff performance, and therefore on the processes in place.

In response to these performance factors, businesses now need to not only consider upgrading their old applications, but also how to structure their entire IS so as to best anticipate the future.

The organisational impact of legacy systems

The majority of businesses have upgraded their IS, including by using SaaS solutions. This new application layer, added on top of legacy applications, rarely presents genuine synergies with old systems. This re-engineering shortcoming causes real problems for business management:

  • The complexity of adding new applications: implementing new applications is complicated by the existence of a disparate legacy installed base. One-to-one communication, which is widespread in information systems, entails a great many individual data flows, which are difficult to map and manage.
  • Incomplete information: it is difficult to process an untidy jumble of data flows correctly. The data arriving from business processes is therefore often incomplete or poorly structured, and cannot be fully exploited.
  • Management overhead: managing a disparate installed base of applications results in a great many tasks of little or no added value, taking up employees’ time and effort. A unified IS architecture can, in contrast, make a degree of automation feasible and boost employees’ productivity.
  • Maintenance costs: the existence of legacy systems often means maintaining more systems, which are also sometimes redundant. Time and money are spent to no purpose, adversely affecting performance.
  • An overview of processes is missing: business processes form part of a global strategy, and the same applies to data. Data traffic requires proper architecture to provide continuity at each stage of a project.

Developments in processes and software perpetually widen the gap between old and new applications. It is wise to consider a re-engineering and rationalisation strategy at the earliest possible stage. Following the right procedures and using the right solutions will meet technical requirements and make it possible to meet business requirements more quickly.

What is the right approach to re-engineering legacy systems?

In a context where agility reigns supreme, it is crucial to put an information system re-engineering process in place.

A number of key principles underpin such a structuring exercise:

Foster the de-siloing of information

As aging applications multiply, it becomes essential to establish systematic, controlled communication between them. Re-engineering aims to prevent information from becoming locked away, and therefore any loss of performance.

Map the business processes

Business processes must not find themselves constrained by applications’ limitations. Using actual business requirements, IT and functional departments work together to model all processes to produce a more effective IS.

Organise an integration architecture 

Good communication between applications should be combined with a more comprehensive and upgradeable architecture. Gaining this kind of perspective makes it possible to keep control over the IS and ensure inter-application dependency is minimised.

Such principles should enable communication between applications to be unified, delivering greater working flexibility, even if the structure changes fundamentally.

Meeting the challenges of legacy systems through re-engineering

IS re-engineering delivers a suitable response to the challenges brought about by legacy IS, by facilitating:

  • Application decommissioning: the removal of obsolete components makes it possible to retain only those serving a purpose. Decommissioning reduces risks and costs, and streamlines the information system.
  • Cutting dependencies: data traffic is no longer constrained by old applications and the difficulties arising from having to maintain them. Standardised data interchanges become usable in any company ecosystem.
  • Simpler addition of components: in a unified environment with clear flows of data, software components can be added and removed much more easily, with the functional impact fully controlled.
  • Identification of application bottlenecks: understanding the busy spots for data traffic makes it possible to improve how that traffic is organised and anticipate risks.
  • Gaining perspective on data flow organisation: how data circulates throughout its lifecycle has a significant impact on the implementation of corporate strategy. Re-engineering also boosts the continuous improvement of flows and processes.

Practical software solutions to help with re-engineering

Two essential solutions lie at the heart of any re-engineering exercise, serving to blend data traffic management and process optimisation:

The ESB

Application bus technology carries data between applications, without needing to develop an individual interface for each one. Every component in the IS can consequently retrieve the data needed from this Enterprise Service Bus.

It is a solution that provides straightforward, structured data interchanges between recent and legacy applications. It also contributes to infrastructure stability: application components can be added or amended while maintaining constant and reliable data flows.

The application bus can also be used to deal with common issues for all businesses, e.g. the distribution of large volumes of data, populating the master data repository, synchronous and asynchronous data interchanges, etc.
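By way of illustration, here is a minimal sketch, in Python, of the publish/subscribe pattern an application bus provides. The topic name, record fields and in-memory delivery are invented for the example; a real ESB adds persistence, transformation maps and asynchronous interchanges.

```python
# A toy Enterprise Service Bus: applications publish to topics instead of
# calling each other directly, so adding a consumer needs no new interface.
from collections import defaultdict
from typing import Callable

class MiniBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = MiniBus()

# The ERP and a reporting tool both consume the same WMS stock events:
bus.subscribe("stock.updated", lambda m: print(f"ERP adjusts stock: {m}"))
bus.subscribe("stock.updated", lambda m: print(f"Dashboard refreshes: {m}"))

# The WMS publishes once, without knowing who listens.
bus.publish("stock.updated", {"sku": "A-1001", "qty": 42, "warehouse": "LYO-1"})
```

The point of the design is that the publisher never needs to know which applications consume the event, so adding or removing a consumer requires no new point-to-point interface.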

BPM

Re-engineering data interchanges serves no purpose unless the data flows fully match business requirements. Processes, structuring what the business does, cause data to circulate and expand, but they also feed off the data gathered. This is where Business Process Management enters the picture: a solution used not only to digitalise business processes but also to boost their agility and make them truly part of the IS. With the automation of low added-value tasks, scheduling, monitoring and functional rules, interoperability between ESB and BPM means that processes use data that is always reliable and always available, and that those processes are fully part of the information system.
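As a minimal sketch of that interplay, the toy process below chains an automated step (which in practice would fetch its data through the bus) with a human approval step. All task names and the threshold are invented for the example.

```python
# A toy BPM engine automating a low added-value step; the stock figure stands
# in for a request/reply over the ESB sketched above.
def fetch_stock(context):
    # Automated task: read current stock through the bus, not the WMS directly.
    context["stock"] = 42            # illustrative stand-in for a bus reply
    return context

def approve_reorder(context):
    # Human task placeholder: a real BPM tool would raise a worklist item.
    context["approved"] = context["stock"] < 50
    return context

process = [fetch_stock, approve_reorder]   # the modelled sequence of tasks

context = {}
for task in process:                       # the engine runs each step,
    context = task(context)                # carrying the process state along
print(context)                             # {'stock': 42, 'approved': True}
```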

Taking advantage of other aspects of data interchange

To deliver optimal performance, not just internally but also to external business partners, the use of such solutions should form part of a Service-Oriented Architecture (SOA) approach.

Structuring data interchanges also helps to build an ecosystem consistent with how the business operates internally, and provides more flexibility in services to partners.

Solutions such as API Management form part of this approach, and provide governance for API exposure. They enable data consumption to be fine-tuned, and indeed the governance, implementation and removal of APIs themselves. In this way, data interchange between partners is facilitated, scalable and structured, and data kept secure throughout its lifecycle.
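The sketch below illustrates, with an invented key store and quota, the kind of policy such a governance layer enforces in front of an exposed API; real API Management products add full lifecycle management, security policies and analytics on top.

```python
# A toy API gateway check: key validation plus a per-key quota.
API_KEYS = {"partner-123": {"quota": 3, "used": 0}}   # illustrative values

def gateway(api_key: str, resource: str):
    entry = API_KEYS.get(api_key)
    if entry is None:
        return 401, "unknown API key"
    if entry["used"] >= entry["quota"]:
        return 429, "quota exceeded"
    entry["used"] += 1
    return 200, f"data from {resource}"   # forward to the backing service

print(gateway("partner-123", "/orders"))  # (200, 'data from /orders')
print(gateway("intruder", "/orders"))     # (401, 'unknown API key')
```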

At Blueway, we firmly believe that your business challenges and the development of your information system should not be constrained by legacy systems and technical factors. Which is why we offer one platform that unifies all these aspects of data interchange, with ESB, BPM, MDM and APIM.


Successfully manage reference data

As business goes global, organisations are driven to undergo major transformations, to reorganise and merge, all of which makes an already fragmented information system even more complicated. To impose some order and make the architecture upgradeable, IT departments have re-engineered their structures around master data repositories, following a SOA (data + processes) philosophy, to make effective master data management possible.

These master data repositories also help meet three business issues:

  • Regulations in relation to data demand ever more transparency and justification;
  • The competitive environment requires businesses to always stay one step ahead of the competition. This in turn requires an ability to predict and anticipate so as to design new products and services to satisfy both customers AND shareholders;
  • Businesses are more customer-centric than ever, which entails a need for up-to-date and relevant data about customers, and to be able to slice information up and share it across various applications and functional departments, internally and externally.

What is master data?

Businesses now have the resources to store impressive quantities of data. Some of this mountain of information is useful to more than one business process, and is consequently shared within the business’ major vertical IT systems, such as ERP (Enterprise Resource Planning), CRM (Customer Relationship Management), PLM (Product Lifecycle Management), WMS (Warehouse Management Systems) and PIM (Product Information Management). However, confining data to these vertical applications impedes its cross-functional use across the organisation: actual business processes do not stop at the boundaries between vertical IT applications.

The business therefore needs to ask which data is the most important, the most critical and also the most relevant for the organisation.

The idea of master data appeared in 2003 as Master Data Management (MDM) took off.
Master data can cover a range of aspects:

  • Product data, as found in the retail sector or in manufacturing, with the addition of any services also offered
  • Third-party data held about customers and suppliers
  • Organisational data, such as employees
  • Financial data

Once this data is identified, the challenge is to build a master data repository: a centralised master record enabling continuous improvements in quality, traceability of changes and the maintenance of up-to-date master data.
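As a minimal sketch of those two promises, consolidation and traceability, the Python below merges duplicate source records into a single “golden record” while logging every change. The field names and the survivorship rule (the most recent value wins) are illustrative assumptions.

```python
# Build a golden record from duplicate sources, keeping a change trail.
from datetime import date

sources = [
    {"id": "C042", "name": "Durand SA", "city": "Lyon",  "updated": date(2020, 3, 1)},
    {"id": "C042", "name": "DURAND SA", "city": "Paris", "updated": date(2021, 7, 9)},
]

history = []   # traceability of changes to the master record
golden = {}

for record in sorted(sources, key=lambda r: r["updated"]):
    for field, value in record.items():
        if golden.get(field) != value:
            history.append((record["updated"], field, golden.get(field), value))
            golden[field] = value

print(golden)    # the consolidated, most up-to-date master record
print(history)   # when each field changed, from what, to what
```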

Five top tips for managing master data

Before implementing a Master Data Management system, it is advisable to examine the direction of travel for your master data:

  • Purpose behind storing it
  • Level of complexity
  • Stability of the system
  • Nature of objects it handles

Such an inventory should also include the volume of data, its criticality, the source and target applications, and the business processes and rules having an effect on data quality.
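Such an inventory can usefully be captured in a structured form. The sketch below is one hypothetical shape for it; every field name is an assumption made for the example, not a prescribed schema.

```python
# One possible structure for a master data inventory entry.
from dataclasses import dataclass, field

@dataclass
class MasterDataInventory:
    domain: str                    # e.g. "customers", "products"
    purpose: str                   # why the data is stored
    complexity: str                # low / medium / high
    volume: int                    # number of records
    criticality: str               # business impact if the data is wrong
    sources: list = field(default_factory=list)   # applications feeding it
    targets: list = field(default_factory=list)   # applications consuming it
    quality_rules: list = field(default_factory=list)

customers = MasterDataInventory(
    domain="customers", purpose="billing and CRM", complexity="medium",
    volume=120_000, criticality="high",
    sources=["CRM"], targets=["ERP", "WMS"],
    quality_rules=["unique VAT number", "valid postal address"],
)
print(customers)
```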

It is then a matter of establishing how this master data repository is to be run, in terms of who approves or validates data, who decides over disputes, who has which level of access rights and permissions, and how the data held in the master data repository is circulated to other systems.

The following five steps will lead to successful master data management:

1 – Measure the quality of your master data

Data quality is assessed on the basis of the expected usage, with three possible approaches, namely goal-oriented, processing-oriented and semantic-oriented classifications.

Our focus when measuring the quality of master data will be on profiling, completeness, redundancy and master data standardisation.
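As a minimal sketch of those measurements, assuming a toy record set and an invented format rule for country codes, the Python below computes completeness, redundancy and standardisation scores.

```python
# Measure completeness (filled fields), redundancy (duplicate keys) and
# standardisation (format conformity) over a toy record set.
import re

records = [
    {"id": "C1", "email": "a@ex.com", "country": "FR"},
    {"id": "C2", "email": "",         "country": "fr"},
    {"id": "C1", "email": "a@ex.com", "country": "FR"},   # duplicate of C1
]

fields = ["id", "email", "country"]
filled = sum(1 for r in records for f in fields if r.get(f))
completeness = filled / (len(records) * len(fields))

ids = [r["id"] for r in records]
redundancy = 1 - len(set(ids)) / len(ids)

standardised = sum(1 for r in records if re.fullmatch(r"[A-Z]{2}", r["country"]))
standardisation = standardised / len(records)

print(f"completeness={completeness:.0%} redundancy={redundancy:.0%} "
      f"standardisation={standardisation:.0%}")
```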

2 – Use the right metrics

The metrics used can measure performance (data entry work saved, improved productivity, increased SLA, shorter cycles, etc.) or the MDM system’s impact on IS organisation, the installation of new components, business process management, etc.

3 – Set a scope for your master data management

This is the point where you determine the milestones that will lead to achieving your objectives. For each phase in your project, it will be important to involve both the functional business departments and the IT department to ratify feasibility both technically and in terms of human resources:

  • Data housekeeping
  • Initialising and importing to MDM
  • Interoperability between the MDM and the source and target applications in your IS
  • Implementation and future developments in the MDM

4 – Initialise target data

The first major step towards implementation is to initialise the master data repository. It is possible to populate this master data repository as and when data is available to be entered or updated; equally it is possible to initialise it by adding as much data as possible once it has been built.

In both cases, creating the master data repository requires the following (a minimal sketch follows the list):

  • The host database to be defined and created;
  • The creation and instantiation of checks and confirmations required when records are created;
  • Transcoding rules and dictionaries to be set up;
  • The data forming the basis of the master data repository to be loaded.
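Here is one hypothetical, much-reduced version of the last three requirements in Python: records are checked on creation, source codes are transcoded via a dictionary, and the surviving rows are loaded into an in-memory stand-in for the host database. The codes and rules are invented for the example.

```python
# Initialise a master data repository: check, transcode, load.
transcoding = {"FRA": "FR", "F": "FR", "DEU": "DE"}   # per-source dictionary

def check(record: dict) -> bool:
    # Check instantiated for record creation: mandatory fields present.
    return bool(record.get("id")) and bool(record.get("country"))

repository = {}   # stand-in for the host database

source_rows = [
    {"id": "C1", "country": "FRA"},
    {"id": "C2", "country": "F"},
    {"id": "",   "country": "DEU"},   # rejected: fails the creation check
]

for row in source_rows:
    if not check(row):
        continue
    row["country"] = transcoding.get(row["country"], row["country"])
    repository[row["id"]] = row

print(repository)   # {'C1': {..., 'FR'}, 'C2': {..., 'FR'}}
```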

5 – Cultivate your master data repository and plan to make it interoperable with your IS

A master data repository is a living thing, continuously updated and moving forward, forming the company’s benchmark data store. It is therefore important to properly define:

  • The instantiation of checks and confirmations required when records are updated;
  • How and when data record history is kept;
  • Data deletion rules;
  • Policies to be followed for secure access to data.

Beyond these few aspects, it is advisable never to lose sight of the fact that an MDM project will handle a huge amount of data, and the rules set for including data and keeping data history can quickly generate even larger volumes.

If your information system so allows, you can design a service-oriented architecture and publish Web services to provide access to data and the ability to update it.
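As a minimal sketch of such a service, using only the Python standard library, the code below exposes read and update access to an in-memory master data store. The URL scheme and JSON format are illustrative assumptions, not a recommended production design.

```python
# Publish master data as a tiny Web service: GET to read, PUT to update.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

MASTER_DATA = {"C1": {"name": "Durand SA", "country": "FR"}}

class MasterDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):                         # e.g. GET /customers/C1
        key = self.path.rsplit("/", 1)[-1]
        record = MASTER_DATA.get(key)
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(record or {"error": "not found"}).encode())

    def do_PUT(self):                         # update via the same URL scheme
        key = self.path.rsplit("/", 1)[-1]
        length = int(self.headers.get("Content-Length", 0))
        MASTER_DATA[key] = json.loads(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), MasterDataHandler).serve_forever()
```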

In terms of resources for master data management, the approach we suggest, using the Blueway platform’s Data Governance module, is designed to be straightforward and pragmatic.


Data Quality Management (DQM): six steps to improve the quality of your data

Data quality is a major issue for organisations. Poor quality data can be expensive; research by MIT Sloan indicates that neglecting data quality can reduce revenue by 15 to 25%.

These losses can be quantified not only in terms of missed opportunities linked to poor decision-making, or reputational harm, but also legal penalties (e.g. for non-compliance) and the time spent in finding, housekeeping and correcting wrong data.

In contrast, high quality data allows businesses to improve their operational performance, boost customer satisfaction and be more competitive in swiftly reorienting business strategy if need be.

What quality criteria attach to data?

According to a report by PwC, Micropole and EBG, data quality refers to the ability of all of a data item’s intrinsic characteristics (freshness, availability, functional and/or technical consistency, traceability, security, completeness) to meet an organisation’s internal (management, decision-making, etc.) and external (regulations, etc.) requirements.

Data has no intrinsic quality. Quality can only be judged once the use to which data is to be put is known: What is the ultimate objective? How will it be processed? Is the information given any semantic meaning? In other words, quality is defined as a function of use, as expected by users.

This presupposes both high-level and detailed knowledge of business processes that span the entire organisation, and the standards in force to enable data interchange both internally and externally.

The GDPR sets well-defined limitations on the processing of personal data, throughout its lifespan. Data stored or used outside the framework set by regulation cannot be viewed as ‘quality data’ even if it does add efficiency and value to the organisation.

Considering all these points, data quality can be judged against various yardsticks including its profile, accuracy, completeness, compliance, integrity, consistency, availability, applicability, intelligibility, integration, flexibility, comparability, and so on. The list of criteria is infinitely varied!

Reasons for implementing data quality management

A data quality process is not restricted to loading the right data into information systems. It also means eliminating erroneous, corrupted or duplicate data.

While errors can have a technical cause, they are usually caused by human or organisational shortcomings at different stages in the lifecycle and different places in the IS:

  • When collecting data, through intentional or unintentional data entry errors;
  • When sharing data, by creating more than one version of a data item;
  • When exporting data, through poorly-defined rules or compatibility problems;
  • When maintaining data, through poor encoding.

Data quality management refers to the ability to provide reliable data meeting users’ functional business and technical requirements, in other words, to transform high-quality data into useful information.
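As a minimal illustration of that transformation, the sketch below normalises raw entries, rejects erroneous rows and drops duplicates; the validation rule (an email must contain “@”) is a deliberately naive assumption.

```python
# Turn raw entries into reliable data: normalise, validate, deduplicate.
raw = [
    {"id": "C1", "email": " A@EX.COM "},
    {"id": "C1", "email": "a@ex.com"},      # duplicate once normalised
    {"id": "C2", "email": "not-an-email"},  # rejected as erroneous
]

clean, seen, rejected = [], set(), []
for row in raw:
    email = row["email"].strip().lower()    # normalise before comparing
    if "@" not in email:
        rejected.append(row)
        continue
    key = (row["id"], email)
    if key in seen:
        continue                            # another version of the same item
    seen.add(key)
    clean.append({"id": row["id"], "email": email})

print(clean)      # one reliable record per real-world item
print(rejected)   # candidates for correction at the source
```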


Business Process Management software: BPM methodologies and tools

While Business Process Management is primarily itself a process, it requires technological resources that make it possible both to model business processes in detail and to run them within the organisation.

Up to about ten years ago, BPM projects were mainly the preserve of large businesses. But nowadays, technological maturity and the experience of previous implementations enable any type of business to complete a BPM project within a reasonable timescale and quickly see the benefits.

A BPM project is first and foremost a business project to support strategy. Before embarking on such a project, you must therefore ask yourself how process optimisation will improve your products and services and increase customer satisfaction.

What is Business Process Management?

Business Process Management is a discipline that examines processes by modelling them, in order to streamline and optimise procedures, allocate resources and services where they are needed, improve consistency and operational efficiency, and deliver financial benefits.

Business processes consist of a series of tasks and actions designed to achieve some predetermined outcome. The sequencing of the steps in a process can vary depending on different interaction scenarios, and require different resources, be they human or technical. A single process can therefore follow different procedures, of varying degrees of efficiency.

Processes typically span more than one department in a business. A typical example is CRM, a process that involves the marketing, sales and after-sales (or customer services) departments. Each function will describe procedures from its own point of view, without necessarily knowing what happens before or after its own involvement. It is therefore crucial to get the functional business areas, IT department and senior management to talk, and to talk the same language.

BPMN (Business Process Model and Notation):
a standard to make collaboration and execution easier

The BPMN 2.0 standard (Business Process Model and Notation version 2.0) is an internationally-recognised modelling standard that meets this need for a common language.

This simple language has the advantage of fostering cooperation and visually coordinating the logical sequences of actions and messages that exist across the various business departments, including inputs/outputs, automated tasks, manual tasks and any templates they use, sub-processes called by a parent process, routing and constraints for various tasks, and the role of each group, participant or user.
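As a minimal sketch of those building blocks, the Python below expresses an invented order process as plain data: tasks carry a role, a gateway routes on a condition, and a toy engine walks the model. Every name and threshold is illustrative.

```python
# A toy process model: tasks with roles, one gateway, walked by an engine.
process = {
    "start":   {"type": "task",    "role": "sales",     "next": "check"},
    "check":   {"type": "gateway", "cond": lambda d: d["amount"] > 1000,
                "yes": "approve",  "no": "ship"},
    "approve": {"type": "task",    "role": "manager",   "next": "ship"},
    "ship":    {"type": "task",    "role": "warehouse", "next": None},
}

def run(process, data, step="start"):
    while step:
        node = process[step]
        if node["type"] == "gateway":
            step = node["yes"] if node["cond"](data) else node["no"]
            continue
        print(f"{node['role']:<10} performs '{step}'")
        step = node["next"]

run(process, {"amount": 1500})   # large order: sales -> manager -> warehouse
```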

Built on formally defined rules, this language can also be translated with relative ease into BPEL (Business Process Execution Language), which makes it well suited to the quick design of Web services.

Two approaches to process optimisation

Implementation of an ERP (Enterprise Resource Planning) or CRM (Customer Relationship Management) package or a WMS (Warehouse Management System) can be an opportunity to optimise business processes. Depending on your business culture and its size, you might adopt one of two approaches:

  • The first approach is performance-oriented. Focusing on best practices and methodologies such as SCOR, Lean and Six Sigma, it consists of overhauling a process, either from absolute scratch or at least disregarding the existing process. This approach can result in a culture shock, and some resistance to any change. It can also entail a great deal of IT work. It is therefore helpful to produce a master plan describing the processes and how to optimise them.
  • The second approach aims for continuous improvement, focusing on an examination of existing processes without a comprehensive review, streamlining them little by little. It makes for a faster and more flexible implementation, applying iterative improvement techniques. Culture shocks are thus avoided through measurable and immediate ROI. However, implementation timescales to achieve the same final results are much longer than for the first approach.

BPM systems to accelerate business process optimisation

Nowadays, the vast majority of processes deal with data and are thus data-focused. Continuous improvement therefore requires as much attention to operational changes in processes as to data processing.

Once processes are described and modelled, they are then put live. Most execution engines include modules, some more sophisticated than others, to transport and transform data, and connectors to provide access to source and target data, regardless of where and how data is stored.

At Blueway, we firmly believe that rolling out a Business Process Management or Process Governance strategy requires a number of components to be combined:

  • A hybrid workflow design package, one that applies the BPMN2 standard
  • An ESB (Enterprise Service Bus) to identify steps that can be improved, and then automate certain sequences connected to IS applications
  • An MDM system to take charge centrally of all the complex issues relating to data management, quality and traceability.

Many functional modules can be combined to accelerate business process optimisation. These include BAM (Business Activity Monitoring), which is used to proactively check and measure functional, technical and organisational service levels, and CEP (Complex Event Processing), which is used to manage large numbers of business rules and variables.


Workflow software selection criteria

A workflow is a graphical representation or model of a business process produced using workflow software.

The sequencing of tasks, and the interactions between internal and external parties involved and the IS, can be entirely fixed; this situation is referred to as a procedural (or linear) workflow. When a workflow is more dynamic, meaning it adapts to events and employee decisions, it is an ad-hoc workflow. In both cases, a workflow can be used to automate a process and/or align an application with actual business requirements.

Workflow and BPMN 2.0: on the road to automation

So that the various business functions and the IT department speak the same language, the BPMN (Business Process Model and Notation) standard is very often used to model processes. It includes structured symbols and diagrams for processes, collaborations, conversations and choreographies: it could be said to be the “score” for a process.

Since 2011, version 2 of the standard (since published as ISO/IEC 19510) has given this language an XML-based data interchange format, enabling executable models to be converted into BPEL, which is designed to automate and implement application processes.

Most BPM packages have now adopted this standard for modelling and executing business processes.
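As a minimal sketch of what that interchange format looks like in practice, the Python below parses a drastically simplified BPMN 2.0 fragment to list its tasks and sequence flows. Only the standard namespace URI is real; the process content is invented for the example.

```python
# Read tasks and sequence flows from a (much simplified) BPMN 2.0 fragment.
import xml.etree.ElementTree as ET

BPMN = """<definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:process id="orders">
    <bpmn:task id="t1" name="Enter order"/>
    <bpmn:task id="t2" name="Ship order"/>
    <bpmn:sequenceFlow sourceRef="t1" targetRef="t2"/>
  </bpmn:process>
</definitions>"""

NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
root = ET.fromstring(BPMN)
for task in root.iter("{%s}task" % NS):
    print("task:", task.get("name"))
for flow in root.iter("{%s}sequenceFlow" % NS):
    print("flow:", flow.get("sourceRef"), "->", flow.get("targetRef"))
```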

What does workflow software do?

Designing workflows using BPM software is the first step towards the automation of some processes within a business: not only repetitive, low added-value tasks, but also “smart” task sequences that can now be handled using AI and machine learning, under the banner of RPA (Robotic Process Automation).

Workflow automation links a process’ tasks, data and interchanges into a sequential order clearly defined by a set of business rules.

This orchestration serves to improve the approval circuit and the meeting of deadlines by ensuring all involved in a process have the information they need to perform their tasks properly. Thanks to the traceability of interchanges, managers can also measure performance and pinpoint any issues.

Workflows are very often managed “manually” in application code itself, making maintenance tricky. A workflow engine combined with a business rules engine consequently makes it possible to run a process definition, using executable BPEL for example, and to interface it with applications or other workflow management systems. It then becomes possible to optimise these processes regularly and implement their workflows rapidly, and correctly.

What is the difference between a workflow engine and a business rules engine?

Workflow engines and business rules engines are both vital tools in the automation of complex processes, and are often confused. To keep matters simple, a workflow engine is a router used to execute various instances of the workflow either sequentially or conditionally (forked pathways) based on a set of business rules. The system can manage the definition of these rules when they are straightforward and relatively few in number, e.g. Boolean operators, data fields in a process, values entered and so on.

When routing is more complicated, the workflow engine can be connected to a business rules engine able to handle a larger number of more involved or changeable business rules, entered in natural language. The workflow engine could be considered as the pilot and the business rules engine as air traffic control.
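As a minimal sketch of that pilot and air-traffic-control split, in the Python below the workflow engine only routes, delegating every decision to a separate rules store; the rule and the tasks are invented for the example.

```python
# Separate the workflow engine (routing) from the rules engine (deciding).
RULES = {   # business rules kept outside the workflow definition
    "needs_approval": lambda d: d["amount"] >= 500 or d["customer"] == "new",
}

def rules_engine(rule_name: str, data: dict) -> bool:
    return RULES[rule_name](data)

def workflow_engine(data: dict) -> None:
    print("task: record request")
    if rules_engine("needs_approval", data):   # the engine asks, never decides
        print("task: manager approval")
    print("task: archive request")

workflow_engine({"amount": 800, "customer": "existing"})
```

Changing the routing policy then means editing a rule, not rewiring the workflow itself.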

Making workflow software fully part of the IS: an important factor

Examining a technical or hybrid process entails an analysis of a great number of functionalities, methods, applications and departments in a business.

This is why, from process modelling to execution, it is vital to ensure that the workflow solution used really is part of the IS. It is crucial to confirm that your package provides interoperability and data normalisation, and comes with technical connectors (Web services, databases, text & XML files, LDAP, email, etc.) and functional connectors (to ERP, CRM, SCM and other applications) reducing development and integration times.

The functional architecture needs to be built around an ESB if this is to be achieved. The application bus will route data interchanges and ensure the persistence of messages exchanged.

How to choose a workflow package

While one package is much like another in terms of look-and-feel, good workflow software must include the following functionalities to effectively automate your processes at a reasonable cost:

  • A workflow design studio including hybrid process modelling or creation tools (user interface) that apply the BPMN 2.0 standard
  • WYSIWYG tools for mouse-controlled building of data input screens, forms, the user interface generally, CSS portals, etc.
  • Simulation tools allowing fine-tuning, testing, de-bugging, etc.
  • BPM accelerator modules (CEP, BAM, MDM, Mash-up, BRMS, etc.)
  • Flow orchestration and automation functions using RPA
  • Dynamic routing functions together with processing supervision: alert handling (emails, text messages), delegation management (roles and responsibilities) and traceability
  • A collaborative scheduling portal for all involved: visibility over the work they need to do or finish off, plus a view of the KPIs
  • Organisational reference base: process documentation, procedures, organisational memos, operating instructions, notices, etc.
