Data intelligence has been in vogue for decades, and for good reason. The impact of intelligent technologies, especially AI, on business productivity and efficiency has been significant. In fact, per McKinsey & Company research, nearly 70% of companies across industries and business functions are expected to implement AI by 2030, with AI adding about 1.2% to global GDP growth each year and delivering “additional economic activity of around $13 trillion by 2030.”

Considering these statistics and how determined businesses are to carry through their transformation goals, it is crucial to ask whether organizations are ready for this shift. Our last blog, ‘Moving from legacy software to data intelligence’, highlighted how legacy systems limit business growth and impede digital transformation. In this article, the focus is on the importance of master data management.


Poor-Quality Data – More Than Just a Waste

To harness the full potential of data intelligence, businesses must not only record the data generated across the supply chain but also provide relevant and meaningful access to quality data. A 2018 Gartner study suggests that nearly 70% of enterprise business leaders can already guarantee access to end-to-end supply chain data.

What is missing, however, is access to quality data.

For years, businesses have implemented specialized software such as ERP, CRM, and WMS to manage diverse supply chain processes – customer relationship management, demand planning, pricing management, production planning, etc. As such systems proliferate, so does the data. Because these systems are disjointed, they do not provide a single view of the supply chain data. Moreover, with lax controls and different data governance guidelines for different systems, standardizing the data becomes a challenge. For instance, the same SKU may be listed as SKU123, SKU 123, or Sku123 in different systems. These issues only grow in scale and complexity as the organization grows.
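As a minimal illustration of the standardization problem, the snippet below collapses such formatting variants of an SKU into one canonical key. The normalization rules shown (trimming whitespace, dropping spaces and hyphens, upper-casing) are assumptions for the example, not a prescribed standard.

```python
import re

def normalize_sku(raw: str) -> str:
    """Collapse common formatting variants of an SKU to one canonical form."""
    # Strip surrounding whitespace, drop internal spaces and hyphens, and
    # upper-case, so "SKU123", "SKU 123", and "Sku123" all map to "SKU123".
    return re.sub(r"[\s\-]+", "", raw.strip()).upper()

variants = ["SKU123", "SKU 123", "Sku123", " sku-123 "]
print({v: normalize_sku(v) for v in variants})
# Every variant resolves to the same canonical key: 'SKU123'
```

A shared normalization routine like this, applied at the point of entry in every system, keeps the same product from splintering into multiple identities.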

The result of this fragmentation:

  • With data residing in silos, quick information retrieval from any touch-point of the organization is difficult.
  • Without centralized and integrated management of data, fragmentation, duplication, and inaccuracy are common. Moreover, when data is updated in one system but not immediately reflected across the others, inconsistencies emerge.

Further, business decisions today require a data-centric approach, with conclusions drawn through data analysis rather than gut feel and intuition. Such decision making requires quick access to the most up-to-date information, on demand and from anywhere. Without access to trusted data, even basic questions such as “Which product had the best margin in the last quarter?”, “Which suppliers in the North region defaulted last season?”, or “From which plant should Order 1214 be dispatched?” can be difficult to answer.

This cripples the responsiveness of decision making and can lead to graver consequences, especially when decisions are based on the wrong data.

Poor-quality data also acts as a barrier to utilizing data intelligence. Big data technologies require massive volumes of clean and standardized data. Planning optimization algorithms, for instance, operate on a flawed data space when denied the right data and lose the desired optimization advantage. Consequently, the organization’s overarching business need, whether order fulfilment, cost reduction, or increased operational throughput, is heavily compromised.


The Way Forward – Master Data Management

Gartner defines master data management (MDM) as:

“A technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise’s official shared master data assets.”

A comprehensive MDM strategy provides a single, consistent view of the business’s master data that is accessible to all. This ensures:

  1. A stable and consistent system of record for the critical data of the supply chain. Master data may span the supply chain, comprising customer, supplier, partner, or location data, or subsets of such data. Alternatively, it may be defined per supply chain function. For instance, plant location, model, and distance master data are useful data sets for dispatch planning, while production cost, plant-model, and model lot masters are more relevant to production planning. With such a centrally managed data repository, information flows efficiently between processes and domains, ensuring access to reliable information across supply chain functions.
  2. A framework for validating operational data. All operational data can be validated against the master data for accuracy and consistency. In Verdis Production Planning and Control (PPC) and Dispatch Allocation and Planning (DAP), for instance, the master data is maintained through a centralized command, and the data in operational files is scrutinized against this record through machine logic for redundancies and inconsistencies. In the Verdis PPC environment, for example, errors such as a wrong product specification in the sales data are detected on cross-referencing with the product master data. A strict validation and master mapping protocol ensures that optimization algorithms operate only on a reliable data field and utilize the full potential of the data (see the sketch after this list).
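To make the validation idea concrete, here is a simplified, hypothetical sketch of checking operational records against a product master. The field names, record structures, and rules are illustrative assumptions only and do not represent Verdis’s actual implementation.

```python
# Hypothetical master and operational data; field names are assumptions.
product_master = {
    "SKU123": {"category": "widget", "unit": "EA"},
    "SKU456": {"category": "gadget", "unit": "BOX"},
}

sales_orders = [
    {"order_id": "1214", "sku": "SKU123", "unit": "EA"},
    {"order_id": "1215", "sku": "SKU999", "unit": "EA"},  # SKU missing from master
    {"order_id": "1216", "sku": "SKU456", "unit": "EA"},  # unit disagrees with master
]

def validate(orders, master):
    """Flag operational records that do not reconcile with the master data."""
    errors = []
    for order in orders:
        ref = master.get(order["sku"])
        if ref is None:
            errors.append((order["order_id"], "SKU not found in product master"))
        elif order["unit"] != ref["unit"]:
            errors.append((order["order_id"],
                           f"unit {order['unit']} conflicts with master unit {ref['unit']}"))
    return errors

for order_id, issue in validate(sales_orders, product_master):
    print(order_id, "->", issue)
```

Records that fail such checks can be quarantined for stewardship review before they ever reach a planning or optimization run.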

A multi-pronged strategy combining data standardization, consolidation, cleansing, and de-duplication across internal and external systems ensures that data throughout the enterprise is accurate (a minimal de-duplication sketch follows the list below). Further, by promoting data accountability and establishing clear, defined governance rules, businesses can implement a uniform data management practice. Consequently, other benefits follow.

  • A culture of visibility and transparency is established that supports quick information access for faster decision making. This in turn supports better and quicker risk responses.
  • With cross-functional collaboration, plans and decisions are better coordinated.
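Returning to the cleansing and de-duplication step mentioned above, the sketch below merges duplicate records arriving from two systems by normalizing a shared natural key. The choice of key (a tax ID), the field names, and the survivorship rule are assumptions for illustration only.

```python
def canonical_key(record: dict) -> str:
    # Normalize the matching key so cosmetic differences do not split records.
    return record["tax_id"].replace("-", "").replace(" ", "").upper()

# Hypothetical records for the same customer held in two different systems.
crm_records = [{"tax_id": "12-345", "name": "Acme Ltd", "phone": "555-0100"}]
erp_records = [{"tax_id": "12 345", "name": "ACME LTD.", "phone": None}]

merged = {}
for record in crm_records + erp_records:
    key = canonical_key(record)
    survivor = merged.setdefault(key, {})
    # Simple survivorship rule: keep the first non-empty value seen per field.
    for field, value in record.items():
        if value and not survivor.get(field):
            survivor[field] = value

print(merged)
# {'12345': {'tax_id': '12-345', 'name': 'Acme Ltd', 'phone': '555-0100'}}
```

In practice the matching logic is usually fuzzier and the survivorship rules richer, but the principle is the same: one golden record per real-world entity.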

Conclusion

The path to data intelligence is paved not only with an overhaul of legacy systems but also with a data-first approach that prioritizes relevant and quick access to quality data. Through comprehensive master data management, businesses can ensure the standardization, completeness, and consistency of supply chain data across geographies and IT systems, establishing a coherent and accurate field of data that supply chain stakeholders and employees alike can reference on demand, anytime and anywhere. The result of such an endeavour is a single, consistent view of data that supports data-centred decision making and lets big data technologies deliver their maximum value.