Mastering Master Data.

Why?

Gartner defines master data as a consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise.

This places master data at the heart of every business process within an organisation, and because it is used across multiple systems and processes, bad master data has far-reaching effects on those processes.

The negative impacts of unmanaged master data can easily be identified at almost every level of an organization’s activities: they impair decision-making processes, hence hurting performance and customer and product profitability, but they are also felt at the operational level, reducing productivity and efficiency and creating compliance risks.

Master Data Quality

In the siloed ecosystem that is still the most common scenario in organizations, master data is scattered across multiple systems, governed by different rules, managed by different processes, collected and updated under different conditions, and replicated through ad-hoc processes, unavoidably leading to a lack of quality.

By definition, data that lacks quality is unfit for use. Considering that master data is a critical part of most of an organization’s processes, from the factory floor to the boardroom, it is easy to see how broad the impacts of bad master data are.

Traditionally, data quality is evaluated along six dimensions: completeness, accuracy, consistency, validity, uniqueness, and integrity.

Looking at customer data – the most common master data domain, and usually the priority when starting a master data management initiative – through each of these dimensions gives an overall picture of what unmanaged master data looks like.

Completeness – This dimension measures whether the data is sufficient for its purpose.

Each business area (or system) within an organization has its own business purposes and its own data requirements, which means that not every piece of customer information will be needed in every application across the organization. Seen locally, some of these systems may well hold a set of customer data that is complete for their needs; seen from a broader, corporate perspective, this is most likely not true. This usually becomes noticeable when compliance requirements don’t completely overlap with business requirements.
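As a minimal illustration of this idea (the field names and requirement sets below are hypothetical, not taken from any real system), a completeness check can be as simple as comparing a customer record against what each consumer of the data actually requires:

```python
# Minimal completeness check; field names and requirement sets are illustrative.
CRM_REQUIRED = {"name", "email"}                              # enough for the sales team
CORPORATE_REQUIRED = {"name", "email", "tax_id", "country"}   # compliance view

customer = {"name": "Acme Ltd", "email": "billing@acme.example", "country": "PT"}

def missing_fields(record: dict, required: set) -> set:
    """Return the required fields that are absent or empty in the record."""
    return {field for field in required if not record.get(field)}

print(missing_fields(customer, CRM_REQUIRED))        # set() -> complete for the CRM
print(missing_fields(customer, CORPORATE_REQUIRED))  # {'tax_id'} -> incomplete corporately
```

The same record can therefore be "complete" and "incomplete" at the same time, depending on whose requirements are applied.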

Accuracy – This dimension measures the degree to which data reflects the real-world entity it describes.

As with the previous dimension, the needs and rules under which customer data is collected differ from system to system, leading to different degrees of accuracy and, in turn, affecting the next dimension, consistency.
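As a rough sketch, accuracy is often assessed by comparing stored values against an authoritative reference; both the reference lookup and the customer records below are made up for the example:

```python
# Illustrative accuracy check against a trusted reference (a postal-code lookup).
POSTAL_REFERENCE = {"1000-001": "Lisboa", "4000-322": "Porto"}

records = [
    {"id": 1, "postal_code": "1000-001", "city": "Lisboa"},  # matches the reference
    {"id": 2, "postal_code": "4000-322", "city": "Lisbon"},  # does not match the real-world value
]

for record in records:
    expected = POSTAL_REFERENCE.get(record["postal_code"])
    accurate = expected is not None and expected == record["city"]
    print(record["id"], "accurate" if accurate else f"inaccurate (expected {expected})")
```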

Consistency – This dimension measures the degree to which the same data matches across multiple instances.

Consistency is probably the biggest issue associated with siloed architectures. Even with well-built integration processes and real-time replication of data, the differences between the multiple systems (attributes, formats, etc.) will prevent a consistent version of customer data from existing within the organization.
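A minimal sketch of a consistency check, assuming two systems hold the same customer under hypothetical attribute names and formats, is to normalise the values before comparing them:

```python
# Consistency check across two systems; system names, attributes and formats are assumptions.
crm_record = {"name": "ACME, Lda.", "phone": "+351 210 000 000", "country": "Portugal"}
erp_record = {"name": "Acme Lda",   "phone": "+351 213 555 000", "country": "PT"}

COUNTRY_CODES = {"portugal": "PT", "pt": "PT"}

def normalise(record: dict) -> dict:
    """Reduce formatting differences before comparing values."""
    return {
        "name": "".join(c for c in record["name"].lower() if c.isalnum()),
        "phone": "".join(c for c in record["phone"] if c.isdigit())[-9:],
        "country": COUNTRY_CODES.get(record["country"].lower(), record["country"]),
    }

a, b = normalise(crm_record), normalise(erp_record)
print({field for field in a if a[field] != b[field]})   # {'phone'} -> a real inconsistency
```

Note that name and country only agree after normalisation; without it, every field would look inconsistent, which is exactly the noise a siloed landscape produces.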

Validity – This dimension measures the level of compliance with specific domains or requirements.

With different needs and requirements, the rules governing customer data will inevitably differ from system to system; data that is valid in one context may be unusable in another.
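As a small, hypothetical example, a validity check applies domain rules (formats, allowed values) to each field; the rules below are assumptions, not rules from any real system:

```python
import re

# Illustrative validity rules; each real system will enforce its own.
RULES = {
    "email":   lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"PT", "ES", "FR", "DE"},          # allowed domain of values
    "vat":     lambda v: re.fullmatch(r"PT\d{9}", v) is not None,
}

customer = {"email": "billing@acme.example", "country": "Portugal", "vat": "PT123456789"}

invalid = {field for field, rule in RULES.items() if not rule(customer.get(field, ""))}
print(invalid)   # {'country'}: a full country name may be valid in one system, invalid where a code is required
```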

Uniqueness – This dimension measures the degree to which a single instance of data exists for each entity.

Duplicates are one of the most common issues with customer data, whether created as multiple instances within a single system over time or created in different systems and then replicated. This can only be addressed from a global perspective, to ensure that a single instance exists for each customer.
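A minimal duplicate-detection sketch, using hypothetical customer records, groups records from different systems by a normalised matching key; real matching typically relies on far more sophisticated, fuzzy rules:

```python
from collections import defaultdict

# Hypothetical records from different systems; the naive matching key is purely illustrative.
records = [
    {"system": "CRM",     "id": "C-17",  "name": "Acme, Lda.", "email": "Billing@Acme.example"},
    {"system": "ERP",     "id": "40021", "name": "ACME LDA",   "email": "billing@acme.example"},
    {"system": "Billing", "id": "B-9",   "name": "Beta SA",    "email": "info@beta.example"},
]

def match_key(record: dict) -> tuple:
    """Normalise name and email so trivially different records collide on the same key."""
    name = "".join(c for c in record["name"].lower() if c.isalnum())
    return (name, record["email"].lower())

groups = defaultdict(list)
for record in records:
    groups[match_key(record)].append(f'{record["system"]}:{record["id"]}')

print([ids for ids in groups.values() if len(ids) > 1])   # [['CRM:C-17', 'ERP:40021']]
```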

Integrity – This dimension measures the degree to which the relationships between attributes are correctly maintained.

The data elements that compose an entity, in this case a customer, are related and change over time; the entity is considered complete and valid only when all its elements, in all its instances, respect these quality dimensions and business requirements. Again, in a siloed environment, with multiple collection and update points, different rules, and replication processes, integrity is a hard dimension to maintain.
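As a last illustrative sketch (the tables and keys are hypothetical), an integrity check verifies that related data elements still reference each other correctly and that related attributes remain coherent:

```python
# Illustrative referential-integrity check between related customer data.
customers = {"C-17": {"name": "Acme, Lda.", "country": "PT"}}

addresses = [
    {"customer_id": "C-17", "country": "PT", "city": "Lisboa"},
    {"customer_id": "C-99", "country": "PT", "city": "Porto"},   # orphan: no such customer
]

for address in addresses:
    owner = customers.get(address["customer_id"])
    if owner is None:
        print(f'{address["customer_id"]}: broken relation (customer does not exist)')
    elif owner["country"] != address["country"]:
        print(f'{address["customer_id"]}: related attributes disagree on country')
    else:
        print(f'{address["customer_id"]}: integrity OK')
```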

Mastering Master Data. What?

Without trying to be exhaustive about the impacts of unmanaged master data, I hope I was able to give an idea of the importance of having master data – which is, after all, the core business data of any organisation – properly managed.

Next, I’ll move on to the What: what master data management is and how it positively impacts the usage of this core data.