Global MDM versus Local BPM

The linkage between Master Data Management (MDM) and Business Process Management (BPM) was intensively discussed at a workshop at an MDM conference organized by Marcus Evans in Barcelona, Spain, today. More than 30 master data professionals from a range of large, mainly European, companies attended the workshop.

There was broad agreement that the intersection between MDM and BPM is growing – and should be.

One of the challenges identified is that MDM tends to be global within the enterprise while BPM tends to be local.

The global versus local theme has frequently been mentioned as a challenge over the decade MDM has existed as a discipline. The core MDM global versus local challenges span common definitions, common value tables and common data models across different geographies. Having a mix of common business rules and business rules that have to be local adds to the difficulties. When you add the full impact of business process management, with its variety of formal and informal organizational structures, decision rules and working cultures, there are certainly both wins and obstacles in linking MDM and BPM.
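
As a thought experiment, here is a minimal sketch of how such a split could look in a rule set, assuming a simple design where global value tables apply everywhere while rules like postal code formats are overridden per country. All field names and patterns are illustrative, not taken from any particular MDM product:

```python
import re

# Global value tables: common definitions shared across all geographies.
GLOBAL_RULES = {
    "customer_type": {"B2B", "B2C", "GOVERNMENT"},
    "currency": {"EUR", "USD", "GBP", "DKK"},
}

# Local rules that legitimately differ per country, e.g. postal code formats.
LOCAL_RULES = {
    "DK": {"postal_code": r"^\d{4}$"},
    "GB": {"postal_code": r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$"},
}

def validate(record: dict, country: str) -> list:
    """Apply the global rules first, then any local rules for the country."""
    errors = []
    for field, allowed in GLOBAL_RULES.items():
        if record.get(field) not in allowed:
            errors.append(f"{field}: value not in global value table")
    for field, pattern in LOCAL_RULES.get(country, {}).items():
        if not re.match(pattern, str(record.get(field, ""))):
            errors.append(f"{field}: fails local rule for {country}")
    return errors
```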

I think the commonly used phrase about thinking globally and acting locally makes sense in the intersection between MDM and BPM. Thinking big and starting small helps too.

Business Agility, Continuous Improvement and MDM

Being able to react to market changes in an agile way is the path to the survival of your business today. As you may not nail it on the first go, the ability to correct course with continuous improvement is the path for your business to stay alive.

Doing business process improvement most often involves master data, as examined in the post Master Data and Business Processes. The people side of this is challenging. The technology side isn’t a walkover either.

When looking at Master Data Management (MDM) platforms in sales presentations, it seems very easy to configure a new way of orchestrating a business process. You just drag and drop some states and transitions in a visual workflow manager. In reality, even when looking solely at the technical side, it is much more painful.
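
To be fair, the demo-friendly part really can be that small. A minimal sketch, with purely illustrative states and events, could look like this – which is exactly why it seems deceptively easy:

```python
# state: {event: next state} for a master data approval workflow.
TRANSITIONS = {
    "draft":     {"submit": "in_review"},
    "in_review": {"approve": "approved", "reject": "draft"},
    "approved":  {"publish": "published"},
}

def next_state(state: str, event: str) -> str:
    try:
        return TRANSITIONS[state][event]
    except KeyError:
        raise ValueError(f"Event '{event}' is not allowed in state '{state}'")
```

Everything this sketch leaves out is where the pain sets in.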

MDM solutions can be hard to maneuver. You have to consider existing data and the data models where the data sits. Master data is typically used with various interfaces across many business functions and business units. There are usually many system integrations running around the MDM component in an IT landscape.

A successful MDM implementation does not just cure some pain points in business processes. The solution must also be maneuverable to support business agility and continuous improvement. Some of the data quality and data governance aspects of this are explored in the post Be Prepared.

Master Data and Business Processes

The intersection of Master Data Management (MDM) and Business Process Management (BPM) is a very interesting aspect of implementing MDM solutions.

We may divide this battleground into three sectors, as sketched in code below:

  • Business processes that purely consume master data
  • Business processes that potentially change master data
  • Business processes that purely update master data
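
For illustration only, the three sectors could be captured as a simple classification – the example processes are the ones discussed in the sections that follow:

```python
from enum import Enum

class MasterDataInteraction(Enum):
    CONSUME = "purely consumes master data"
    MAYBE_CHANGE = "potentially changes master data"
    UPDATE = "purely updates master data"

# Illustrative examples only:
EXAMPLES = {
    "direct marketing campaign": MasterDataInteraction.CONSUME,
    "order-to-cash": MasterDataInteraction.MAYBE_CHANGE,
    "product data enrichment": MasterDataInteraction.UPDATE,
}
```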

[Figure: BPM and MDM intersection]

Business processes that purely consume master data

An example of such a business process is the execution of a direct marketing campaign. Doing this effectively is heavily dependent on clean and updated master data. A key capability is the ability to separate which targeted real-world entities belong to the so-called “new market” and which are existing customers (or prospects or churned customers). When working with known customers, the ability to intelligently relate to previously purchased products and their categories of interest is paramount. Often, knowing the right relation between targeted parties and locations is very valuable.
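
A minimal sketch of that “new market” split could look like this, assuming a naive match key of normalized name and postal code; real MDM matching would of course use far more robust techniques:

```python
def match_key(party: dict) -> tuple:
    """A naive match key; real MDM matching is far more robust."""
    return (party["name"].strip().lower(), party["postal_code"].strip())

def segment(targets: list, customer_master: list):
    """Split campaign targets into known parties and the 'new market'."""
    known_keys = {match_key(c) for c in customer_master}
    known, new_market = [], []
    for target in targets:
        (known if match_key(target) in known_keys else new_market).append(target)
    return known, new_market
```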

When doing MDM implementations and ongoing refinement, insight into how master data are used and create value in business processes is the starting point.

Business processes that potentially change master data

The most commonly mentioned end-to-end business process is the order-to-cash process. During that process, especially customer master data may be affected. A key question is whether the order is placed by a new customer or a known customer. If it truly is a new customer, then effective collection of accurate and timely master data determines the successful outcome of receiving the cash, based on a correct credit check, correct shipping information and more. If it is a known customer, this is a chance to validate and, if needed, update customer master data.
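
A minimal sketch of that master data touch point, with illustrative field names and a stand-in credit check, could be:

```python
def run_credit_check(order: dict) -> bool:
    """Stand-in for a call to a real credit check service."""
    return True

def on_order_received(order: dict, customer_master: dict) -> None:
    key = order["customer_key"]
    customer = customer_master.get(key)
    if customer is None:
        # Truly new customer: collect accurate master data up front,
        # since credit check and shipping depend on it.
        customer_master[key] = {
            "name": order["bill_to_name"],
            "address": order["ship_to_address"],
            "credit_ok": run_credit_check(order),
        }
    elif customer["address"] != order["ship_to_address"]:
        # Known customer: a chance to validate and, if needed, update.
        customer["address"] = order["ship_to_address"]  # or queue for review
```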

While customer master data are often changed through business processes that have another main purpose, this is not the case with product master data.

Business processes that purely update master data

An example is found within manufacturing, distribution and retail, where we have business processes with the sole purpose of enriching product master data. With the rise of customer self-service through e-commerce, the data quality requirements for completeness and other data quality dimensions have increased a lot. This makes the orchestration of complex business processes for enriching product master data a whole new flavour of Business Process Management, where master data itself is the outcome – of course in order to be optimally used in order-to-cash and other business processes.
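
A minimal sketch of such a completeness gate, with illustrative attribute requirements for a webshop channel, could be:

```python
# Required attributes per channel are illustrative.
REQUIRED_FOR_WEBSHOP = ["name", "description", "image_url", "weight_kg", "category"]

def completeness(product: dict, required: list) -> float:
    """Share of the required attributes that are actually populated."""
    filled = sum(1 for attr in required if product.get(attr) not in (None, ""))
    return filled / len(required)

def ready_to_publish(product: dict) -> bool:
    return completeness(product, REQUIRED_FOR_WEBSHOP) == 1.0
```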

PS: If you are interested in discussing BPM and MDM alignment on La Rambla in Barcelona on the 22nd April 2015, here is the chance.

Three Stages of MDM Maturity

If you haven’t yet implemented a Master Data Management (MDM) solution, you typically hold master data in dedicated solutions for Supply Chain Management (SCM), Enterprise Resource Planning (ERP), Customer Relationship Management (CRM) and heaps of other solutions aimed at taking care of some part of your business, depending on your particular industry.

MDM Stage 1
Multiple sources of truth

In this first stage some master data flows into these solutions from business partners in different ways, flows around between the solutions inside your IT landscape and flows out to business partners directly from the various solutions.

The big pain in this stage is that a given real-world entity may be described very differently when coming in, when used inside your IT landscape and when presented by you to the outside. Additionally, it is hard to measure and improve data quality, and there may be several different business processes doing the same thing in alternative ways.

The answer today is to implement a Master Data Management (MDM) solution. When doing that, you may to some degree rearrange the way master data flows into your IT landscape, you move the emphasis on master data management from the SCM, ERP, CRM and other solutions to the MDM platform, you orchestrate the internal flows differently, and you are most often able to present a given real-world entity in a consistent way to the outside.
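
At the heart of this rearrangement is consolidating the duplicate records held in the various solutions into one golden record. A minimal sketch, assuming a naive “most recently updated non-empty value wins” survivorship rule, could look like this:

```python
def golden_record(duplicates: list) -> dict:
    """Merge duplicate records; the most recently updated non-empty value wins."""
    golden = {}
    # Oldest records first, so values from later records overwrite earlier ones.
    for record in sorted(duplicates, key=lambda r: r["updated_at"]):
        for field, value in record.items():
            if field != "updated_at" and value not in (None, ""):
                golden[field] = value
    return golden
```

Real survivorship policies would weigh source trustworthiness per attribute, but the principle is the same.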

MDM Stage 2
Striving for a single source of truth

In this second stage you have cured the pain of inconsistent presentation of a given real-world entity, and as a result you are in a much better position to measure and control data quality. But typically you haven’t gained much in operational efficiency.

You need to enter a third stage. MDM 3.0 so to speak. In this stage you extend your MDM solution to your business partners and take much more advantage of third party data providers.

MDM Stage 3
Single place of trust

The master data kept by any organization is to a large degree a description of real-world entities that are also digitalized by business partners and third-party data providers. Therefore there are huge opportunities in reengineering your business processes for master data collection and interactively sharing master data, with mutual benefits for you and your business partners. These opportunities are touched upon in the post MDM 3.0 Musings.

Automate or Obliterate, That is the Question

Back in 1990 Michael Hammer wrote a famous article called Reengineering Work: Don’t Automate, Obliterate.

Indeed, while automation is a most wanted outcome of Master Data Management (MDM) implementations and many other IT enabled initiatives, you should always consider the alternative: eliminating (or simplifying). This often means thinking out of the box.

As an example, I today stumbled upon the Wikipedia explanation of Business Process Mapping. The example used is how to make breakfast (the food part):

[Figure: Business process map for making breakfast]

You could think about different Business Process Re-engineering opportunities for that process. But you could also realize that this is an English / American breakfast. What about making a French breakfast instead? It will be as simple as:

Input money > Buy croissant > Fait accompli

PS: From the data quality and MDM world one example of making French breakfast instead of English / American breakfast is examined in the post The Good, Better and Best Way of Avoiding Duplicates.

Be Prepared

Working with data governance and data quality can be a very backward-looking quest. It often revolves around how to avoid a repeat of a recent data disaster, or catching up with the organizational issues, the process orchestration and the new technology implementations needed to support current business objectives with current data types in a better way.

This may be hard enough. But you must also be prepared for the future.

The growth of available data to support your business is a challenge today. Your competitors take advantage of new data sources and better exploitation of known data sources while you are sleeping. New competitors emerge with business ideas based on new ways of using data.

The approach to inclusion of new data sources, data entities, data attributes and digital assets must be part of your data governance framework and data quality capability. If you are not prepared for this, your current data quality will not only be challenged by the decay of current data elements, but also by insufficiently governed new data elements, or by a lack of business agility because you can’t include new data sources and elements in a safe way.

Some essentials in being prepared for inclusion of new kinds of data are:

  • A living business glossary that facilitates a shared understanding of new data elements within your organization, including how they relate to or replace current data elements.
  • Configurable data quality measurement facilities, data profiling functionality and data matching tools, so on-boarding a new data element doesn’t require a new data quality project (a sketch of this follows the list).
  • Self-service and automation being the norm for data capture and data consumption. Self-service must be governed both internally in your organization and externally, as explained in the post Data Governance in the Self-Service Age.
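
A minimal sketch of what configuration-driven measurement could look like – rule types and field names are purely illustrative:

```python
import re

# Rules as configuration: a new data element is on-boarded by adding an
# entry here, not by writing new code.
RULES = [
    {"field": "email", "type": "pattern", "pattern": r"^[^@\s]+@[^@\s]+$"},
    {"field": "country_code", "type": "allowed", "values": {"DK", "DE", "GB"}},
    # A new data element on-boarded by configuration alone:
    {"field": "social_handle", "type": "pattern", "pattern": r"^@\w{1,30}$"},
]

def passes(record: dict, rule: dict) -> bool:
    value = str(record.get(rule["field"], ""))
    if rule["type"] == "pattern":
        return re.match(rule["pattern"], value) is not None
    return value in rule["values"]

def measure(records: list) -> dict:
    """Share of records passing each configured rule."""
    total = max(len(records), 1)
    return {r["field"]: sum(passes(rec, r) for rec in records) / total
            for r in RULES}
```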

The Matrix

The data governance discipline, the Master Data Management (MDM) discipline and the data quality discipline are closely related and happen to be my fields of work, as told in the post Data Governance, Data Quality and MDM.

Every IT enabled discipline has an element of understanding people, orchestrating business processes and using technology. The mix may vary between disciplines. This is also true for the three above-mentioned disciplines.

But how important are people, process and technology within these three disciplines? Are the disciplines very different in that perspective? I think so.

When assigning a value from 1 (less important) to 5 (very important) for Data Governance (DG), Master Data Management (MDM) and Data Quality (DQ) I came to this result:

[Figure: The matrix – scores for people, process and technology in DG, MDM and DQ]

A few words about the reasoning for the highs and lows:

Data governance is in my experience a lot about understanding people and less about using technology as told in the post Data Governance Tools: The New Snake Oil?

I often see arguments that data quality is all about people too. But:

  • I think you are really talking about data governance when putting the people argument forward in the quest for achieving adequate data quality.
  • I see little room for letting the personal opinions of different people dictate what adequate data quality is. This should really be as objective as possible.

Now I am ready for your relentless criticism.

Data Quality X-mas Stories

Today is the 2nd of December and time for the 2nd x-mas theme on this blog this year, following up on the early yuletide post about The Shortcut to Lapland.

In a way it is not in line with diversity, a main subject on this blog, to focus too much on Christmas, as I know that many readers may have, for example, Eid, Diwali or Chinese New Year as the main days of celebration during the year.

To me The Holidays are much about having light at a time of year that up north would otherwise be very dark and even depressive. When I am in Copenhagen I live on a cosy square called Gråbrødretorv (Grey Friars Market). In summertime the square is filled with outdoor seating. Not so much in the winter. But then there is a fir tree with lights on.

[Photo: The fir tree on Gråbrødretorv]

Anyway, there is lots of stuff in the x-mas theme that you can relate to data quality, as some of the older posts on this blog have shown.

The “Fit for Purpose” Trap

Gartner (the analyst firm), represented by Saul Judah, takes data quality back to basics in the recent post called Data Quality Improvement.

While I agree with the sentiment around measuring the facts as expressed in the post, I am cautious about relying on everything being good when data are fit for the purpose of business operations.

Some clues lie in the data quality dimensions mentioned in the post:

Accuracy (for now):

As said in the Gartner post, data are indeed temporal. The real world changes and so do business operations. By the time you have got your data fit for the purpose of use, the business operations have changed. And by the time you have got your data re-fit for the new purpose of use, the business operations have changed again.

Furthermore, most organizations can’t take all business operations into account at the same time. If you go down the fit-for-purpose track you will typically address a single business objective and make data fit for that purpose. Not least when dealing with master data, there are many business objectives and derived purposes of use. In my experience that leads to this conclusion:

“While we value that data are of high quality if they are fit for the intended use we value more that data correctly represent the real-world construct to which they refer in order to be fit for current and future multiple purposes”

Existence – an aspect of completeness:

The Gartner post mentions a data quality dimension called existence. I tend to see this as an aspect of the more broadly used term completeness.

For example, achieving fit-for-purpose completeness of product master data has been a huge challenge for many organizations within retail and distribution during recent years, as explained in the post Customer Friendly Product Master Data.
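
A minimal sketch of the distinction, with illustrative names: existence as whether an attribute is populated at all for a given record, and completeness as the population rate across a data set:

```python
def exists(record: dict, attribute: str) -> bool:
    """Existence: is the attribute populated at all for this record?"""
    return record.get(attribute) not in (None, "")

def completeness_rate(records: list, attribute: str) -> float:
    """Completeness: the population rate of the attribute across a data set."""
    if not records:
        return 0.0
    return sum(exists(r, attribute) for r in records) / len(records)
```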

Omni-purpose MDM

The terms omni-channel banking and omni-channel retailing are becoming popular within businesses these days.

In this context omni (meaning all) is considered to be something more advanced than multi (meaning many) as in multi-channel retailing.

Data management, including Master Data Management (MDM), is always a bit behind the newest business trends. In our discipline we have hardly even entered the multi stage yet.

Some moons ago I wrote about multi-channel data matching on the Informatica Perspectives blog in the post Five Future Data Matching Trends. Today, on the same blog, Stephan Zoder has the post asking: Is your social media investment hampered by your “data poverty”?

Herein Stephan examines the possible benefits of multi-channel data matching based on a business case within the gambling industry.

Using omni in relation to MDM was seen in a vendor presentation at the Gartner MDM Summit in London last week as reported in the post Slicing the MDM Space. Omnidomain MDM was the proposed term here.

The end goal should probably be something that could be coined as omni-purpose MDM. This will be about advancing MDM capabilities to cover multiple domains and embrace multiple channels in order to obtain a single view of every core entity that can be used in every business process.
