How do you exchange product data with your trading partners today? At the Product Data Lake we would like to know more about that. We expect that many still send emails with spreadsheets and digital assets, but please tell us how it is with you. Take the survey by clicking here.
Also, please comment on this blog post about your plans, or share your experiences if you work with Product Information Management (PIM) as a service provider.
Business ecosystems are an important concept of the digital age. The father of the business ecosystem concept, James F. Moore, defined a business ecosystem as:
“An economic community supported by a foundation of interacting organizations and individuals—the organisms of the business world. The economic community produces goods and services of value to customers, who are themselves members of the ecosystem. The member organisms also include suppliers, lead producers, competitors, and other stakeholders”.
The problem with data management methodologies and tools today, as I see it, is that they emphasize the needs inside the corporate walls of a single company without much attention to the fact that every single company is a member of one or several business ecosystems, as examined in the post called MDM and SCM: Inside and outside the corporate walls.
Opening your data management, including your Master Data Management (MDM), up to the outside is scary business, as the ecosystems often will include your competitors as well, as mentioned in the post Toilet Seats and Data Quality.
Nevertheless, if you want your company to survive in the digital age by building up your company’s digitalization effort, you have to extend your data management strategy to encompass the business ecosystems where you are a member.
The Product Data Lake is a cloud service for sharing product data in the ecosystems of manufacturers, distributors, retailers and end users of product information.
As an upstream provider of product data, whether a manufacturer or an upstream distributor, you have these requirements:
When you introduce new products to the market, you want to make the related product data and digital assets available to your downstream partners in a uniform way
When you win a new downstream partner, you want the means to immediately and professionally provide product data and digital assets for the agreed range
When you add new products to an existing agreement with a downstream partner, you want to be able to provide product data and digital assets instantly and effortlessly
When you update your product data and related digital assets, you want a fast and seamless way of pushing them to your downstream partners
When you introduce a new product data attribute or digital asset type, you want a fast and seamless way of pushing it to your downstream partners.
The Product Data Lake facilitates these requirements by letting you push your product data into the lake in your in-house structure, which may or may not be fully or partly compliant with an international standard.
As an upstream provider, you may want to push product data and digital assets from several different internal sources.
The Product Data Lake tackles this requirement by letting you operate several upload profiles.
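The idea of pushing data as-is from several internal sources, each under its own upload profile, can be sketched as follows. This is a minimal illustration only: the class and field names are my own assumptions, not the actual Product Data Lake API.

```python
# Hypothetical sketch: an upstream provider pushing product data to a
# shared lake in its own in-house structure, one upload profile per
# internal source. All names here are illustrative, not the PDL API.

class UploadProfile:
    """Describes one internal source and the structure it pushes in."""
    def __init__(self, name, fields):
        self.name = name          # e.g. an ERP item feed or a DAM feed
        self.fields = fields      # the in-house attribute names

    def push(self, lake, records):
        # Records are stored as-is, tagged with the profile they came
        # from; no up-front conversion to a common standard is required.
        for record in records:
            lake.append({"profile": self.name, "data": record})

lake = []  # stands in for the shared exchange zone

erp = UploadProfile("erp_items", ["item_no", "descr", "weight_kg"])
dam = UploadProfile("dam_assets", ["item_no", "asset_url", "asset_type"])

erp.push(lake, [{"item_no": "A100", "descr": "Hex bolt", "weight_kg": 0.02}])
dam.push(lake, [{"item_no": "A100", "asset_url": "a100.png", "asset_type": "image"}])
```

The point of the sketch is that each source keeps its own structure; the downstream receiver decides later how to interpret each profile.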
As a downstream receiver of product data, being a downstream distributor, retailer or end user, you have these requirements:
When you engage with a new upstream partner, you want the means to quickly and seamlessly link and transform product data and digital assets for the agreed range from the upstream partner
When you add new products to an existing agreement with an upstream partner, you want to be able to link and transform product data and digital assets in a fast and seamless way
When your upstream partners update their product data and related digital assets, you want to be able to receive the updated product data and digital assets instantly and effortlessly
When you introduce a new product data attribute or digital asset type, you want a fast and seamless way of pulling it from your upstream partners
If you have a backlog of product data and digital asset collection with your upstream partners, you want a fast and cost-effective approach to backfill the gap.
The Product Data Lake facilitates these requirements by letting you pull your product data from the lake in your in-house structure, which may or may not be fully or partly compliant with an international standard.
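The linking and transformation described above amounts to mapping the upstream partner's attribute names onto your own. A minimal sketch, with attribute names invented for illustration:

```python
# Hypothetical sketch: a downstream receiver transforming an upstream
# record into its own in-house structure via a field mapping agreed
# once per partner. The attribute names are illustrative assumptions.

field_map = {             # upstream attribute -> our in-house attribute
    "item_no": "sku",
    "descr": "product_name",
    "weight_kg": "net_weight",
}

def transform(upstream_record, mapping):
    """Rename the attributes we have mapped; values pass through untouched."""
    return {mapping[k]: v for k, v in upstream_record.items() if k in mapping}

incoming = {"item_no": "A100", "descr": "Hex bolt", "weight_kg": 0.02}
received = transform(incoming, field_map)
# received == {"sku": "A100", "product_name": "Hex bolt", "net_weight": 0.02}
```

In practice the transformation would also cover value tables and units of measure, but the shape of the work is the same: a mapping per upstream partner, applied to every record pulled.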
In the Product Data Lake, you can take the role of being an upstream provider and a downstream receiver at the same time by being a midstream subscriber to the Product Data Lake. Thus, Product Data Lake covers the whole supply chain from manufacturing to retail and even the requirements of B2B (Business-to-Business) end users.
The Product Data Lake uses the data lake concept from big data by letting the transformation and linking of data between many structures be done when data are consumed for the first time. The goal is that the workload in this system resembles an iceberg, where 10 % of the ice is above water and 90 % is below. In the Product Data Lake, manually setting up the links and transformation rules should be 10 % of the duty, while the remaining 90 % is automated in the exchange zones between trading partners.
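This schema-on-read iceberg can be illustrated in a few lines: rules are supplied manually the first time a receiver consumes data, and every later exchange reuses them automatically. The function and rule names are my own, for illustration only.

```python
# Illustrative sketch of schema-on-read: records sit raw in the lake,
# and the transformation rules for a receiver are supplied once (the
# manual ~10 %) and then cached and reused (the automated ~90 %).

raw_lake = [{"descr": "Hex bolt"}, {"descr": "Wing nut"}]
cached_rules = {}

def consume(receiver, records, rules_if_first_time=None):
    if receiver not in cached_rules:
        # First consumption: the manual setup of links and rules
        cached_rules[receiver] = rules_if_first_time
    rules = cached_rules[receiver]
    # Every consumption: automated application of the stored rules
    return [{rules.get(k, k): v for k, v in r.items()} for r in records]

first = consume("nordic_retail", raw_lake, {"descr": "product_name"})
again = consume("nordic_retail", raw_lake)   # no rules needed this time
```

The second call needs no manual input at all, which is exactly the 90 % below the waterline.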
When following the articles, blog posts and other inspirational stuff in the data management realm you frequently stumble upon sayings about a unique angle towards what it is all about, like:
It is all about people, meaning that if you can change and control the attitude of the people involved in data management, everything will be just fine. The problem is that people have been around for thousands of years and we have not nailed that one yet – and probably will not do so in isolation within the data management realm. But sure, a lot of consultancy fees will still go down that drain.
It is all about processes. Yes, it is. The only problem is that processes are dependent on people and technology.
It is all about technology. Well, no one actually says so. However, relying on that sentiment – and that does happen – is a frequent reason why data management initiatives go wrong.
The trick is to find a balance between a worthwhile people-focused approach, a heartfelt process way of going forward, and a solid methodology for exploiting technology in the good cause of better data management, all aligned with achieving business benefits.
The article revolves around getting your data more fit. Notably, it is not about getting data fit for a known purpose of use, which is the thinking that has been around in the data and information quality realm for years. It is about having the data that makes you able to quickly adjust business strategies to meet changing customer needs.
A week ago I had the pleasure of hosting a workshop on the linkage between Business Process Management (BPM) and Master Data Management (MDM) at the Marcus Evans MDM conference in Barcelona, Spain. One of the solutions we referred to many times was to establish a common reporting approach across BPM and MDM grounded on the sentiment that you can’t manage what you can’t measure.
Setting improved agility as a goal for a master data programme is an additional approach. I am working on such a programme right now. Our executive sponsor actually wanted selling more stuff to be the goal. My promise is that the improved master data agility will lead to improved business agility that will lead to being able to sell more stuff in the future.
The linkage between Master Data Management (MDM) and Business Process Management (BPM) was intensively discussed at a workshop at an MDM conference organized by Marcus Evans in Barcelona, Spain today. More than 30 master data professionals from a range of large, mainly European, companies attended the workshop.
There was broad agreement that the intersection between MDM and BPM is growing – and should be.
One of the challenges identified is that MDM tends to be global within the enterprise while BPM tends to be local.
The global versus local theme has frequently been mentioned as a challenge over the decade MDM has existed as a discipline. The core MDM global versus local challenges span common definitions, common value tables and common data models across different geographies. Having a mix of common business rules and business rules that have to be local adds to the difficulties. When applying the full impact of business process management, with its variety of formal and informal organizational structures, decision rules and working cultures, there are certainly both wins and obstacles in linking MDM and BPM.
I think the commonly used phrase about thinking globally and acting locally makes sense in the intersection between MDM and BPM. Thinking big and starting small helps too.
Being able to react to market changes in an agile way is the path to the survival of your business today. As you may not nail it in the first go, the ability to correct with continuous improvement is the path for your business to stay alive.
Doing business process improvement most often involves master data as examined in the post Master Data and Business Processes. The people side of this is challenging. The technology side isn’t a walkover either.
When looking at Master Data Management (MDM) platforms in sales presentations it seems very easy to configure a new way of orchestrating a business process. You just drag and drop some states and transitions in a visual workflow manager. In reality, even when solely looking at the technical side, it is much more painful.
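What the drag-and-drop demo boils down to under the hood is a small state machine: named states and the transitions allowed between them. A minimal sketch, with state and action names invented for illustration:

```python
# Sketch of the workflow that a visual workflow manager configures:
# states, actions, and which transitions are allowed. The states and
# actions are illustrative assumptions, not any specific MDM product.

transitions = {
    ("draft", "submit"): "in_review",
    ("in_review", "approve"): "approved",
    ("in_review", "reject"): "draft",
}

def advance(state, action):
    """Move a master data record to its next workflow state."""
    try:
        return transitions[(state, action)]
    except KeyError:
        raise ValueError(f"action '{action}' not allowed in state '{state}'")

state = advance("draft", "submit")   # a record enters review
state = advance(state, "approve")    # and is approved
```

The diagram itself really is this simple; the pain lies in wiring every state change into existing data models, interfaces and integrations, which is the subject of the next paragraph.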
MDM solutions can be hard to maneuver. You have to consider existing data and the data models where the data sits. Master data is typically used through various interfaces across many business functions and business units, and there are usually many system integrations running around the MDM component in an IT landscape.
A successful MDM implementation does not just cure some pain points in business processes. The solution must also be able to be maneuvered to support business agility and continuous improvement. Some of the data quality and data governance aspects of this are explored in the post Be Prepared.
The intersection of Master Data Management (MDM) and Business Process Management (BPM) is a very interesting aspect of implementing MDM solutions.
We may divide this battleground into three sectors:
Business processes that purely consume master data
Business processes that potentially change master data
Business processes that purely update master data
Business processes that purely consume master data
An example of such a business process is the execution of a direct marketing campaign. Doing this effectively is heavily dependent on clean and updated master data. A key capability is the ability to separate which targeted real-world entities belong to the so-called “new market” and which are existing customers (or prospects or churned customers). When working with known customers, the ability to intelligently relate to previously purchased products and their categories of interest is paramount. Often, knowing the right relation between targeted parties and locations is very valuable.
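The capability of splitting a target list into the “new market” and known parties can be sketched as a match against customer master data. The normalized name-plus-postal-code key below is a deliberate simplification; real identity resolution is far more sophisticated.

```python
# Hedged sketch of campaign segmentation: which targeted parties are
# already in our customer master data, and which belong to the "new
# market"? Matching on name + postal code is a naive illustration.

customers = {("acme corp", "1000"), ("nordic retail", "2100")}

def match_key(name, postal_code):
    """Normalize a party into a crude comparison key."""
    return (name.strip().lower(), postal_code.strip())

targets = [("Acme Corp", "1000"), ("New Venture Ltd", "3000")]

known = [t for t in targets if match_key(*t) in customers]
new_market = [t for t in targets if match_key(*t) not in customers]
```

The quality of this split depends entirely on how clean and updated the customer master data is, which is the point made above.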
When doing MDM implementations and ongoing refinement, insight into how master data are used and create value in business processes is the starting point.
Business processes that potentially change master data
The most commonly mentioned wide business process is the order-to-cash process. During that process, customer master data especially may be affected. A key question is whether the order is placed by a new customer or a known customer. If it truly is a new customer, then effective collection of accurate and timely master data determines the successful outcome of receiving the cash, based on a correct credit check, correct shipping information and more. If it is a known customer, this is a chance to validate and eventually update customer master data.
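The new-versus-known decision at order entry can be sketched like this. Using a VAT number as the lookup key and a flat in-memory index are illustrative assumptions; a real solution would use proper matching and an MDM hub.

```python
# Illustrative sketch of the key question in order-to-cash: is this a
# known customer (validate and eventually update the record) or a truly
# new one (collect master data before credit check and shipping)?

customer_index = {"DK12345678": {"name": "Nordic Retail", "credit_ok": True}}

def handle_order(vat_no, name_on_order):
    record = customer_index.get(vat_no)
    if record is None:
        # New customer: collection of accurate, timely master data
        # determines whether we eventually receive the cash
        customer_index[vat_no] = {"name": name_on_order, "credit_ok": None}
        return "new"
    # Known customer: a chance to validate and refresh the record
    if record["name"] != name_on_order:
        record["name"] = name_on_order
    return "known"

first = handle_order("DK12345678", "Nordic Retail A/S")   # known, name refreshed
second = handle_order("NO11111111", "New Venture AS")     # new record created
```

Either branch touches customer master data as a side effect of a process whose main purpose is taking an order, which is exactly what distinguishes this sector.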
While customer master data is often changed through business processes having another main purpose, this is not the case with product master data.
Business processes that purely update master data
An example comes from manufacturing, distribution and retail, where we have business processes with the sole purpose of enriching product master data. With the rise of customer self-service through e-commerce, the data quality requirements for completeness and other data quality dimensions have increased a lot. This makes the orchestration of complex business processes for enriching product master data a whole new flavour of Business Process Management, where master data itself is the outcome – of course in order to be used optimally in order-to-cash and other business processes.
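A completeness check, the data quality dimension singled out above, can be sketched per product category. The required attributes per category are assumptions made up for this illustration.

```python
# Sketch of a completeness check driving a product enrichment process:
# which required attributes are still missing for a category? The
# category and attribute names are illustrative assumptions.

required = {"power_tools": ["name", "voltage", "weight_kg", "image_url"]}

def completeness(record, category):
    """Return (share of required attributes filled, list of gaps)."""
    fields = required[category]
    filled = [f for f in fields if record.get(f) not in (None, "")]
    missing = [f for f in fields if f not in filled]
    return len(filled) / len(fields), missing

score, missing = completeness(
    {"name": "Cordless drill", "voltage": "18V", "weight_kg": 1.3},
    "power_tools",
)
# score == 0.75, and "image_url" is still missing
```

In an enrichment workflow, the missing list is what gets routed to the person or partner responsible for the next task.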
PS: If you are interested in discussing BPM and MDM alignment on La Rambla in Barcelona on the 22nd April 2015, here is the chance.
If you haven’t yet implemented a Master Data Management (MDM) solution, you typically hold master data in dedicated solutions for Supply Chain Management (SCM), Enterprise Resource Planning (ERP), Customer Relationship Management (CRM) and heaps of other solutions aimed at taking care of some part of your business, depending on your particular industry.
In this first stage some master data flows into these solutions from business partners in different ways, flows around between the solutions inside your IT landscape and flows out to business partners directly from the various solutions.
The big pain in this stage is that a given real-world entity may be described very differently when coming in, when used inside your IT landscape and when presented by you to the outside. Additionally, it is hard to measure and improve data quality, and there may be several different business processes doing the same thing in alternative ways.
The answer today is to implement a Master Data Management (MDM) solution. When doing that, you may to some degree rearrange the way master data flows into your IT landscape, move the emphasis on master data management from the SCM, ERP, CRM and other solutions to the MDM platform, orchestrate the internal flows differently, and most often present a given real-world entity in a consistent way to the outside.
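The consistent presentation an MDM hub enables can be sketched as consolidating the different descriptions of the same real-world entity into one golden record. The most-recently-updated-wins survivorship rule below is a deliberately naive choice for illustration.

```python
# Hedged sketch of golden record consolidation: the same real-world
# entity described differently in CRM and ERP, merged attribute by
# attribute. "Newest non-empty value wins" is a naive rule chosen
# purely for illustration; real survivorship rules are richer.

from datetime import date

sources = [
    {"system": "CRM", "updated": date(2015, 3, 1),
     "name": "Acme Corp.", "phone": "+45 1111 1111"},
    {"system": "ERP", "updated": date(2015, 4, 1),
     "name": "ACME Corporation", "phone": None},
]

def golden_record(records, attributes):
    golden = {}
    for attr in attributes:
        candidates = [r for r in records if r.get(attr) is not None]
        candidates.sort(key=lambda r: r["updated"], reverse=True)
        golden[attr] = candidates[0][attr]
    return golden

merged = golden_record(sources, ["name", "phone"])
# merged == {"name": "ACME Corporation", "phone": "+45 1111 1111"}
```

One consolidated record per entity is also what makes data quality measurable: you now have a single version to check against the real world.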
In this second stage, you have cured the pain of inconsistent presentation of a given real-world entity, and as a result you are in a much better position to measure and control data quality. But typically you haven’t gained much in operational efficiency.
You need to enter a third stage. MDM 3.0 so to speak. In this stage you extend your MDM solution to your business partners and take much more advantage of third party data providers.
The master data kept by any organization is to a large degree a description of real-world entities that are also digitalized by business partners and third-party data providers. Therefore, there are huge opportunities for reengineering your business processes for master data collection and for interactive sharing of master data, with mutual benefits for you and your business partners. These opportunities are touched upon in the post MDM 3.0 Musings.
Indeed, while automation is a most wanted outcome of Master Data Management (MDM) implementations and many other IT-enabled initiatives, you should always consider the alternative: eliminating (or simplifying). This often means thinking out of the box.
As an example, I today stumbled upon the Wikipedia explanation of Business Process Mapping. The example used is how to make breakfast (the food part):
You could think about different Business Process Re-engineering opportunities for that process. But you could also realize that this is an English/American breakfast. What about making a French breakfast instead? It would be as simple as: