The Master Data Management (MDM) discipline belongs in the backbone of digitalization and enterprise architecture, and therefore new ways of doing things always have a hard time in this realm. For sure, there has been talk about big data and MDM for years, but actual implementations are few compared to ongoing traditional system-of-record implementations. The same will be the case with Artificial Intelligence (AI) and MDM. We will still see a lot of clerking around MDM for years.
So, I am stretching it far by working with yet another must-do thing for MDM (besides working with MDM, big data and AI).
But I have no doubt that the shareconomy (or sharing economy) will affect the way we work with MDM in the future. A few others are on the same path, for example the Swiss consultancy CDQ, as presented on their page about Shareconomy for Customer and Supplier Data and The Corporate Data League (CDL).
Doing Master Data Management (MDM) enterprise-wide is hard enough. The ability to control master data across your organization is essential to enable digitalization initiatives and ensure the competitiveness of your organization in the future.
But it does not stop there. Increasingly, every organization will be an integrated part of a business ecosystem where collaboration with business partners and through marketplaces will be a part of digitalization, and thus we will need to work on the same foundation around master data.
This new aspect of MDM is also called multienterprise MDM. It will take years to become widespread. But you had better start thinking about how this will be a part of your MDM strategy, because in the long run you must share or be left out of business.
In software architecture, publish–subscribe is a messaging pattern where senders of messages, called publishers, do not program the messages to be sent directly to specific receivers, called subscribers, but instead categorize published messages into classes without knowledge of which subscribers, if any, there may be. Similarly, subscribers express interest in one or more classes and only receive messages that are of interest, without knowledge of which publishers, if any, there are.
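The decoupling described above can be illustrated with a minimal publish-subscribe sketch in Python (the class and method names here are my own illustration, not from any specific messaging library):

```python
from collections import defaultdict

class Broker:
    """Routes messages by topic; publishers and subscribers never know each other."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        # A subscriber expresses interest in a class of messages (a topic)
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The publisher only names a topic, not any specific receiver
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("product-updates", received.append)
broker.publish("product-updates", {"sku": "A-100", "price": 9.95})
broker.publish("order-events", {"order": 42})  # no subscriber, silently dropped
print(received)  # [{'sku': 'A-100', 'price': 9.95}]
```

Note that neither side holds a reference to the other; the broker's topic table is the only coupling, which is exactly what makes the pattern attractive for loosely connected parties.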
This kind of thinking is behind the service called Product Data Lake I am working with now. Whereas a publish-subscribe service is usually something that goes on behind the firewall of an enterprise, Product Data Lake takes this theme into the business ecosystem that exists between trading partners as told in the post Product Data Syndication Freedom.
Therefore, one modification to the publish-subscribe concept in this context is that we actually do make it possible for publishers of product information and subscribers of product information to care a little about who provides and who receives the messages, as exemplified in the post Using a Business Entity Identifier from Day One. However, the scheme for that is a modern one resembling a social network, where partnerships are requested and accepted/rejected.
As messages between global trading partners can be highly asynchronous, and as the taxonomies in use will often differ, there is a storage part in between. How this is implemented is examined in the post Product Data Lake Behind the Scenes.
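Putting the two modifications together, explicit partnerships and intermediate storage for asynchronous exchange, a rough sketch could look like this (all names are my own illustration, not the actual Product Data Lake implementation):

```python
from collections import defaultdict

class ProductDataLakeSketch:
    """Pub-sub with two twists: messages flow only between accepted
    trading partners, and they are stored until the subscriber pulls
    them, so the exchange can be fully asynchronous."""
    def __init__(self):
        self.partnerships = set()       # accepted (publisher, subscriber) pairs
        self.pending = set()            # requested but not yet accepted
        self.store = defaultdict(list)  # subscriber -> stored messages

    def request_partnership(self, publisher, subscriber):
        self.pending.add((publisher, subscriber))

    def accept_partnership(self, publisher, subscriber):
        if (publisher, subscriber) in self.pending:
            self.pending.discard((publisher, subscriber))
            self.partnerships.add((publisher, subscriber))

    def publish(self, publisher, message):
        # Stored rather than pushed: each subscriber fetches at its own pace
        for pub, sub in self.partnerships:
            if pub == publisher:
                self.store[sub].append(message)

    def fetch(self, subscriber):
        messages, self.store[subscriber] = self.store[subscriber], []
        return messages

lake = ProductDataLakeSketch()
lake.request_partnership("ManufacturerA", "MerchantB")
lake.accept_partnership("ManufacturerA", "MerchantB")
lake.publish("ManufacturerA", {"gtin": "05701234567890", "name": "Widget"})
lake.publish("StrangerC", {"gtin": "000"})  # no partnership, goes nowhere
fetched = lake.fetch("MerchantB")
print(fetched)  # [{'gtin': '05701234567890', 'name': 'Widget'}]
```

A real implementation would of course also translate between the taxonomies of the two sides at fetch time; the sketch only shows the partnership gate and the store-and-pull flow.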
The term narcissism originates from Greek mythology, where the young Narcissus fell in love with his own image reflected in a pool of water. While this is about how a natural person may behave it can certainly also be applied to how a company behaves.
Not showing empathy towards customers
I think we all know the classic sales presentation with endless slides about how big and wonderful the selling company is and how fantastic the products they sell are. This approach contradicts everything we know about selling, which is to start with the needs and pain points at the buying company and then show how the selling company can effectively fulfill those needs and make the pain points go away.
Not showing empathy towards trading partners
While business outcomes originate from selling to your customers, they are certainly also affected by how you treat your trading partners and how well you can put yourself in their place.
An example close to me is the exchange of product information (product data syndication) between trading partners. We often see solutions that are made to be easy for you but difficult for your trading partner. This includes requiring your spreadsheet format to be filled out by your trading partner, maybe via a customer data portal set up by a manufacturer or, conversely, a supplier data portal set up by a merchant. These are narcissistic dead ends, as told in the post The Death Trap in Product Information Management: Your Customer/Supplier Portal.
A couple of weeks ago Microsoft, Adobe and SAP announced their Open Data Initiative. While this, as far as we know, is only a statement for now, it has of course attracted some interest because three giants in the IT industry have agreed on something – mostly interpreted as agreeing to oppose Salesforce.com.
Forming a business ecosystem among players in the market is not new. However, what we usually see is that a group of companies agrees on a standard and then each one of them puts a product or service, that adheres to that standard, on the market. The standard then caters for the interoperability between the products and services.
In this case it seems to be something different. The product or service is operated by Microsoft on their Azure platform. There will be some form of a common data model. But it is a data lake, meaning that we should expect that data can be provided in any structure and format and consumed into any structure and format.
In all humbleness, this concept is the same as the one that is behind Product Data Lake.
The Open Data Initiative from Microsoft, Adobe and SAP focuses on customer data and seems to be about enterprise-wide customer data. While it could technically also support ecosystem-wide customer data, privacy concerns and compliance issues will restrict that scope in many cases.
At Product Data Lake, we do the same for product data. Only here, the scope is business ecosystem wide as the big pain with product data is the flow between trading partners as examined here.
20 years ago, when I started working as a contractor and entrepreneur in the data management space, data was not on the top agenda at many enterprises. Fortunately, that has changed.
An example is displayed by Schneider Electric CEO Jean-Pascal Tricoire in his recent blog post on how digitization and data can enable companies to be more sustainable. You can read it on the Schneider Electric Blog in the post 3 Myths About Sustainability and Business.
Manufacturers in the building material sector naturally emphasize sustainability. In his post, Jean-Pascal Tricoire says: “The digital revolution helps answering several of the major sustainability challenges, dispelling some of the lingering myths regarding sustainability and business growth”.
One of three myths dispelled is: Sustainability data is still too costly and time-consuming to manage.
From my work with Master Data Management (MDM) and Product Information Management (PIM) at manufacturers and merchants in the building material sector I know that managing the basic product data, trading data and customer self-service ready product data is hard enough. Taking on sustainability data will only make that harder. So, we need to be smarter in our product data management. Smart and sustainable homes and smart sustainable cities need smart product data management.
Enterprises are increasingly going to be part of business ecosystems where collaboration between legal entities not belonging to the same company family tree will be the norm.
This trend is driven by digital transformation, as no enterprise can possibly master all the disciplines needed to apply a digital platform to traditional ways of doing business.
Enterprises are basically selfish. This is also true when it comes to Master Data Management (MDM). Most master data initiatives today revolve around aligning internal silos of master data and surrounding processes to fit the business objectives of the enterprise as a whole. And that is hard enough.
However, in the future that will not be enough. You must also be able to share master data in the business ecosystems to which your enterprise will belong. The enterprises that, in a broad sense, get this first will survive. Those who lag behind are in danger of being left out of business.
When deploying Product Information Management (PIM) solutions, there is a tendency to want to add a portal for your trading partners:
If you are a manufacturer, you could have a customer portal where your downstream re-sellers can fetch the nicely arranged product information that is the result of your PIM implementation.
If you are a merchant, you could have a supplier portal where your upstream suppliers can deliver their information nicely arranged according to your product information standards in your PIM implementation.
This is a death trap for both manufacturers and merchants, because:
As a trading manufacturer and merchant, you probably follow different standards, so one must yield to the other. The result is that one side will have a lot of manual and costly work to do to obey the stronger trading partner. Only a few will be the strongest every time.
If all manufacturers have a customer portal and all merchants have a supplier portal everyone will be waiting for the other and no product information will flow in the supply chains.
Here, Frank asks this question: Do ecosystems represent an opportunity to establish non-traditional revenue streams (e.g. monetizing data)?
I think so. One example very close to me is how merchants, shippers and manufacturers can work closely together in not only moving the goods between them in an efficient way, but also moving the product information between them in the most efficient way.
There are three kinds of data monetization: Selling data, wrapping data around products and utilizing advanced analytics leading to fast operational decision making. These options were examined in the post Three Flavors of Data Monetization.
If we look at the middle option, wrapping data around products, and narrow it down to wrapping data around tangible products, there are some ways to execute that for supply chain participants, not least if the participating business entities embrace the business ecosystem through which the goods move:
Manufacturers need to streamline the handling of product information internally. This includes disciplines such as PLM (Product Lifecycle Management) and PIM (Product Information Management). On top of that, manufacturers need to be effective in the way product information is forwarded to direct customers, distributors/wholesalers and merchants, as exemplified in the post How Manufacturers of Building Materials Can Improve Product Information Efficiency.
Merchants need to utilize the best way of getting data into in-house PIM (Product Information Management) solutions or other kinds of solutions where data flows in from trading partners. Many merchants have a huge variety of product information needs, as told in the post Work Clothes versus Fashion: A Product Information Perspective. On top of that, a merchant will have supplying manufacturers and distributors with varying formats and capabilities for offering product information, as discussed in the post PIM Supplier Portals: Are They Good or Bad?.