Is blockchain technology useful within MDM?

This question was raised on this blog back in January this year in the post Tough Questions About MDM.

Since then, the term blockchain has been used more and more, both in general and in relation to Master Data Management (MDM). As you know, we love fancy new terms in our otherwise boring industry.

However, there are good reasons to consider using the blockchain approach when it comes to master data. A blockchain approach can be described as decentralized consensus, which can be seen as the opposite of a centralized registry. After the MDM discipline has been around for more than a decade, most practitioners agree that a single source of truth is not practically achievable within an organization of a certain size. Moreover, in the age of business ecosystems, it will be even harder to achieve between trading partners.
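To make the decentralized consensus idea concrete, here is a minimal, hypothetical sketch of how master data changes could be chained together with hashes, so that every trading partner can verify the shared history. The record fields (GTIN, attribute names) are made up for illustration; this is not actual Product Data Lake code:

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a master data change together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

# A tiny chain of product master data changes (illustrative fields)
chain = []
prev = "0" * 64  # genesis hash
for change in [
    {"gtin": "05712345678900", "attribute": "colour", "value": "blue"},
    {"gtin": "05712345678900", "attribute": "weight_g", "value": 450},
]:
    h = block_hash(change, prev)
    chain.append({"record": change, "prev": prev, "hash": h})
    prev = h

# Any partner can recompute the hashes to verify nothing was altered
assert all(b["hash"] == block_hash(b["record"], b["prev"]) for b in chain)
```

The point is that no single party owns the registry: each party holds a copy of the chain and can detect tampering by recomputing the hashes.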

This way of thinking is the backbone of the MDM venture called Product Data Lake, which I am working on right now. Yes, we love buzzwords. As if cloud computing, social network thinking, big data architecture and preparing for the Internet of Things weren't enough, we can add the blockchain approach as a predicate too.

In Product Data Lake this approach is used to establish consensus about the information and digital assets related to a given product and, where it makes sense, each instance of that product (physical asset or thing). If you are interested in how that develops, why not follow Product Data Lake on LinkedIn?


Adding Things to Product Data Lake

Product Data Lake went live last month. Nevertheless, we are already planning the next big things in this cloud service for sharing product data. One of them is, quite literally, things. Let me explain.

Product data is usually data about a product model: for example, a certain brand and model of a pair of jeans, a certain brand and model of a drilling machine or a certain brand and model of a refrigerator. Handling product data on the model level within business ecosystems is hard enough and was the initial raison d'être of Product Data Lake.


However, we are increasingly required to handle data about each instance of a product model. Some use cases I have come across are:

  • Serialization, which is the numbering and tracking of each physical product. We know it from having a serial number on our laptops, and another example is how medicine packs will now be required to be serialized to prevent fraud, as described in the post Spectre vs James Bond and the Unique Product Identifier.
  • Asset management. Asset is a kind of fourth domain in Master Data Management (MDM) besides party, product and location, as touched upon in the post Where is the Asset. Gartner, the analyst firm, also usually classifies product and asset together as thing, as opposed to party, in theory (and soon also in practice in their magic quadrants). Anyway, in asset management you handle each physical instance of the product model.
  • Internet of Things (IoT) is, according to Wikipedia, the internetworking of physical devices, vehicles (also referred to as “connected devices” and “smart devices”), buildings and other items—embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data.
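The model-versus-instance distinction running through the use cases above can be sketched roughly as two data structures. The class and field names below are invented for illustration and not part of the Product Data Lake design:

```python
from dataclasses import dataclass

@dataclass
class ProductModel:
    """Data about a product model, e.g. a brand and model of drilling machine."""
    brand: str
    model: str
    description: str

@dataclass
class ProductInstance:
    """Data about one physical thing: a serialized instance of a model."""
    model_ref: ProductModel
    serial_number: str
    location: str  # useful for asset management and IoT scenarios

drill = ProductModel("AcmeTools", "DX-500", "Cordless drilling machine")
unit = ProductInstance(drill, serial_number="SN-000123", location="Warehouse 7")
assert unit.model_ref.brand == "AcmeTools"
```

Serialization, asset management and IoT all operate on the instance level, while traditional product information management stops at the model level.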

Fulfilling the promise of IoT, and the connected term Industry 4.0, certainly requires commonly understood master data, from the product model through serialization and asset management, as reported in the post Data Quality 3.0 as a stepping-stone on the path to Industry 4.0.


Approaches to Sharing Product Information in Business Ecosystems

One of the most promising aspects of digitalization is sharing information in business ecosystems. In the Master Data Management (MDM) realm, we will, in my eyes, see a dramatic increase in sharing product information between trading partners, as touched upon in the post Data Quality 3.0 as a stepping-stone on the path to Industry 4.0.

Standardization (or standardisation)

A challenge in doing that is how we link the different ways of handling product information within each organization in a business ecosystem. While everyone agrees that a common standard is the best answer, we must on the other hand accept that using a common standard for every kind of product and every piece of information needed is quite utopian. We do not even have a single, uniquely spelled term for it in English.

Also, we must foresee that one organization will mature at a different pace than another organisation in the same business ecosystem.

Product Data Lake

These observations are the reasons behind the launch of Product Data Lake. In Product Data Lake we encompass the use of (in prioritized order):

  • The same standard in the same version
  • The same standard in different versions
  • Different standards
  • No standards

In order to link the product information and the formats and structures at two trading partners, we support the following approaches:

  • Automation based on product information tagged with a standard as explained in the post Connecting Product Information.
  • Ambassadorship, which is a role taken by a product information professional, who collaborates with the upstream and downstream trading partner in linking the product information. Read more about becoming a Product Data Lake ambassador here.
  • Upstream responsibility. Here the upstream trading partner makes the linking in Product Data Lake.
  • Downstream responsibility. Here the downstream trading partner makes the linking in Product Data Lake.
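Whichever of the approaches above is responsible, the linking itself can be pictured as a mapping between the upstream partner's attribute names and the downstream partner's names. The attribute names below are invented for illustration only:

```python
# Hypothetical upstream (manufacturer) record in its in-house structure
upstream = {"ItemColour": "blue", "NetWeightKg": 0.45, "ShortText": "Cordless drill"}

# Mapping agreed via automation, an ambassador, or one of the trading partners
attribute_map = {
    "ItemColour": "color",
    "NetWeightKg": "weight_kg",
    "ShortText": "description",
}

# Translate into the downstream (retailer) structure
downstream = {attribute_map[k]: v for k, v in upstream.items() if k in attribute_map}
assert downstream["color"] == "blue"
```

The hard work lies in agreeing on the mapping, not in applying it; that is exactly where the automation, ambassador, upstream and downstream responsibility options come into play.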

Data Governance

Regardless of the mix of the above approaches, you will need a cross-company data governance framework to control the standards used and the rules that apply to the exchange of product information with your trading partners. Product Data Lake has established a partnership with one of the most recommended authorities in data governance: Nicola Askham – the Data Governance Coach.

For a quick overview please have a look at the Cross Company Data Governance Framework.

Please request more information here.


Sign Up is Open

Over the past one and a half years, many of the posts on this blog have been about Product Data Lake, a cloud service for sharing product data in the business ecosystems of manufacturers, distributors, retailers and end users of product information.

From my work as a data quality and Master Data Management (MDM) consultant, I have seen the need for a service to solve data quality issues when it comes to product master data. My observation has been that the root cause of these issues is found in the way that trading partners exchange product information and digital assets.

It is the aim of Product Data Lake to ensure:

  • Completeness of product information by enabling trading partners to exchange product data in a uniform way
  • Timeliness of product information by connecting trading partners in a process driven way
  • Conformity of product information by encompassing various international standards for product information
  • Consistency of product information by allowing upstream and downstream trading partners to interact using their in-house structures of product information
  • Accuracy of product information by ensuring transparency of product information across the supply chain.

You can learn more about how Product Data Lake works on the documentation site.


Sign Up is open on www.productdatalake.com


Data Quality 3.0 as a stepping-stone on the path to Industry 4.0

The title of this blog post is a topic of my international keynote at the Stammdaten Management Forum 2016 in Düsseldorf, Germany on the 8th of November 2016. You can see the agenda for this conference, which starts on the 7th and ends on the 9th, here.

Data Quality 3.0 is a term I have used over the years here on the blog to describe how I see data quality, along with other disciplines within data management, changing. This change is about going from focusing on internal data stores, and cleansing within them, to focusing on external sharing of data, using your business ecosystem and third-party data to drastically speed up data quality improvement.

Industry 4.0 is the current trend of automation and data exchange in manufacturing technologies. When we talk about big data, most will agree that success with big data exploitation hinges on proper data quality within master data management. In my eyes, the same can be said about success with Industry 4.0. The data exchange that is the foundation of automation must be secured by commonly understood master data.

So this is the promising way forward: by using data exchange in business ecosystems, you improve the data quality of master data. This improved master data then ensures successful data exchange within Industry 4.0.


Emerging Database Technologies for Master Data

The MDM Landscape Q2 2016 from Information Difference is out. MDM vendors usually celebrate these yearly analyst reports with tweets and posts about their prominent positions, like Informatica trailed by Stibo Systems for being in the top right corner, and Agility Multichannel closely followed by Orchestra Networks for having the happiest customers.

But the market analysis and the trends observed are good stuff as well.

This year I noticed a trend in the underlying technology used by MDM vendors to store master data. The report says: “Some vendors have also decided to cut their ties with the relational database platform that has traditionally been the core storage mechanism for master data. Certain types of analysis e.g. of relationships between data, can be well handled by other types of emerging databases, such as graph databases like Neo4J and NoSQL databases like MongoDB. One vendor has recently switched its underlying platform entirely away from relational, and others have similar plans.”

While we usually see graph databases and NoSQL databases as something to use for analytical purposes, the trend of moving master data platforms to these technologies implies that operational use will be based on them too.

This is close to me, as the master data service I am working on right now stores data for operational purposes in MongoDB (in the cloud).
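One reason document stores fit product master data is that attribute sets vary wildly by product category. A small sketch (plain Python dicts standing in for documents in a single MongoDB collection, not actual code from the service) of two product records with different shapes:

```python
# Two product master records with different attribute sets, as they could be
# stored side by side in one MongoDB collection (shown as plain Python dicts)
jeans = {
    "_id": "sku-1001",
    "category": "apparel",
    "attributes": {"waist": 32, "length": 34, "fabric": "denim"},
}
fridge = {
    "_id": "sku-2001",
    "category": "appliances",
    "attributes": {"volume_l": 300, "energy_class": "A++"},
}

# A relational schema would need nullable columns or one table per category;
# a document store lets each record carry only the attributes it needs.
catalog = {doc["_id"]: doc for doc in [jeans, fridge]}
assert set(catalog["sku-1001"]["attributes"]) != set(catalog["sku-2001"]["attributes"])
```

The same flexibility that makes document stores attractive for analytics also serves operational master data, where new product categories arrive with attribute sets nobody modelled up front.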


Choosing the Best Term to Use in MDM

Right now I am working on an MDM (Master Data Management) service for sharing product data in the business ecosystems of manufacturers, distributors, retailers and end users of product information.

One of the challenges in putting such a service to the market is choosing the best term for the entities handled by the service.

Below is the current selection with the chosen term and some recognized alternate terms used frequently and found in the various standards that exist for exchanging product data:

[Table: chosen terms and recognized alternate terms]

Please comment if you think there are other English (or variant English) terms that deserve to be in here.

Multi-Domain MDM 360 and an Intelligent Data Lake

This week I had the pleasure of being at the Informatica MDM 360 event in Paris. The “360” predicate is all over the Informatica communication. There are the MDM 360 events around the world. The Product 360 solution – the new wrap of the old Heiler PIM solution, as I understand it. The Supplier 360 solution. Some Customer 360 stuff, including the Cloud Customer 360 for Salesforce edition.

All these solutions constitute one of the leading Multi-Domain MDM offerings on the market – if not the leading one. We will be wiser on that question when Gartner (the analyst firm) publishes their first Multi-Domain MDM Magic Quadrant later this year, as reported in the post Gravitational Waves in the MDM World.

Until now, Informatica has been very well positioned for Customer MDM, but not among the leaders for Product MDM in Gartner's ranking. Other analysts, such as Information Difference, have Informatica in the top right corner of the (Multi-Domain) MDM landscape, as seen here.

MDM and big data is another focus area for Informatica, and Informatica has certainly been one of the first MDM vendors to embrace big data – and not just with wording in marketing. Today we cannot say big data without saying data lake. Informatica names their offering the Intelligent Data Lake.

For me, it will be interesting to see how Informatica can take full Multi-Domain MDM leadership by combining a good Product MDM solution with an Intelligent Data Lake.
