Plug and Play – The Future for Data

What does the future for data and the need for power when travelling have in common? A lot, as Ken O’Connor explains in today’s guest blog post:

Bob Lambert wrote an excellent article recently summarising the New Direction for Data set out at Enterprise Data World 2017 (#EDW17). As Bob points out, “Those (organisations) that effectively manage data perform far better than organisations that don’t”. A key theme from #EDW17 is for data management professionals to “be positive” and to focus on the business benefits of treating data as an asset. On a related theme, Henrik on this blog has been highlighting the emergence of, and the value to be derived from, business ecosystems and digital platforms.

Building on Bob and Henrik’s ideas, I believe we need a paradigm shift in the way we think and talk about data.  We need to promote the business benefits of data sharing via “Plug and Play Data”.

When we travel, we expect to be able to use our mobile devices anywhere in the world. We do this by using universal adaptors that convert country-specific plug shapes and power levels for us.

We need to apply the same concept to data. To enable data to be more easily reused across and between enterprises, we need to create “plug and play data”.         

How can organisations create “plug and play data”?

In the past, organisations could simply verify that the data they create, capture, ingest and share conforms to the business rules of their own organisation. That silo-based approach is no longer tenable. In today’s world, as Henrik points out, organisations increasingly play a role within a business ecosystem, as part of a data supply chain. Hence they need to exchange data with business partners. To do this, they need to apply a “Data Sharing Concept” within a “Common Data Architecture”, as set out by Michael Brackett in his excellent books “Data Resource Simplexity” and “Data Resource Integration”. Michael describes a “Data Sharing Medium”, which is similar in concept to the universal adaptor above. For data sharing, this involves the organisations within a given business ecosystem agreeing on a “preferred form” for sharing data.

[Figure: Data Sharing]

To quote Michael: “The Common Data Architecture provides a construct for readily sharing data. When the source data are not in the preferred form, the source organisation must translate those non-preferred data to the preferred form before being shared over the data sharing medium. Similarly, when the target organisation uses the preferred data, they can be readily received from the data sharing medium. When the target organisation does not use preferred data, they must translate the preferred data to their non-preferred form. The data sharing concept states that shared data are transmitted over the data sharing medium as preferred data. Any organisation, whether source or target, that does not have or use data in the preferred form is responsible for translating the data.”
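Brackett’s data sharing concept can be sketched in a few lines of code. This is a minimal illustration, and all the function and field names are my own assumptions, not taken from the books: each organisation translates between its local, non-preferred form and the agreed preferred form, so the data sharing medium only ever carries preferred data.

```python
# A minimal sketch of the "data sharing concept" (all names are illustrative).
# The agreed "preferred form" here is a dict with agreed field names;
# each organisation maps its own non-preferred form to and from it.

def source_to_preferred(local_record: dict) -> dict:
    """Source organisation translates its non-preferred form to the preferred form."""
    return {
        "party_name": local_record["cust_nm"].strip().title(),
        "country_code": local_record["ctry"].upper(),  # e.g. "ie" -> "IE"
    }

def preferred_to_target(preferred_record: dict) -> dict:
    """Target organisation translates preferred data to its own non-preferred form."""
    return {
        "CustomerName": preferred_record["party_name"],
        "Country": preferred_record["country_code"],
    }

# The data sharing medium only ever carries preferred data:
preferred = source_to_preferred({"cust_nm": "  acme ltd ", "ctry": "ie"})
received = preferred_to_target(preferred)
print(received)  # {'CustomerName': 'Acme Ltd', 'Country': 'IE'}
```

The point of the design is that each organisation owns exactly one translation (to or from the preferred form), instead of every pair of trading partners owning a bespoke mapping.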

In conclusion:

We data management professionals need to educate both business and IT on the need for, and the benefits of, “plug and play data”. We need to help business leaders understand that data is no longer used by just one business process. We need to explain that even tactical solutions within lines of business need to consider enterprise and business ecosystem demands for data, such as:

  1. Data feeds into regulatory systems
  2. Data feeds to and from other organisations in the supply chain
  3. The ultimate replacement of an application with a newer-generation system

We must educate the business on the increasingly dynamic information requirements of the enterprise and beyond – which can only be satisfied by creating “plug and play data” that can be easily reused and interconnected.

Ken O’Connor is an independent consultant with extensive experience helping multi-national organisations satisfy the Data Quality / Data Governance requirements of regulatory compliance programmes such as GDPR, Solvency II, BASEL II/III, Anti-Money Laundering, Anti-Fraud, Anti-Terrorist Financing and BCBS 239 (Risk Data Aggregation and Reporting).

Ken’s “Data Governance Health Check” provides an independent, objective assessment of your organisation’s internal data management processes to help you to identify gaps you may need to address to comply with regulatory requirements.

Ken is a founding board member of the Irish Data Management Association (DAMA) chapter. He writes a popular industry blog that regularly focuses on a wide range of data management issues faced by modern organisations: (Kenoconnordata.com).

You may contact Ken directly by emailing: Ken@Kenoconnordata.com

Ecosystems are The Future of Digital and MDM

A recent blog post by Dan Bieler of Forrester argues that you should Power Your Digital Ecosystems with Business Platforms.

In his post, Dan Bieler explains that such business platforms support:

·      The infrastructure that connects ecosystem participants. Business platforms help organizations transform from local and linear ways of doing business toward virtual and exponential operations.

·      A single source of truth for ecosystem participants. Business platforms become a single source of truth for ecosystems by providing all ecosystem participants with access to the same data.

·      Business model and process transformation across industries. Platforms support agile reconfiguration of business models and processes through information exchange inside and between ecosystems.

A single source of truth (or trust) for ecosystem participants is something that rings a bell for every Master Data Management (MDM) practitioner. The news is that the single source will not be a single source within a given enterprise, but a single source that encompasses the business ecosystem of trading partners.

[Figure: Gartner Digital Platforms]

Gartner, the other analyst firm, has also recently been advocating digital platforms, where the ecosystem type is the top right one in the figure. As stated by Gartner: Ecosystems are the future of digital.

I certainly agree. This is why all of you should get involved at Master Data Share.

 

Multi-Domain MDM and PIM, Party and Product

Multi-Domain Master Data Management (MDM) and Product Information Management (PIM) are two interrelated disciplines within information management.

While we may see Product Information Management as the ancestor or sister of Product Master Data Management, we will, in my eyes, gain much more from Product Information Management if we treat this discipline in conjunction with Multi-Domain Master Data Management.

Party and product are the most commonly handled domains in MDM. I see their intersections as shown in the figure below:

[Figure: Multi-Side MDM]

Your company is not an island. You are part of a business ecosystem, where you may be:

  • Upstream as the maker of goods and services. For that you need to buy raw materials and indirect goods from the parties being your vendors. In a data-driven world you also need to receive product information for these items. You need to sell your finished products to the midstream and downstream parties being your B2B customers. For that you need to provide product information to those parties.
  • Midstream as a distributor (wholesaler) of products. You need to receive product information from the upstream parties being your vendors, perhaps enrich and adapt the product information, and provide this information to the parties being your downstream B2B customers.
  • Downstream as a retailer or large end user of products. You need to receive product information from the upstream parties being your vendors and enrich and adapt the product information so that you will be the preferred seller to the parties being your B2B and/or B2C customers.

Knowledge about who the parties being your vendors and/or customers are, and how they see product information, is essential to how you must handle product information. How you handle product information is essential to your trading partners.
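The midstream role described above can be sketched as a simple pipeline. This is an illustration only; the field names and the enrichment rules are my own assumptions, not any real standard: receive product information from an upstream vendor, enrich and adapt it, and pass it on downstream.

```python
# Illustrative sketch of a midstream (distributor) product information pipeline.
# Field names and enrichment rules are assumptions, not a real trade standard.

def receive_from_upstream(vendor_item: dict) -> dict:
    """Take a vendor's product record as-is into our own working form."""
    return dict(vendor_item)

def enrich_and_adapt(item: dict) -> dict:
    """Add distributor-specific adaptations before passing the item downstream."""
    item = dict(item)
    # Adapt the description to our house style (a stand-in for real enrichment).
    item["description"] = item["description"].capitalize()
    # Flag whether the record is complete enough for downstream sales channels.
    item["channel_ready"] = all(item.get(f) for f in ("gtin", "description", "image_url"))
    return item

vendor_item = {"gtin": "05712345678900",
               "description": "cordless drill 18V",
               "image_url": "https://example.com/drill.jpg"}

downstream_item = enrich_and_adapt(receive_from_upstream(vendor_item))
print(downstream_item["channel_ready"])  # True
```

The same record would have `channel_ready` set to `False` if, say, the image URL were missing – which is exactly the kind of gap the downstream retailer cares about.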

You can apply party and product interaction for business ecosystems as explained in the post Party and Product: The Core Entities in Most Data Models.

3 Old and 3 New Multi-Domain MDM Relationship Types

Master Data Management (MDM) has traditionally been mostly about party master data management (including, not least, customer master data management) and product master data management. Location master data management has been the third domain, and asset master data management is seen as the fourth – or forgotten – domain.

With the rise of the Internet of Things (IoT), the asset – seen as a thing – is seriously entering the MDM world. In buzzword language, these things are smart devices that produce big data we can use to gain much more insight about parties (in customer roles), products, locations and the things themselves.

In the old MDM world with party, product and location we had 3 types of relationships between entities in these domains. With the inclusion of asset/thing we have 3 more exciting relationship types.

[Figure: Multi-Domain MDM Relations]

The Old MDM World

1: Handling the relationship between a party and its location(s) is one of the core capabilities of a proper party MDM solution. The good old customer table is just not good enough, as explained in the post A Place in Time.

2: Managing the relationship between parties and products is essential in supplier master data management and tracking the relationship between customers and products is a common use case as exemplified in the post Customer Product Matrix Management.

3:  Some products are related to a location as told in the post Product Placement.

The New MDM World

4: We need to be aware of who owns, operates, maintains and has other party roles with any smart device being a part of the Internet of Things.

5: In order to make sense of the big data coming from fixed or moving smart devices we need to know the location context.

6: Further, we must include the product information of the product model for the smart devices.
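The six relationship types above can be sketched as a small data model. This is a rough illustration; the class and attribute names are my own, not a standard:

```python
# A rough sketch of the six multi-domain MDM relationship types.
# All class and attribute names are my own illustration, not a standard model.
from dataclasses import dataclass

@dataclass
class Party:
    name: str

@dataclass
class Product:
    name: str

@dataclass
class Location:
    name: str

@dataclass
class Thing:          # an asset / smart device in the IoT sense
    device_id: str

@dataclass
class Relationship:
    kind: str         # which of the six relationship types this is
    source: object
    target: object

relations = [
    Relationship("party-location", Party("Acme Ltd"), Location("Dublin warehouse")),     # 1: old
    Relationship("party-product", Party("Acme Ltd"), Product("Cordless drill")),         # 2: old
    Relationship("product-location", Product("Cordless drill"), Location("Aisle 7")),    # 3: old
    Relationship("party-thing", Party("Acme Ltd"), Thing("sensor-0042")),                # 4: new (owner/operator)
    Relationship("thing-location", Thing("sensor-0042"), Location("Dublin warehouse")),  # 5: new (location context)
    Relationship("thing-product", Thing("sensor-0042"), Product("Sensor model X")),      # 6: new (product model)
]
print(len(relations))  # 6
```

Modelling the relationships as first-class records, rather than as foreign keys buried in one domain’s table, is what makes it possible to manage all six types with one consistent approach.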

Expanding to Business Ecosystems

In my eyes, it is hard to handle the 3 old relationship types separately within a given enterprise. When including things and the 3 new relationship types, expanding master data management to the business ecosystems you have with trading partners will be imperative as elaborated in the post Data Management Platforms for Business Ecosystems.

Infonomics and Second Party Data

The term infonomics does not yet run unmarked through my English spellchecker, but there is some information available on Wikipedia about infonomics. Infonomics is closely related to the often-mentioned phrases in data management about seeing data / information as an asset.

Much of what I have read about infonomics and seeing data / information as an asset is related to what we call first party data. That is data that is stored and managed within your own company.

Some information is also available in relation to third party data. That is data we buy from external parties in order to validate, enrich or even replace our own first party data. An example is a recent paper co-authored by, among others, infonomics guru Doug Laney of Gartner (the analyst firm). This paper has a high value, if you want to buy it, as seen here.

Anyway, the relationship between data as an asset and the value of data is obvious when it comes to third party data, as we pay a given amount of money for data when acquiring third party data.

Second party data is data we exchange with our trading and other business partners. One example that has been close to me during recent years is the product information that follows the exchange of goods in cross-company supply chains. Here the value of the goods is increasingly dependent on the quality (completeness and other data quality dimensions) of the product information that follows the goods.

In my eyes, we will see an increasing focus on infonomics when it comes to exchanging goods – and the related second party data – in the future. Two basic factors will be:


We Need More Product Data Lake Ambassadors


Product Data Lake is a new solution for sharing product information between trading partners. While we see many viable in-house solutions for Product Information Management (PIM), there is a need for a solution for exchanging product information within cross-company supply chains between manufacturers, distributors and retailers.

Completeness of product information is a huge issue for self-service sales approaches such as ecommerce. 81 % of e-shoppers will leave a webshop when product information is lacking. The root cause of missing product information is often an ineffective cross-company data supply chain, where the exchange of product data is based on sending spreadsheets back and forth via email, or on one-sided solutions such as PIM supplier portals.

However, due to the volume of product data, the velocity required to get data through, and the variety of product data needed today, these solutions are not adequate and will not work for everyone. A non-working environment for cross-company product data exchange is hindering true digital transformation at many organizations within trade.

As a Product Information Management professional, or as a vendor company in this space, you can help manufacturers, distributors and retailers succeed with product information completeness by becoming a Product Data Lake ambassador.

The Product Data Lake addresses some of the most pressing issues in world-wide sharing of product data.

The first forward-looking professionals and vendors in the Product Information Management realm have already joined. I would love to see you as well as our next ambassador.

Interested? Get in contact:

PIM Supplier Portals: Are They Good or Bad?

A recent discussion on the LinkedIn Multi-Domain MDM group is about vendor / supplier portals as a part of Product Information Management implementations.

A supplier portal (or vendor portal, if you like) is usually an extension to a Product Information Management (PIM) solution. The idea is that the suppliers of products, and thus providers of product information, to you as a downstream participant (distributor or retailer) in a supply chain can upload their product information into your PIM solution, thus relieving you of doing that. This process usually replaces the work of receiving spreadsheets from suppliers in the many situations where data pools are not relevant.

In my opinion and experience, this is a flawed concept, because it is hostile to the supplier. The supplier will have hundreds of downstream receivers of products and thus of product information. If all of them introduced their own supplier portal, the supplier would have to learn and maintain hundreds of them. Only if you are bigger than your supplier and are a substantial part of their business will they go along with you.

Another concept, which is the opposite, is also emerging. This is manufacturers and upstream distributors establishing PIM customer portals, where customers can fetch product information. This concept is, in my eyes, flawed in exactly the opposite way.

And then let us imagine that every provider of product information had their PIM customer portal and every receiver had their PIM supplier portal. Then no data would flow at all.

What is your opinion and experience?