Using Internal and External Product Information to Win

When working with product information I usually put the data into this five-level model:

Five levels

The model is explained in the post Five Product Data Levels.

As a downstream participant in supply chains, being a distributor or a retailer, your success depends on whether you can do better than other businesses of your kind (increasingly including marketplaces) fighting over the same customer prospects. One weapon in that fight is product information.

Here you must consider where you should use industry-wide available data, typically coming from the manufacturer, and where you should create your own data.

I usually see that companies tend to use industry-wide available data in the blue section below:

Internal and external product information

The white area, the internally created data, is:

  • Level 1: Basic product data with your internal identifiers as well as supplier data that reflects your business model
  • Level 5: Competitive data with your better product stories, your unique up-sell and cross-sell opportunities and your choice of convincing advanced digital assets
  • Level 3 in part: Your product description (perhaps in multiple languages) that is consistent with other products you sell and a product image that could be the one provided by the manufacturer or one you shoot yourself.

Obviously, creating internal product data that works better than your competitors' is a way to win.

For the blue area, the externally created data, your way of winning relates to how good you are at onboarding this data from your upstream trading partners, being manufacturers and upstream distributors, and how good you are at exploiting available product data pools and industry-specific product data portals.

In doing that, connect is better than collect. You can connect by using Product Data Pull.
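The pull approach can be sketched as follows. This is a minimal illustration of "connect rather than collect", not Product Data Lake's actual API; the `PartnerFeed` class, the record shape and the identifier mapping are all hypothetical assumptions.

```python
# Hedged sketch: pulling product data on demand from an upstream partner
# instead of collecting and re-keying spreadsheets. PartnerFeed and the
# record fields are hypothetical illustrations, not a real API.

class PartnerFeed:
    """Stands in for an upstream trading partner's product data service."""
    def __init__(self, records):
        self._records = records  # keyed by the partner's product identifier

    def pull(self, partner_id):
        """Return the partner's current data for one product, or None."""
        return self._records.get(partner_id)

def refresh_product(internal_record, feed, id_mapping):
    """Look up the partner's identifier for our internal product, pull the
    latest attributes, and merge them into our record."""
    partner_id = id_mapping.get(internal_record["sku"])
    if partner_id is None:
        return internal_record  # no upstream link established yet
    pulled = feed.pull(partner_id)
    if pulled:
        # Upstream attributes fill gaps; internal values win where already set.
        merged = dict(pulled)
        merged.update({k: v for k, v in internal_record.items() if v is not None})
        return merged
    return internal_record

feed = PartnerFeed({"MFR-001": {"weight_kg": 2.4, "colour": "red"}})
record = {"sku": "OUR-42", "weight_kg": None, "colour": None}
updated = refresh_product(record, feed, {"OUR-42": "MFR-001"})
print(updated["weight_kg"])  # → 2.4
```

The point of pulling is that the upstream record is fetched when needed, so a change at the manufacturer flows through on the next refresh without anyone re-sending a file.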

Three Game Changers within Product Information Management

Product Information Management (PIM) is a fast-growing discipline enabled by PIM platforms. While the current market for PIM platforms is largely about supporting consistent in-house management of the information related to the product models we make, buy and sell, there are new opportunities arising. Three of them on my radar are:

Internet of Things (IoT)

With the rise of IoT and the related theme Industry 4.0 we will in the future not just have to deal with the product model but also with each physical instance of that product. As an example of how many product groups that might embrace, read how IKEA is thinking about embedding its furniture with artificial intelligence.

Value webs

The recent buzzword in the chain starting with “supply chain” and continuing with “value chain” is “value web”. Learn about the arrival of continuously evolving business ecosystems and value webs in this article from Deloitte University Press. Product information management encompassing business ecosystems will be imperative in value webs.

Product Data Lake

This is, in all humbleness, my own venture: a PIM-2-PIM platform that deals with the current main pain in product information management, namely exchanging product information between trading partners. We do that in an agile and automated way by supporting partnerships in value webs, and we are soon adding things to Product Data Lake.

Get into the game by registering for a trial account on Product Data Lake.

MDM Summit Europe 2017 Preview

Next week we have the Master Data Management (and Data Governance) Summit Europe 2017 in London. I am looking forward to being there.

The Sponsors

Some of the sponsors I am excited to catch up with are:

  • Semarchy, as they have just released the next version of their multi-domain (now promoted as multi-vector) MDM (now promoted as xDM) offering, emphasizing agility, smartness, intelligence and measurability.
  • Uniserv, as they specialize in hosted customer MDM on emerging technology infused with their proven data quality capabilities and at the same time are open to coexistence with other multi-domain MDM services.
  • Experian Data Quality, as they seem to be a new entry into the MDM world coming from very strong support for party and location data quality, however with a good foundation for supporting the whole multi-domain MDM space.

The Speakers

This year there are a handful of Danish speakers. Can’t wait to listen to:

  • Michael Bendixen of Grundfos pumping up the scene with his Data Governance Keynote on Key Factors in Successful Data Governance
  • Charlotte Gerlach Sylvest of Coloplast on taking care of Implementing Master Data Governance in Large Complex Organisations
  • Birgitte Yde and Louise Pagh Covenas of ATP telling how they watch after my pension money while being on a Journey Towards a New MDM System
  • Erika Bendixen of Bestseller getting us dressed up for Making Master Data Fashionable by Transforming Information Chaos into a Governance-Driven Culture.

10 Analyst Firms in the MDM Space

When working with Master Data Management (MDM) it is always valuable to follow the analyst firms that are active on this subject and related subjects such as data quality, data governance and data management in general. You can learn from their insights – and disagreements – on the matters. Here are 10 analyst firms I follow:

Gartner, the large analyst firm known for their magic quadrants, hype cycles and cool vendor lists. There is a lot of brain power in this firm and they have never been caught in admitting a mistake. Quite a lot of posts on this blog mention Gartner.

Forrester, another firm with heaps of analysts. Forrester has, though, been less prominent in the MDM world since Robert Karel left for Informatica. However, there are lots of wider insights to gain, as mentioned in the post Ecosystems are The Future of Digital and MDM.

The MDM Institute, which basically is Aaron Zornes, known as the Father Christmas of MDM. Aaron Zornes was the inspirational source in my recent post called MDM as Managed Service.

The Information Difference, headed by Andy Hayler. They publish a yearly MDM landscape report latest referenced on this blog in the post Emerging Database Technologies for Master Data.

Bloor Group has occasionally made reports about MDM latest mentioned on this blog in the post The MDM Market Wordle.

Ventana Research has been especially active around Product Information Management (PIM) as seen in the recent press release on their Product Information Management Research.

Intelligent Business Strategies, run by Mike Ferguson. No nonsense, plain English insights from around the UK Midlands. Home page here.

Constellation Research, the Silicon Valley perspective. Home page here.

The Group of Analysts has published a series of interviews with MDM and PIM notabilities as for example this one with Richard Hunt of Agility Multichannel on Content Gravity.

Aberdeen Group, a company you as an MDM vendor can hire to put numbers on your blog, as for example Stibo Systems did here.


A Product Information Management (PIM) Solar System

Hundreds of years ago the geocentric model was replaced by heliocentrism, meaning that we recognize that the earth travels around the sun and not the other way around.

When it comes to Product Information Management (PIM), we also need a Copernican Revolution, meaning that it is good to manage product information consistently inside a given company, but it is better to manage product information in the light of the business ecosystem where we participate.

Exchanging product information in the business ecosystems of manufacturers, distributors and merchants cannot work properly by asking all your trading partners to use your version of a spreadsheet – if they don’t get to you first with their version. Nor will self-centered supplier / customer product data portals work as examined in the post PIM Supplier Portals: Are They Good or Bad?

Your company is not a lonely planet. You are part of a business ecosystem, where you may be:

  • Upstream as the maker of goods and services. For that you need to buy raw materials and indirect goods from the parties being your vendors. In a data-driven world you also need to receive product information for these items. You need to sell your finished products to the midstream and downstream parties being your B2B customers. For that you need to provide product information to those parties.
  • Midstream as a distributor (wholesaler) of products. You need to receive product information from upstream parties being your vendors, perhaps enrich and adapt the product information and provide this information to the parties being your downstream B2B customers.
  • Downstream as a retailer/etailer or large end user of product information. You need to receive product information from upstream parties being your vendors and enrich and adapt the product information so you will be the preferred seller to the parties being your B2B customers and/or B2C customers.
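The midstream role above can be sketched as a simple receive-enrich-provide flow. All field names, the markup and the enrichment shown are illustrative assumptions, not a prescribed record format.

```python
# Hedged sketch of the midstream (distributor) role: receive product data
# from an upstream vendor, enrich and adapt it, and provide it downstream.
# The fields and the enrichment logic are illustrative assumptions.

def receive_from_upstream(vendor_record):
    """Take the manufacturer's record as the starting point."""
    return dict(vendor_record)

def enrich_and_adapt(record, markup_pct):
    """Add distributor-specific data: our resale price and a tidied description."""
    record["resale_price"] = round(record["cost_price"] * (1 + markup_pct / 100), 2)
    record["description"] = record["description"].strip().capitalize()
    return record

def provide_downstream(record):
    """Expose only the fields a downstream retailer should see (not our cost)."""
    return {k: record[k] for k in ("sku", "description", "resale_price")}

upstream = {"sku": "P-100", "description": "  stainless steel bolt ", "cost_price": 1.00}
out = provide_downstream(enrich_and_adapt(receive_from_upstream(upstream), markup_pct=25))
print(out)  # → {'sku': 'P-100', 'description': 'Stainless steel bolt', 'resale_price': 1.25}
```

The same three steps apply to the downstream role; only the enrichment differs, since a retailer adapts the information to win the end customer rather than the next link in the chain.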

At Product Data Lake we support business ecosystems in Product Information Management (PIM). And this is not just a nice model. There are concrete business benefits too: five for you and five for your trading partner. Check our 10 business benefits.


MDM as Managed Service

This month I am going to London to attend the Master Data Management Summit Europe 2017.

As a teaser before the conference Aaron Zornes made a post called MDM Market 2017-18: Facts vs. Beliefs (with apologies to current political affairs fans!).

In his article, Aaron Zornes looks at the slow intake of multi-domain MDM, proactive data governance, graph technology and Microsoft stuff, ending by stating that MDM as MANAGED SERVICE = HOT:

“Just as business users increasingly gave up on IT to deliver modest CRM in a timely, cost effective fashion (remember all the Siebel CRM debacles), so too are marketing and sales teams especially looking to improve the quality of their customer data… and pay for it as a “service” rather than as a complex, long-time-to-value capital expenditure that IT manages”.

I second that, having worked with the iDQ™ service years ago, and will add that the same will be true for product data as well and then eventually also for multi-domain MDM.

How that is going to look is explained here on Master Data Share.

Three Ways of Embracing Digital Ecosystem Platforms

Gartner, the analyst firm, has recently promoted their take on the five kinds of digital platforms you will need to consider in your digital transformation journey.

The top right kind of platform is the ecosystem one. This kind of platform will facilitate how you interact with business partners.

In my eyes, there are three ways you can do that:

  1. You provide and own an ecosystem digital platform for your business partners
  2. You participate in an ecosystem digital platform provided and owned by one of your business partners
  3. You participate in a neutrally provided and owned ecosystem digital platform for a given purpose

Currently I am working with Product Data Lake, which is the third kind of platform. In this ecosystem digital platform you can exchange product information with your trading partners. There are alternatives of the other kinds as discussed in the post PIM Supplier Portals: Are They Good or Bad?

Data Quality for the Product Domain vs the Party Domain

Same Same But Different

The difference between solving data quality issues for party (customer, supplier and other business partner) master data and product master data was discussed 7 years ago on this blog in the post Same Same But Different.

Some data quality dimensions

Since then I have worked intensively with both party master data and product master data and the data quality challenges organizations have within these domains.

Building on the findings from 7 years ago and recent experiences, I think there are two areas worth emphasizing:

  • Data Quality Dimensions: All dimensions are important and they support each other in solving the issues. But there are some differences as explained in the post Multi-Domain MDM and Data Quality Dimensions. In my mind, uniqueness is the worst pain for party master data and completeness is the worst pain for product master data.
  • External Data Sources: The use of data sources was examined in the post 1st Party, 2nd Party and 3rd Party Master Data. In my mind, extensive utilization of third party data is paramount for party master data quality and effective exchange of second party data is paramount for product master data quality.
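The two pains named above can be illustrated with minimal checks. The normalization rule and the list of required product attributes are illustrative assumptions; real matching and completeness rules are far richer.

```python
# Hedged sketch of the two worst pains named above: uniqueness for party
# master data (duplicate detection) and completeness for product master
# data (missing attributes). Rules shown are illustrative assumptions.

def normalize_party(name):
    """Crude normalization so near-identical party names collide."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def find_duplicate_parties(parties):
    """Uniqueness pain: flag parties that normalize to the same key."""
    seen, dupes = {}, []
    for p in parties:
        key = normalize_party(p)
        if key in seen:
            dupes.append((seen[key], p))
        else:
            seen[key] = p
    return dupes

def missing_attributes(product, required=("description", "weight_kg", "image_url")):
    """Completeness pain: list required product attributes not yet filled."""
    return [a for a in required if not product.get(a)]

print(find_duplicate_parties(["Acme Ltd.", "ACME Ltd", "Globex"]))
# → [('Acme Ltd.', 'ACME Ltd')]
print(missing_attributes({"sku": "P-100", "description": "Bolt"}))
# → ['weight_kg', 'image_url']
```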

A Sharing Concept

For solving both party master data and product master data quality issues you need Multi-Domain MDM for business ecosystems as proposed in the Master Data Share concept.

Five Product Classification Standards

When working with Product Master Data Management (MDM) and Product Information Management (PIM) one important facet is classification of products. You can use your own internal classification(s), being product grouping and hierarchy management, within your organization and/or you can use one or several external classification standards.

Five External Standards

Some of the external standards I have come across are:

UNSPSC

The United Nations Standard Products and Services Code® (UNSPSC®), managed by GS1 US™ for the UN Development Programme (UNDP), is an open, global, multi-sector standard for classification of products and services. This standard is often used in public tenders and at some marketplaces.

GPC

GS1 has created a separate standard classification named GPC (Global Product Classification) within its network synchronization called the Global Data Synchronization Network (GDSN).

Commodity Codes / Harmonized System (HS) Codes

Commodity codes, lately being worldwide harmonized and harmonised, represent the key classifier in international trade. They determine customs duties, import and export rules and restrictions as well as documentation requirements. National statistical bureaus may require these codes from businesses doing foreign trade.

eClass

eCl@ss is a cross-industry product data standard for classification and description of products and services, emphasizing being an ISO/IEC-compliant industry standard both nationally and internationally. The classification guides the eCl@ss standard for product attributes (in eCl@ss called properties) that are needed for a product with a given classification.

ETIM

ETIM develops and manages a worldwide uniform classification for technical products. This classification guides the ETIM standard for product attributes (in ETIM called features) that are needed for a product with a given classification.

The Competition and The Neutral Hub

If you click on the links to some of these standards you may notice that they are actually competing against each other in the way they represent themselves.

At Product Data Lake we are the neutral hub in the middle of everyone. We cover your internal grouping and tagging to any external standard. Our roadmap includes closer integration with the various external standards, embracing both product classification and product attribute requirements in multiple languages where provided. We do that with the aim of letting you exchange product information with your trading partners, who probably do the classification differently from you.
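Tagging an internal grouping to external standards boils down to a mapping table. A minimal sketch is below; the mapping and all codes shown are illustrative placeholders, not verified UNSPSC, ETIM or eCl@ss codes.

```python
# Hedged sketch of mapping an internal product grouping to external
# classification standards. All codes below are illustrative placeholders,
# not verified codes from the actual standards.

CLASSIFICATION_MAP = {
    "power-tools/drills": {          # internal grouping (our own hierarchy)
        "UNSPSC": "27112700",        # placeholder code
        "ETIM": "EC000134",          # placeholder class
        "eCl@ss": "21-01-01-01",     # placeholder class
    },
}

def classify(internal_group, standard):
    """Translate our internal grouping into a trading partner's standard."""
    codes = CLASSIFICATION_MAP.get(internal_group)
    if codes is None or standard not in codes:
        raise KeyError(f"No {standard} mapping for {internal_group!r}")
    return codes[standard]

print(classify("power-tools/drills", "ETIM"))  # → EC000134
```

A hub maintaining such mappings once, per grouping, spares each pair of trading partners from negotiating their own translation.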

Plug and Play – The Future for Data

What does the future for data and the need for power when travelling have in common? A lot, as Ken O’Connor explains in today’s guest blog post:

Bob Lambert wrote an excellent article recently summarising the New Direction for Data set out at Enterprise Data World 2017 (#EDW17).  As Bob points out “Those (organisations) that effectively manage data perform far better than organisations that don’t”. A key theme from #EDW17 is for data management professionals to “be positive” and to focus on the business benefits of treating data as an asset.  On a related theme, Henrik on this blog has been highlighting the emergence and value to be derived from business ecosystems and digital platforms.  

Building on Bob and Henrik’s ideas, I believe we need a paradigm shift in the way we think and talk about data.  We need to promote the business benefits of data sharing via “Plug and Play Data”.

When we travel, we expect to be able to use our mobile devices anywhere in the world. We do this by using universal adaptors that convert country specific plug shapes and power levels for us.

We need to apply the same concept to data. To enable data to be more easily reused across and between enterprises, we need to create “plug and play data”.         

How can organisations create “plug and play data”?

In the past, organisations could simply verify that the data they create / capture / ingest and share conforms to the business rules for their own organisation. That “silo-based” approach is no longer tenable. In today’s world, as Henrik points out, organisations increasingly play a role within a business ecosystem, as part of a data supply chain. Hence they need to exchange data with business partners. To do this, they need to apply a “Data Sharing Concept” within a “Common Data Architecture” as set out by Michael Brackett in his excellent books “Data Resource Simplexity” and “Data Resource Integration”. Michael describes a “Data Sharing Medium”, which is similar in concept to the universal adaptor above. For data sharing, this involves organisations within a given business ecosystem agreeing a “preferred form” for data sharing.


I quote Michael: “The Common Data Architecture provides a construct for readily sharing data. When the source data are not in the preferred form, the source organisation must translate those non-preferred data to the preferred form before being shared over the data sharing medium. Similarly, when the target organisation uses the preferred data, they can be readily received from the data sharing medium. When the target organisation does not use preferred data, they must translate the preferred data to their non-preferred form. The “data sharing concept” states that shared data are transmitted over the data sharing medium as preferred data. Any organisation, whether source or target, that does not have or use data in the preferred form is responsible for translating the data.”
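The data sharing medium can be sketched as two edge translations around an agreed preferred form, just like the universal adaptor. The preferred form chosen here (metric units, lowercase attribute names) is an illustrative assumption, not something the concept prescribes.

```python
# Hedged sketch of the data sharing concept quoted above: data travels over
# the sharing medium in an agreed preferred form; any party not using that
# form translates at its own edge. Preferred form here is an assumption:
# metric units, lowercase attribute names.

def source_to_preferred(record):
    """Source edge: translate non-preferred data (pounds, mixed-case keys)
    to the ecosystem's preferred form (kilograms, lowercase keys)."""
    preferred = {}
    for key, value in record.items():
        if key.lower() == "weight_lb":
            preferred["weight_kg"] = round(value * 0.45359237, 3)
        else:
            preferred[key.lower()] = value
    return preferred

def preferred_to_target(record):
    """Target edge: a target not using the preferred form translates it to
    its own non-preferred form (here: back to pounds, upper-case keys)."""
    target = {}
    for key, value in record.items():
        if key == "weight_kg":
            target["WEIGHT_LB"] = round(value / 0.45359237, 3)
        else:
            target[key.upper()] = value
    return target

shared = source_to_preferred({"SKU": "P-100", "Weight_LB": 10.0})
print(shared)    # the record as it travels over the sharing medium
received = preferred_to_target(shared)
print(received["WEIGHT_LB"])  # → 10.0
```

Note that neither party needs to know the other's internal form; each only maintains its own translation to and from the preferred form, which is what makes the data "plug and play".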

In conclusion:

We Data Management Professionals need to educate both Business and IT on the need for, and the benefits of “plug and play data”. We need to help business leaders to understand that data is no longer used by just one business process. We need to explain that even tactical solutions within Lines of Business need to consider Enterprise and business ecosystem demands for data such as:

  1. Data feeds into regulatory systems
  2. Data feeds to and from other organisations in the supply chain
  3. Ultimate replacement of application with newer generation system

We must educate the business on the increasingly dynamic information requirements of the Enterprise and beyond – which can only be satisfied by creating “plug and play data” that can be easily reused and interconnected.

Ken O’Connor is an independent consultant with extensive experience helping multi-national organisations satisfy the Data Quality / Data Governance requirements of regulatory compliance programmes such as GDPR, Solvency II, BASEL II/III, Anti-Money Laundering, Anti-Fraud, Anti-Terrorist Financing and BCBS 239 (Risk Data Aggregation and Reporting).

Ken’s “Data Governance Health Check” provides an independent, objective assessment of your organisation’s internal data management processes to help you to identify gaps you may need to address to comply with regulatory requirements.

Ken is a founding board member of the Irish Data Management Association (DAMA) chapter. He writes a popular industry blog that regularly focuses on a wide range of data management issues faced by modern organisations: (Kenoconnordata.com).

You may contact Ken directly by emailing: Ken@Kenoconnordata.com