The Real Reason Why Your Business Needs a PIM Tool

Today’s guest blog post is the second one from Dan O’Connor, a United States-based product data taxonomy guru. Here are Dan’s thoughts on why you should have a Product Information Management (PIM) tool:

Over the past year I have moved from a position of watching a Product Information Management tool, or PIM, being installed, to working for a PIM vendor, to working through the process of installing a PIM tool from the client side. In the same way that I justified buying a sports car to my wife based on the utilitarian value of having 350 horsepower at my disposal, I’ve seen many different justifications for installing a PIM tool. From “Micro Moments” to “collaborative data collection” and “syndication”, terms are tossed around that attempt to add to the value of a PIM installation.

The simple truth is there is only one reason you need a PIM tool. Every other justification addresses a symptom of a data problem in a business, not the core problem. Every good management executive knows that solving symptoms is a rabbit hole that can cost time and money at an incredible rate, so understanding the core problem that a PIM solves is vital to your business growth.

Controlling your Messaging

That core problem your business needs to solve is product messaging. Simply put, without a central hub for your data, your business lacks control over how your product messaging is spread both internally and externally. If you are still working in spreadsheets or collecting data multiple times for a single product for different channels, you have lost most of your product messaging structure.

PIM is a tool that solves that problem, and the symptoms that come with it. Does your business spend too much time assembling data to meet downstream partner needs? You have a product messaging problem. Is your business’ ability to ingest data limited by spreadsheets transferred over network folders or email? You have a product messaging problem.

All the benefits of PIM can be summed up in a simple statement: if you want to be in control of your product brand and your product data quality, your business needs a PIM tool. Do you want to reduce product data setup costs? You need a central location for all your product messaging to do so. Does your business have product data quality issues caused by poor adherence to best practices? Poor data quality affects your product messaging, and can be solved by a PIM tool. Is your business spending too much time chasing down emails with product specs and spreadsheets full of setup data? These bad workflow practices affect your ability to provide a consistent message downstream to your business partners, whether your business is B2B or B2C. They are a symptom of poor product messaging control.

The True PIM ROI Story

The central premise of a PIM tool is to standardize and normalize your product data collection and setup workflows and processes. If your business looks at a PIM tool only through this lens, your vision for PIM is limited. Syndication, the distribution of data to consuming internal and external systems, is another huge benefit of PIM. However, if the product messaging your PIM system is sending or receiving is not well controlled within your PIM, your vision is incomplete. There is not a single benefit of PIM to which you cannot append the phrase “with a consistent approach to your product messaging”.

Why is product messaging so important? In previous blogs I have demonstrated how failures in product messaging lead to odd product experiences, especially when you look at the messaging across platforms. If your web store shows one length for a product and your channel partner shows a different length, you have a product messaging problem. If that product data came from a central source, that issue would not exist. It might be as simple as the downstream partner swapping length for depth, so there isn’t a true data issue, but to your customers there is an inconsistent product data message.

Extrapolating this out to something as simple as web descriptions validates this business case. If you provide a basic web description for a product by having an individual manually type marketing copy into a web portal, you have lost control of your product messaging. That same person may be responsible for typing that web description in 4 different places, and without a central repository for that data, the chances that those 4 messages will complement each other are slim. Add to that the fact that many major retailers edit web descriptions to conform to their standards after your business has completed product setup, and you are less in control of your product messaging than you imagined.

Having a PIM tool solves this. You have a single source for web descriptions that you know will be represented in a singular repeatable fashion downstream. You can map your dimension attributes to your downstream channel partner dimensions, ensuring that the appropriate data appears in each field. You can customize web descriptions in a controlled and normalized environment so that you have more control over how those descriptions are customized by your channel partners.
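As a minimal illustration of the dimension mapping described above, a central product record can be projected onto each channel partner’s schema, so the same value always lands in the right downstream field. All product, field, and channel names here are hypothetical:

```python
# A central product record, as it might live in a PIM (all names are made up).
product = {
    "sku": "CHAIR-001",
    "length_cm": 55,
    "width_cm": 60,
    "depth_cm": 58,
    "web_description": "Ergonomic office chair with lumbar support.",
}

# Per-channel mapping: central attribute -> the channel partner's field name.
CHANNEL_MAPPINGS = {
    "retailer_a": {
        "length_cm": "Length (cm)",
        "depth_cm": "Depth (cm)",
        "web_description": "Description",
    },
    "retailer_b": {
        "length_cm": "item_length",
        "width_cm": "item_width",
        "web_description": "long_desc",
    },
}

def to_channel(product: dict, channel: str) -> dict:
    """Project the central record onto one channel's schema."""
    mapping = CHANNEL_MAPPINGS[channel]
    return {their_field: product[our_field] for our_field, their_field in mapping.items()}

print(to_channel(product, "retailer_a"))
```

Because every channel feed is derived from the same central record, a length can never silently become a depth on one partner’s site: the mapping, not a person retyping values, decides where each attribute goes.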

The Importance of Product Messaging

Product messaging is your voice to your customers. As B2B ecommerce follows the path blazed by B2C, it has become more important to have a consistent and controlled message for your products to all your customers. Spreadsheets are not capable of that task, and email is not a mechanism for maintaining product data quality. Automated systems with proper workflows and data quality checks are paramount to ensuring the voice you expect your customers to hear is your business’ voice.

Reducing catalog printing costs, syndication of product data to channel partners, and reducing product setup headcount are valid reasons to install a PIM tool. However, they all should be part of a greater goal to control your voice to your customers. Those benefits are symptoms of a need in your business to have a unifying voice, and not including product messaging control as the overriding goal of your PIM installation is a strategic error.

Having performed many PIM installations, here is an example of the impact of not seeing product messaging control as the overarching goal. A company I worked with went through the process of installing a PIM tool, and we reached the point of remediating their existing product data to fit the new model. This company, which had invested heavily in this project, decided they did not want to perform any data remediation. They simply added back into their PIM tool every attribute that had existed in their old system. There was no vision to improve the data they were displaying to their customers: they simply wanted to speed up product setup.

That business has spent the last 6 months undoing the benefits of controlled product messaging. It was less costly for them in the short term to simply replicate their existing data issues in a new system. Their old product data was unwieldy, hyper-specific to each channel, and involved writing product titles and web descriptions manually for each channel. There is no common theme to the product messaging they are creating, and their ability to reduce product setup costs has been hampered by these decisions.

In Summary: Product Data is Your Product Messaging

Micro moments and product experience management are just fancy terminology for what is simply an understanding of the importance of your product data. If your vision is to control your product messaging, you have to start with your product data. A PIM tool is the only functional approach that meets that goal, but it has to be looked at as a foundational piece of that product messaging. Attempting to reduce product setup costs or speed up product data transfer is a valid business goal and a justification for a PIM project, but the true visionary approach has to include an overall product messaging strategy. Otherwise, your business is limiting the return on investment it will achieve from any attempt to solve its product data setup and distribution problems.

Dan O’Connor is a Product Taxonomy, Product Information Management (PIM), and Product Data Consultant and an avid blogger on taxonomy topics. He has developed taxonomies for major retailers as well as manufacturers and distributors, and assists with the development of product data models for large and small companies. See his LinkedIn bio for more information.

IoT and Multi-Domain MDM

The Internet-of-Things (IoT) is a hot topic and many Master Data Management (MDM) practitioners as well as tool and service vendors are exploring what the rise of the Internet-of-Things and the related Industry 4.0 themes will mean for Master Data Management in the years to come.

In my eyes, connecting these smart devices and exploiting the big data you can pull (or that is pushed) from them will require a lot from all Master Data Management domains. Some main considerations will be:

  • Party Master Data Management is needed to know about the many roles you can apply to a given device. Who is the manufacturer, vendor, supplier, owner, maintainer and collector of data? Privacy and security matters on that basis will have to be taken very seriously.
  • Location Master Data Management is necessary at a much deeper and more precise level than what we are used to when dealing with postal addresses. You will need to know a home location with a timespan, and you will need to confirm it and, for moving devices, supplement it with observed locations, each with a timestamp.
  • Product and Asset Master Data Management is imperative in order to know about the product model of the smart device and individual characteristics of the given device.
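The three domains above could come together in a single device master record: party roles per device, a home location with a timespan, timestamped observed locations, and a link to the product model. The following is only a sketch of such a record; every field name and value is a made-up assumption, not a prescribed model:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ObservedLocation:
    """An observed position for a (possibly moving) device, with a timestamp."""
    lat: float
    lon: float
    observed_at: datetime

@dataclass
class DeviceMasterRecord:
    device_id: str
    product_model: str                    # link to the Product MDM domain
    # Party MDM: role name -> party identifier (manufacturer, owner, collector of data, ...)
    party_roles: dict = field(default_factory=dict)
    # Location MDM: a home location valid from a point in time (the timespan start)
    home_location: tuple = None           # (lat, lon)
    home_valid_from: datetime = None
    # ... supplemented with timestamped observations, far beyond postal addresses
    observed_locations: list = field(default_factory=list)

device = DeviceMasterRecord(
    device_id="THERMO-42",
    product_model="AcmeTherm X1",
    party_roles={"manufacturer": "acme", "owner": "cust-007", "data_collector": "acme-cloud"},
    home_location=(55.676, 12.568),
    home_valid_from=datetime(2017, 1, 1),
)
device.observed_locations.append(ObservedLocation(55.677, 12.570, datetime(2017, 2, 1, 9, 30)))
```

The point of the sketch is that no single domain is enough: answering “whose data is this, where was it measured, and what kind of device measured it?” touches party, location and product master data at once.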

It is also interesting to consider whether you will be able to manage this connectivity within an MDM platform (even a multidomain and end-to-end one) behind your corporate walls. I do not think so, as told in the post The Intersections of 360 Degree MDM.

Big Data Fitness

A man with one watch knows what time it is, but a man with two watches is never quite sure. This old saying could be modernized: a person with one smart device knows the truth, but a person with two smart devices is never quite sure.

An example from my own life is measuring my daily steps in order to motivate me to be more fit. Currently I have two data streams coming in. One is managed by the app Google Fit and one is managed by the app S Health (from Samsung).

This morning a same-time snapshot looked like this:

Google Fit: (screenshot)

S Health: (screenshot)

So, how many steps did I take this morning? 2,047 or 2,413?

The steps are presented on the same device: a smartphone. They are, though, measured on two different devices. Google Fit data are measured on the smartphone itself, while S Health data are measured on a connected smartwatch. Therefore, I might not be wearing these devices in exactly the same way. For example, I am the kind of Luddite who does not bring the phone to the loo.

With the rise of the Internet of Things (IoT) and the expected intensive use of the big data streams coming from all kinds of smart devices, we will face heaps of similar cases, where we have two or more sets of data telling the same story in a different way.

A key to utilizing these data in the best way is to understand what these data come from and where. Knowing that is achieved through modern Master Data Management (MDM).
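The step-count example above can be sketched in code: each reading carries provenance metadata (which device measured it), and a selection rule uses that metadata to pick the best-fit value. The priority rule here is a made-up assumption for illustration (a worn smartwatch presumably observes more movement than a phone left behind); the point is that no such rule is even possible without knowing what and where each reading comes from:

```python
# Two step-count readings for the same morning, each tagged with provenance.
readings = [
    {"steps": 2047, "source_device": "smartphone", "app": "Google Fit"},
    {"steps": 2413, "source_device": "smartwatch", "app": "S Health"},
]

# Hypothetical trust ordering: lower number = preferred source.
DEVICE_PRIORITY = {"smartwatch": 1, "smartphone": 2}

def pick_best(readings: list) -> dict:
    """Select the reading from the most trusted source device."""
    return min(readings, key=lambda r: DEVICE_PRIORITY[r["source_device"]])

best = pick_best(readings)
print(f"{best['steps']} steps (per {best['app']} on the {best['source_device']})")
```

Strip the `source_device` metadata away and the two numbers become irreconcilable; keep it, and a survivorship rule can be applied, which is exactly the kind of knowledge master data management maintains.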

At Product Data Lake we, in all humbleness, are supporting that by sharing data about the product models for smart devices and, in the future, by sharing data about each device, as told in the post Adding Things to Product Data Lake.

We Need More Product Data Lake Ambassadors


Product Data Lake is a new solution for sharing product information between trading partners. While we see many viable in-house solutions for Product Information Management (PIM), there is a need for a solution to exchange product information within cross-company supply chains between manufacturers, distributors and retailers.

Completeness of product information is a huge issue for self-service sales approaches as seen in ecommerce. 81% of e-shoppers will leave a webshop with lacking product information. The root cause of missing product information is often an ineffective cross-company data supply chain, where the exchange of product data is based on sending spreadsheets back and forth via email or on biased solutions such as PIM supplier portals.

However, due to the volume of product data, the velocity required to get data through, and the variety of product data needed today, these solutions are in no way adequate, nor will they work for everyone. A non-working environment for cross-company product data exchange is hindering true digital transformation at many organizations within trade.

As a Product Information Management professional, or as a vendor company in this space, you can help manufacturers, distributors and retailers be successful with product information completeness by becoming a Product Data Lake ambassador.

The Product Data Lake encompasses some of the most pressing issues in world-wide sharing of product data:

The first forward-looking professionals and vendors in the Product Information Management realm have already joined. I would love to see you as our next ambassador as well.

Interested? Get in contact:

IT is not the opposite of the business, but a part of it

During my professional work, and not least when following the data management talk on social media, I often stumble upon sayings such as:

  • IT should not drive a CRM / MDM / PIM / XXX project. The business should do that.
  • IT should not be responsible for data quality. The business should be that.

I disagree with that. Not because the business should not do and be those things, but because IT should be a part of the business.

I have personally always disliked the concept of dividing a company into IT and the business. It is a concept practically only used by the IT (and IT focused consulting) side. In my eyes, IT is part of the business just as much as marketing, sales, accounting and all the other departmental units.

With the rise of digitalization the distinction between IT and the business becomes absolutely ridiculous – not to say dangerous.

We need business-minded IT people and IT-savvy business people to drive digitalization and take responsibility for data quality.

Used abbreviations:

  • IT = Information Technology
  • CRM = Customer Relationship Management
  • MDM = Master Data Management
  • PIM = Product Information Management

Who will become Future Leaders in the Gartner Multidomain MDM Magic Quadrant?

Gartner emphasizes that the new Magic Quadrant for Master Data Management Solutions, published 19 January 2017, is not solely about multidomain MDM or a consolidation of the two retired MDM quadrants for customer and product master data. However, a long way down the report, it still is.

If you want a free copy, both Informatica and Riversand offer one.

The Current Pole Position and the Pack

The possible positioning was the subject of a post here on the blog some while ago, called The Gartner Magic Quadrant for MDM 2016. The term 2016 has, though, been omitted from the title of the final quadrant, probably because it took into 2017 to finalize the report, as reported in the post Gartner MDM Magic Quadrant in Overtime.

Below is my look at the positioning in the current quadrant:

mdm-mq

Starting with the multidomain MDM point, the two current leaders, Informatica and Orchestra Networks, have made their way to multidomain in two different ways. Pole position vendor Informatica has used mergers and acquisitions, with the old Siperian MDM solution and the Heiler PIM (Product Information Management) solution, to build its multidomain MDM leadership. Orchestra Networks has built a multidomain MDM solution from the ground up.

The visionary Riversand is coming in from the Product MDM / PIM world as a multidomain MDM wannabe, and so is the challenger Stibo. I think SAP is in its right place: enormous ability to execute with not so much vision.

If you go through the strengths and cautions of the various vendors, you will find a lot of multidomain MDM views from Gartner.

The Future Race

While the edges of the challenger and visionary quadrants are usually empty in a Gartner Magic Quadrant, the top right in this first multidomain MDM quadrant from Gartner is noticeably empty too. So who will we see there in the future?

Gartner mentions some interesting upcoming vendors that do not yet earn enough to be included. Examples are Agility Multichannel (a Product Data Lake ambassador, by the way), Semarchy and Reltio.

The future race track will, according to Gartner, go through:

  • MDM and the Cloud
  • MDM and the Internet of Things
  • MDM and Big Data

PS: At Product Data Lake we are heading there at full speed too. Therefore, it will be a win-win to see more MDM vendors joining as ambassadors or even becoming more involved.

Data Born Companies and the Rest of Us

This post is a new feature here on this blog: guest blogging by data management professionals from all over the world. First up is Harri Juntunen, Partner at Twinspark Consulting in Finland:

Data, and clever use of data in business, has had and will have a significant impact on value creation in the next decade. That is beyond reasonable doubt. What is less clear is how this is going to happen. Before we answer that question, I think it is meaningful to make a conceptual distinction between data born companies and the rest of us.

Data born companies are companies that were conceived from data. Their business models are based on monetising clever use of data. They have organised everything from their customer service to their operations to maximally harness data. Data, and the capabilities to use data to create value, are their core competency. These companies are the giants of data business: Google, Facebook, Amazon, Uber, AirBnB. The standard small talk topics in data professionals’ discussions.

However, most companies are not data born. Most companies were originally established to serve a different purpose. They were founded to serve some physical need and to actually maintain things physically, be it food, spare parts or factories. Obviously, all of these companies in, e.g., manufacturing and maintenance of physical things need data to operate. Yet these companies are not organised around the principles of data born companies, with capabilities to harness data as the driving force of their businesses.

We hear a lot of stories and successful examples about how data born companies apply augmented intelligence and other recent technology achievements. Surely, technologies built around data are important. The key question to me is: what, in practice, is our capability to harness all of these opportunities in companies that are not data born?

In my daily practice I see Excel sheets floating around and between companies, and a lot of manual work caused by unstandardised data, poor governance and bad data quality. Manual data work simply prevents companies from harnessing the capabilities created by data born companies. Yet most companies follow the data born track without sufficient reflection. They adopt the latest technologies used by the data born companies. They rephrase the same slogans: automation, advanced analytics, cognitive computing etc. And yet they are not addressing the fundamental and mundane issues in their own capabilities to be able to do business and create value with data. Humans are doing a machine’s job.

Why? Many things relate to this, but data quality and standardization are still pressing problems in everyday practice in many companies, let alone between companies. We can change this. The rest of us can be reborn from data just by taking a good look at our mundane data practices instead of aspiring to go for the next big thing.

P.S. The Google Brain team did a Reddit AMA a while ago, and they were asked: “What do you think is underrated?”

The answer:

“Focus on getting high-quality data. “Quality” can translate to many things, e.g. thoughtfully chosen variables or reducing noise in measurements. Simple algorithms using higher-quality data will generally outperform the latest and greatest algorithms using lower-quality data.”

https://www.reddit.com/r/MachineLearning/comments/4w6tsv/ama_we_are_the_google_brain_team_wed_love_to/

About Harri Juntunen:

Harri is a seasoned data provocateur and an ardent advocate of getting the basics right. Harri says: people and data first, technology will follow.

You can contact Harri here:

+358 50 306 9296

harri.juntunen@twinspark.fi

www.twinspark.fi