Be Prepared

Working with data governance and data quality can be a very backward-looking quest. It often revolves around avoiding a repeat of a recent data disaster, or around catching up on the organizational issues, process orchestration and new technology implementations needed to better support current business objectives with current data types.

This may be hard enough. But you must also be prepared for the future.

The growth of data available to support your business is a challenge today. While you sleep, your competitors take advantage of new data sources and better exploitation of known data sources. New competitors emerge with business ideas based on new ways of using data.

The approach to including new data sources, data entities, data attributes and digital assets must be part of your data governance framework and data quality capability. If you are not prepared for this, your current data quality will be challenged not only by the decay of current data elements but also by insufficiently governed new data elements, or by a lack of business agility because you can’t include new data sources and elements in a safe way.

Some essentials in being prepared for inclusion of new kinds of data are:

  • A living business glossary that facilitates a shared understanding of new data elements within your organization, including how they relate to or replace current data elements.
  • Configurable data quality measurement facilities, data profiling functionality and data matching tools, so that on-boarding a new data element doesn’t require a new data quality project (see the sketch after this list).
  • Self-service and automation as the norm for data capture and data consumption. Self-service must be governed both internally in your organization and externally, as explained in the post Data Governance in the Self-Service Age.
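As an illustration of the second point, here is a minimal sketch of what configuration-driven data quality measurement could look like, assuming illustrative rule names, patterns and record shapes; a new data element is on-boarded by adding a rule entry rather than by starting a new project:

```python
# A sketch of configuration-driven data quality checks. Rule names, patterns
# and the record shape are assumptions made for this example.
import re

RULES = {
    "email": {"pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$"},
    "phone": {"pattern": r"^\+?[0-9 ()-]{7,20}$"},
}

def profile(records, field_name, rule):
    """Measure completeness and format conformity for one data element."""
    values = [r.get(field_name) for r in records]
    filled = [v for v in values if v not in (None, "")]
    completeness = len(filled) / len(values) if values else 0.0
    conforming = [v for v in filled if re.match(rule["pattern"], v)]
    conformity = len(conforming) / len(filled) if filled else 1.0
    return {"element": field_name, "completeness": completeness, "conformity": conformity}

records = [
    {"email": "jane@example.com", "phone": "+45 33 12 34 56"},
    {"email": "not-an-email", "phone": ""},
]
for field_name, rule in RULES.items():
    print(profile(records, field_name, rule))
```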


The Matrix

The data governance discipline, the Master Data Management (MDM) discipline and the data quality discipline are closely related and happen to be my fields of work, as told in the post Data Governance, Data Quality and MDM.

Every IT-enabled discipline has an element of understanding people, orchestrating business processes and using technology, though the mix may vary between disciplines. This is also true for the three disciplines mentioned above.

But how important are people, process and technology within each of these three disciplines? Are the disciplines very different in that perspective? I think so.

When assigning a value from 1 (less important) to 5 (very important) to people, process and technology for Data Governance (DG), Master Data Management (MDM) and Data Quality (DQ), I came to this result:

[Figure: The Matrix – the importance of people, process and technology within DG, MDM and DQ]

A few words about the reasoning for the highs and lows:

Data governance is, in my experience, a lot about understanding people and less about using technology, as told in the post Data Governance Tools: The New Snake Oil?

I often see arguments that data quality is all about people too. But:

  • I think you are really talking about data governance when putting the people argument forward in the quest for achieving adequate data quality.
  • I see little room for letting the personal opinions of different people dictate what adequate data quality is. This should really be as objective as possible.

Now I am ready for your relentless criticism.


Data Quality X-mas Stories

Today is the 2nd of December and time for the second x-mas themed post on this blog this year, following up on the early yuletide post about The Shortcut to Lapland.

In a way, focusing too much on Christmas is not in line with diversity, a main subject on this blog, as I know that many readers may have, for example, Eid, Diwali or Chinese New Year as their main days of celebration during the year.

To me, The Holidays are much about having light at a time of year that up north would otherwise be very dark and even depressing. When I am in Copenhagen I live on a cosy square called Gråbrødretorv (Grey Friars Market). In summertime the square is filled with outdoor seating. Not so much in the winter. But then there is a fir tree with lights on.


Anyway, there is lots of stuff in the x-mas theme that you can relate to data quality, including some of the older posts on this blog.


The “Fit for Purpose” Trap

Gartner (the analyst firm), represented by Saul Judah, takes data quality back to basics in a recent post called Data Quality Improvement.

While I agree with the sentiment around measuring the facts as expressed in the post, I am cautious about relying on everything being good when data are fit for the purpose of business operations.

Some clues lie in the data quality dimensions mentioned in the post:

Accuracy (for now):

As said in the Gartner post, data are indeed temporal. The real world changes and so do business operations. By the time you have got your data fit for the purpose of use, the business operations have changed. And by the time you have re-fit your data for the new purpose of use, the business operations have changed again.

Furthermore, most organizations can’t take all business operations into account at the same time. If you go down the fit-for-purpose track you will typically address a single business objective and make data fit for that purpose. Not least when dealing with master data, there are many business objectives and derived purposes of use. In my experience that leads to this conclusion:

“While we value that data are of high quality if they are fit for the intended use, we value more that data correctly represent the real-world construct to which they refer, in order to be fit for current and future multiple purposes”
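To make “accuracy (for now)” concrete, here is a minimal sketch, assuming an illustrative verified_at timestamp and a one-year re-verification horizon, of how the temporal side of accuracy could be guarded:

```python
# A sketch of temporal accuracy: each attribute records when it was last
# verified against the real world, so it can be re-checked as the world and
# the purposes of use change. The 365-day horizon is an assumption.
from datetime import datetime, timedelta

record = {
    "postal_address": {"value": "Gråbrødretorv 1", "verified_at": datetime(2013, 1, 15)},
}

def is_stale(attribute, max_age_days=365, now=None):
    now = now or datetime.now()
    return now - attribute["verified_at"] > timedelta(days=max_age_days)

if is_stale(record["postal_address"]):
    print("Re-verify against real-world sources before the next purpose of use")
```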

Existence – an aspect of completeness:

The Gartner post mentions a data quality dimension called existence. I tend to see this as an aspect of the more broadly used term completeness.

For example, achieving fit-for-purpose completeness of product master data has been a huge challenge for many organizations within retail and distribution during the last years, as explained in the post Customer Friendly Product Master Data.



Omni-purpose MDM

The terms omni-channel banking and omni-channel retailing are becoming popular within businesses these days.

In this context omni (meaning all) is considered to be something more advanced than multi (meaning many) as in multi-channel retailing.

Data management, including Master Data Management (MDM), is always a bit behind the newest business trends. In our discipline we have hardly even entered the multi stage yet.

Some moons ago I wrote about multi-channel data matching on the Informatica Perspectives blog in the post Five Future Data Matching Trends. Today, on the same blog, Stephan Zoder asks: Is your social media investment hampered by your “data poverty”?

In it, Stephan examines the possible benefits of multi-channel data matching, based on a business case from the gambling industry.

Using omni in relation to MDM was seen in a vendor presentation at the Gartner MDM Summit in London last week, as reported in the post Slicing the MDM Space. Omnidomain MDM was the term proposed there.

The end goal should probably be something that could be coined omni-purpose MDM. This will be about advancing MDM capabilities to cover multiple domains and embrace multiple channels in order to obtain a single view of every core entity that can be used in every business process.
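As a rough illustration of that single view, here is a sketch of what an omni-purpose golden record could hold; the shape and field names are assumptions for the example, not an established model:

```python
# A sketch of a golden record spanning domains and channels: one core entity
# carries identifiers from every channel so each business process can resolve
# back to its own view. All identifiers here are made up for the example.
golden_record = {
    "entity_id": "PARTY-0001",
    "domain": "customer",              # one of several master data domains
    "channel_ids": {                   # the same party as seen in each channel
        "web_shop": "u-982",
        "store_loyalty": "L-55301",
        "social": "@jane_smith",
    },
    "attributes": {
        "name": "Jane Smith",
        "postal_address": "Gråbrødretorv 1, Copenhagen",
    },
}

def id_in_channel(record, channel):
    """Resolve the single view back to a channel-specific identifier."""
    return record["channel_ids"].get(channel)

print(id_in_channel(golden_record, "web_shop"))
```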



From B2B and B2C to H2H

I stumbled upon an article from yesterday by Bryan Kramer called There is no more B2B or B2C: It’s Human to Human, H2H.


The article is about the implications for marketing caused by the rise of social media, which now finally seems to eliminate the divide between what we have known as business-to-business (B2B) and business-to-consumer (B2C), more or less merging the two.

As discussed here on the blog several times, starting way back in 2009 in the post Echoes in the Database, a problem with B2B is indeed that while business transactions take place between legal entities, a lot of business processes are carried out between employees related to the selling and buying entities. You may call that employee-to-employee (E2E), people-to-people (P2P) or indeed human-to-human (H2H).

Related to databases, data quality and Master Data Management (MDM) this means we need real-world alignment with two kinds of parties:

  • The legal entities between which the business transactions take place
  • The natural persons (employees and other contacts) who carry out the business processes on behalf of those entities
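A minimal sketch of these two party kinds, using illustrative classes rather than any particular MDM product’s model:

```python
# The B2B transaction is between legal entities; the H2H interaction is
# between persons related to those entities. Class and field names are
# assumptions made for this example.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LegalEntity:
    registry_id: str                 # external real-world reference, e.g. a company registry number
    name: str
    contacts: List["Person"] = field(default_factory=list)

@dataclass
class Person:
    name: str
    email: str
    employer: Optional["LegalEntity"] = None

acme = LegalEntity("12345678", "Acme Ltd")
jane = Person("Jane Smith", "jane@acme.example", employer=acme)
acme.contacts.append(jane)
```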

While B2B and B2C may melt together in the way we do messaging, the distinction between B2B and B2C will remain in many other aspects. Even in social media we see it, as for example two of the most used social networks, Facebook and LinkedIn, clearly belong mainly to B2C and B2B respectively for marketing and social selling purposes.

The different possibilities with B2B and B2C in the H2H world were touched upon in an interview on DataQualityPro last year: What are the Benefits of Social MDM?


When High Quality Data doesn’t Yield High Quality Service

Better data quality is a prerequisite for better quality of service, but unfortunately high quality data doesn’t necessarily lead to high quality service when the data flow is broken. This happened to me last night.

When landing at London Heathrow Airport I usually, economical as I am, take the train to reach my doorstep. However, when I have to catch an early morning flight I order a cab, which actually comes at a very reasonable price. So yesterday I decided to book a cab in order to cut 30 to 40 minutes off the journey home at the expense of a minor amount of extra pounds.

Excellent data capture

Usually I just call the cab company, but as I arrived by airplane and my local cab service is part of an online booking service, I used that service for the first time. The user interface is excellent. There is rapid addressing for entering the pick-up place, which quickly presented me with the possible terminals at Heathrow. The destination was just as smooth. As the pick-up is an airport, they prompted me for the flight number. Very nice, as that makes tracking delays possible for them, and you can also check that the airline and terminal are a correct match.

Also, they have an app that I geekily downloaded to my phablet.

Going down

Landing times at Heathrow are difficult to predict, as it often happens that your flight makes a couple of circles over London before landing due to heavy traffic. Yesterday was good though, as we came straight down and therefore were ahead of schedule.

So it was OK that my name wasn’t on the signs held by the drivers already waiting at the passenger exit. Actually, I was so early that I could have caught the not-so-frequent direct train home. But as I had now already troubled the driver to go there, I of course waited while spending time on the app.

There was actually also driver tracking in the app. Marvelous. At first glance it seemed the driver was there. But then I noticed a message saying that driver tracking wasn’t available and that the spot shown in the terminal 3 building was therefore my own position, or rather the requested pick-up place.

Going crazy

5 minutes after the requested time, the driver called:

“Where are you Mr. Sorensen?”

“I’m at the passenger exit where all drivers are waiting.”

“OK. I’m just parking the car. Go to the front of the coffee shop and I’ll be there in a few minutes.”

I spotted a coffee shop in front of the lifts to the short stay parking and went over there.

10 minutes later the driver called:

“Where are you Mr. Sorensen?”

“I am in front of the coffee shop”

“Costa Coffee?”

“No. It has a different name…”. After some ping-pong I mentioned terminal 3.

“Terminal 3?” the driver responded. “I’m at terminal 5. I was told to go here. I’ll be with you in 5 minutes”.

Going there by car in 5 minutes, I wondered. That would mean crossing the runways or using the train tunnel.

Well, while I spent more happy time on the phablet, the clock approached the point where I would have been at my doorstep using the slow train.

40 minutes after the requested time, the driver arrived. I was waiting for the mandatory sorry that Brits use even when they are not sorry at all.

Instead the driver greeted me with: “Did you order the cab yourself Mr. Sorensen?”

“Yes I did. On the internet.”

“Internet?” the driver replied.

“Your company has an excellent online booking system,” I remarked in a friendly way.

“When I called you first I asked for confirmation about where you were”.

As I realized that he was trying to establish that everything was my fault, I presented the confirmation in the app.

We continued (without the usual small talk) to the destination. Here the driver (instead of offering a discount) presented an upgraded version of the price on the booking confirmation.

At that point it was too difficult to keep calm and carry on…


Getting eMail Addresses Right the First Time

Checking whether an eMail address will bounce is essential for executing and measuring campaigns, newsletter operations and other activities based on sending eMails, as explained on the site Don’t Bounce by BriteVerify.

A good principle within data quality prevention and Master Data Management (MDM) is the first-time-right approach. There is a 1-10-100 rule saying:

“One dollar spent on prevention will save 10 dollars on correction and 100 dollars on failure costs”.

(Replace dollars with your favorite currency: Euros, pounds, rubles, rupees, whatever.)

This also applies to capturing the eMail address of a (prospective) customer or other business partner. Many business processes today require communication through eMails in order to save costs and speed up processes. If you register an invalid eMail address, or allow self-registration of an invalid eMail address, you have got yourself some costly scrap and rework, or maybe a lost opportunity.
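As an illustration of first time right at the point of capture, here is a minimal sketch of checking an eMail address before it enters the master data store. It is an outline only, not the instant Data Quality implementation, and it assumes the dnspython package for the MX lookup:

```python
# A first-time-right sketch: a cheap syntax gate followed by a domain (MX)
# lookup, since a domain without mail servers will bounce. dnspython
# (pip install dnspython) is assumed.
import re
import dns.exception
import dns.resolver

def email_looks_deliverable(address: str) -> bool:
    # One @, a non-empty local part and a dotted domain.
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", address):
        return False
    domain = address.rsplit("@", 1)[1]
    try:
        return len(dns.resolver.resolve(domain, "MX")) > 0
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.exception.Timeout):
        return False

print(email_looks_deliverable("someone@gmail.com"))
print(email_looks_deliverable("someone@no-such-domain.invalid"))
```

A check like this catches the scrap before it is created; a bounce later in a campaign is the 10- or 100-dollar end of the 1-10-100 rule.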

As a natural consequence, the instant Data Quality MDM Edition, besides ensuring right names and correct postal addresses, also checks for valid eMail addresses.


Somehow Deduplication won’t Stick

18 years ago I cruised into the data quality realm when making my first deduplication tool. Back then it was an attempt to solve the business case of two companies who were considering merging and wanted to know the intersection of their customers. So far, so good.

Since then I have worked intensively with deduplication and other data matching tools and approaches, and also co-authored a leading eLearning course on the matter, as seen here.

Deduplication capability is a core feature of many data quality tools, and indeed the probably most mentioned data quality pain is lack of uniqueness, not least in party master data management.

However, in my experience most deduplication efforts don’t stick. Yes, we can process a file ready for direct marketing and purge the messages that might end up in the same offline or online inbox despite spelling differences. But taking it from there and using the techniques to achieve a single customer view is another story. Some obstacles are:

In the comments to the latter, now three-year-old, post the intersection (and non-intersection) of Entity Resolution and Master Data Management (MDM) was discussed.

During my latest work I have become more and more convinced that achieving a single view of something is a lot about entity resolution, as expressed in the post The Good, Better and Best Way of Avoiding Duplicates.
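As a toy illustration of the matching idea, here is a sketch that normalises party names and pair-compares them with a similarity ratio, so that spelling variants group together; the threshold is an assumption, and real matching engines use far richer comparison and survivorship logic:

```python
# A toy deduplication sketch: normalise, then fuzzy-compare all pairs.
from difflib import SequenceMatcher
from itertools import combinations

def normalise(name: str) -> str:
    return " ".join(name.lower().replace(".", " ").split())

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

customers = ["John Sorensen", "Jon Sörensen", "J. Sorensen", "Mary Jones"]
for a, b in combinations(customers, 2):
    if similar(a, b):
        print(f"Possible duplicate: {a!r} ~ {b!r}")
```

Making such candidate pairs stick as a single customer view is where the entity resolution part begins.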


Undertaking in MDM

In the post Last Time Right I touched upon the bad consequences of not handling that one of your customers is no longer among us.

This sad event is a major trigger in party master data lifecycle management like The Relocation Event I described last week.

In the data quality realm, handling so-called deceased data has been much about suppression services in direct marketing. But as we develop more advanced master data services, handling the many aspects of the deceased event turns up as an important capability.

Like with relocation you may learn about the sad event in several ways:

  • A message from relatives
  • Subscription to external reference data services, which will be different from country to country
  • Investigation upon returned mail via postal services

Apart from Business-to-Consumer (B2C) activities, the deceased event also has relevance in Business-to-Business (B2B), where we may call it the dissolved event.

One benefit of having central master data management functionality is that every party role and related business process can be notified about the status, which may trigger a workflow.
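A minimal sketch of that notification benefit, assuming an illustrative publish/subscribe shape where each consuming business process registers its own workflow trigger:

```python
# Central MDM records the deceased (or dissolved) event once and notifies
# every subscribed party role and business process. Names are illustrative.
from collections import defaultdict

subscribers = defaultdict(list)

def subscribe(event_type, handler):
    subscribers[event_type].append(handler)

def publish(event_type, party_id, **details):
    for handler in subscribers[event_type]:
        handler(party_id, **details)

subscribe("deceased", lambda pid, **d: print(f"Billing: stop invoicing {pid}"))
subscribe("deceased", lambda pid, **d: print(f"Marketing: suppress mailings to {pid}"))
subscribe("deceased", lambda pid, **d: print(f"Subscriptions: cancel services for {pid}"))

publish("deceased", party_id="CUST-4711", source="external reference data")
```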

One area where I have worked with handling this situation is public transit, where subscription services for public transport are cancelled upon learning about a death, thus lifting some of the burden from relatives and also avoiding processes for paying back money in this situation.

Right now I’m working with data stewardship functionality in the instant Data Quality MDM Edition, where the relocation event, the deceased event and other important events in party master data lifecycle management must be supported by functionality embracing external reference data and internal master data.
