instant Single Customer View

Achieving a Single Customer View (SCV) is a core driver for many data quality improvement and Master Data Management (MDM) implementations.

As most data quality practitioners will agree, the best way of securing data quality is getting it right the first time. The same is true for achieving a Single Customer View. Get it right the first time. Have an instant Single Customer View.

The cloud-based solution I’m working with right now does this by:

  • Searching external big reference data sources with information about individuals, companies, locations and properties as well as social networks
  • Searching internal master data with information already known inside the enterprise
  • Inserting genuinely new entities or updating current entities by picking as much data as possible from external sources
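The search-before-create flow outlined above can be sketched as follows; the store layout, field names and merge rule are illustrative assumptions for this sketch, not an actual iDQ™ API:

```python
# Illustrative "search before create": names, fields and the merge rule
# are assumptions for this sketch, not an actual iDQ(tm) API.

class MasterData:
    """A toy in-memory master data store keyed by party name."""
    def __init__(self):
        self.parties = {}

    def search(self, name):
        return [p for p in self.parties.values() if p["name"] == name]

    def upsert(self, party):
        self.parties[party["name"]] = party
        return party

def merge_hits(hits):
    """Pick as much data as possible from the hits; first value wins."""
    merged = {}
    for hit in hits:
        for key, value in hit.items():
            merged.setdefault(key, value)
    return merged

def register_party(name, external_sources, master_data):
    """Search external reference data and internal master data, then
    insert a genuinely new entity or update the current one."""
    external_hits = [h for source in external_sources
                     for h in source if h["name"] == name]
    internal_hits = master_data.search(name)
    # What is already known internally wins; external data enriches it.
    party = merge_hits(internal_hits + external_hits)
    return master_data.upsert(party)
```

Matching on exact name equality keeps the toy simple; a real implementation would of course use error tolerant searching.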


Some essential capabilities in doing this are:

  • Searching is error-tolerant so you will find entities even if the spelling is different
  • The receiving data model is real-world aligned. This includes:
    • Party information and location information have separate lives as explained in the post called A Place in Time
    • You may have multiple means of contact attached, like several phones, email addresses and social identities
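A minimal sketch of such a real-world aligned model, with hypothetical type and field names:

```python
# A toy real-world aligned party model: type and field names are
# hypothetical, chosen only to illustrate the structure.
from dataclasses import dataclass, field

@dataclass
class Location:
    """Locations live separately from parties and are linked by id."""
    location_id: str
    address: str

@dataclass
class ContactMethod:
    kind: str   # e.g. "phone", "email" or a social network
    value: str  # the number, address or profile identifier

@dataclass
class Party:
    party_id: str
    name: str
    location_ids: list = field(default_factory=list)  # references, not copies
    contacts: list = field(default_factory=list)      # any number of contact methods
```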

How do you achieve a Single Customer View?


Data Quality Vendors Beware of SEO Agencies

As reported in the post Fighting Identity Fraud with Identity Fraud, and as experienced with the post 255 Reasons for Data Quality Diversity, I have seen several sloppy attempts at link building by SEO agencies working for data quality tool vendors.

The other day it happened again, this time on LinkedIn.

There was a comment in the Master Data Management Interest group:

[Screenshot of the DataLadder SEO comment]

The comment has since been deleted by the author, and I do understand why.

I guess an SEO worker was working for Simon at DataLadder and for Nathan from somewhere else at the same time, and was given access to both their LinkedIn accounts. However, he or she posted a comment meant to be from Simon while logged in as Nathan (who is not working with MDM and data quality).

So, data quality tool and service vendors: You can’t fight identity fraud with identity fraud, and you can’t advocate a single view of the customer with a messy view of you as a vendor. Be authentic.


Beware of False Positives in Data Matching

In a recent blog post by Kristen Gregerson of Satori Software you may read A Terrible Tale, in which two different real-world individuals were merged into one golden record, with the most horrible result you may imagine, associated with a recent special day related to the results of the other kind of matching going around.

Join the Data Matching Group on LinkedIn

As reported by Jim Harris some years ago in the post The Very True Fear of False Positives, the bad things happening from false positives in data matching are indeed a hindrance to doing data matching.

If we do data matching we should be aware that false positives will happen, we should know the probability of them happening and we should know how to avoid the resulting heartache.

Indeed, using a data matching tool is better than relying on simple database indexes, and indeed there are differences in how good various data matching tools are at doing the job, not least doing it under different circumstances, as told in the post What is a best-in-class match engine?
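As a toy illustration of how a false positive can arise, here is a sketch using Python’s standard-library difflib as a stand-in for a real match engine; the threshold is illustrative, not a recommended value:

```python
# Toy false-positive demonstration with Python's standard library;
# a real match engine is far more sophisticated than this.
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.85  # illustrative cut-off, not a recommended value

# Two different real-world individuals at the same address...
a = "John A. Smith, 12 High Street"
b = "John B. Smith, 12 High Street"
score = similarity(a, b)
# ...score well above the threshold, so a naive matcher would merge
# them into one golden record: a false positive.
is_match = score >= THRESHOLD
```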

Curious about how data matching tools work (differently)? There is an eLearning course available co-authored by yours truly. The course is called Data Parsing, Matching and De-duplication.


Data Quality Does Matter!

The title of this blog post is the title of a seminar about data quality and data matching taking place in Copenhagen:

Data Quality Does Matter

The seminar is hosted by Affecto, a data management consultancy firm with a strong presence in the Nordic and Baltic countries, and Informatica, a leading data management tool vendor worldwide.

There will be three sessions at the seminar:

  • First you will learn about steps for working with a data quality platform to improve BI and master data management solutions.
  • Then you will see a walkthrough of the architecture and capabilities of the Informatica Data Quality platform.
  • And finally you shouldn’t miss the session with yours truly on data matching based on an Informatica Perspectives blog post called Five Future Data Matching Trends.

Hope to see you in Copenhagen, København, Köpenhamn, Kopenhagen, Copenhague, Copenaghen, Hafnia or whatever name you use for that place as told in the post about data matching and Diversity in City Names.
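As a playful sketch of handling that diversity in city names, assuming a hand-made variant list (a real match engine would of course go far beyond a lookup table):

```python
# A hand-made variant list for one city; illustrative, not exhaustive.
COPENHAGEN_VARIANTS = {
    "copenhagen", "københavn", "köpenhamn", "kopenhagen",
    "copenhague", "copenaghen", "hafnia",
}

def canonical_city(name):
    """Map known spellings of Copenhagen to one canonical form."""
    return "Copenhagen" if name.strip().lower() in COPENHAGEN_VARIANTS else name
```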


Customer Management, Data Quality and MDM

Today I am visiting the Call Centre and Customer Management Expo 2012 in London and have a chance to learn about what’s going on in this area – and what happens to data quality and master data management.

PostcodeAnywhere

At the PostcodeAnywhere stand the talk is about data quality. PostcodeAnywhere has become a well-known vendor of services for validating addresses in the United Kingdom based on the unique structure of the UK postal code and addressing system. I had a chat with Marketing Executive Ed Nash about the challenges of delivering similar services for all the other countries on the planet with their particular ways of addressing.

Phone Number Testing

Peter Muswell of “The Phone Number Testing Company” describes his company as the best kept secret in customer management. Indeed, I hadn’t heard of this service before. The trick is a service for testing whether a phone number is alive or not – notably without making any ghost calls. The service works in the UK. It works in some other countries and doesn’t work in the rest. Just like most other data quality services.

Social Customer Service

The Salesforce.com stand is all about Social Customer Service. There is plenty of functionality offered for getting social with CRM (Customer Relationship Management). The tricky part, as confirmed by the Salesforce.com representative, is managing customer master data embracing all the traditional data such as addresses and phone numbers as well as the new keys to social data: social network profile identifiers. Sure, there will be a huge demand for Social Master Data Management (Social MDM).


Data Quality along the Timeline

When working with data quality improvement it is crucial to be able to monitor how your various ways of getting better data quality are actually working. Are things improving? Which measures are improving and how fast? Are there things going in the wrong direction?

Recently I had a demonstration by Kasper Sørensen, the founder of the open source data quality tool DataCleaner. The new version 3.0 of the tool has comprehensive support for monitoring how data quality key performance indicators develop over time.

What you do is take classic data quality assessment features such as data profiling measurements of completeness and duplication counting. The results from periodic execution of these features are then attached to a timeline. You can then visually assess what is improving, at what speed, and whether anything is not developing so well.
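The periodic measurement idea can be sketched like this; the KPI functions and snapshot layout are illustrative assumptions, not DataCleaner’s actual API:

```python
# Sketch of data quality KPIs on a timeline; the measures and snapshot
# layout are illustrative, not DataCleaner's actual API.
from datetime import date

def completeness(records, field):
    """Share of records where the field is filled in."""
    return sum(1 for r in records if r.get(field)) / len(records)

def duplicate_count(records, field):
    """Number of surplus (duplicated) values of the field."""
    values = [r[field] for r in records if r.get(field)]
    return len(values) - len(set(values))

timeline = []  # one KPI snapshot per periodic execution

def snapshot(records, day):
    timeline.append({
        "date": day,
        "email_completeness": completeness(records, "email"),
        "name_duplicates": duplicate_count(records, "name"),
    })
```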

Continuously monitoring how data quality key performance indicators are developing is especially interesting in relation to getting data quality right the first time and following up with ongoing data maintenance through enrichment from external sources.

In a traditional downstream data cleansing project you will typically measure completeness and uniqueness twice: once before and once after the execution.

With upstream data quality prevention and automatic ongoing data maintenance you have to make sure everything is running well all the time. Having a timeline of data quality key performance indicators is a great feature for doing just that.


Cross Border Data Quality

In data quality improvement you always have to find a balance between the almost impossible, and usually not sensible, vision of achieving zero percent defects and the good old 80-20 rule of aiming at the 80% most frequent issues and leaving the 20% less frequent issues to a random fate.

One of the issues that usually fall into the neglected 20% is cross border challenges with contact master data.

In a recent blog post on the Postcode Anywhere blog Graham Rhind describes the data quality flaws arising from his relocation from Holland in the Netherlands to Germany. The post is called Validate … intelligently.

Personally I have had a lot of similar issues when moving from Denmark to England in the United Kingdom as for example described in the post Staying in Doggerland.

My guess is that we will see an increasing demand for cross border data quality services, not least as regulators are increasingly looking into cross border issues. The FATCA regulation from the United States tax authorities is an example, as described in the post The Taxman: Data Quality’s Best Friend.

As globalization moves forward organizations will increasingly work across borders, and people will more frequently move between countries, live in one country, work in another and buy services in a third. In coping with this reality you can’t keep up with data quality by just using a National Change of Address service and other data quality services focused on and optimized for a single country.


instant Data Quality at Work

DONG Energy is one of the leading energy groups in Northern Europe with approximately 6,400 employees and EUR 7.6 billion in revenue in 2011.

The other day I sat down with Ole Andres, project manager at DONG Energy, and talked about how they have utilized a new tool called iDQ™ (instant Data Quality) in order to keep up with data quality around customer master data.

iDQ™ is basically a very advanced search engine capable of being integrated into business processes in order to get data quality for contact data right the first time and at the same time reduce the time needed for looking up and entering contact data.

Fit for multiple business processes

Customer master data is used within many different business processes. DONG Energy has successfully implemented iDQ™ within several business processes, namely:

  • Assigning new customers and ending old customers on installation addresses
  • Handling returned mail
  • Debt collection

Managing customer master data in the utility sector has many challenges, as there are different kinds of addresses to manage, such as installation addresses, billing addresses and correspondence addresses, as well as different approaches to private customers and business customers, including the grey zone between who is a private account and who is a business account.
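The different address roles can be sketched with a toy model like this, using hypothetical field names:

```python
# A toy model of the address roles mentioned above; field names are
# hypothetical, chosen only to illustrate the structure.
from dataclasses import dataclass

@dataclass
class UtilityCustomer:
    name: str
    is_business: bool                 # the private/business grey zone
    installation_address: str         # where the supply is delivered
    billing_address: str = ""         # defaults to the installation address
    correspondence_address: str = ""  # defaults to the billing address

    def __post_init__(self):
        if not self.billing_address:
            self.billing_address = self.installation_address
        if not self.correspondence_address:
            self.correspondence_address = self.billing_address
```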

New technology requires change management

Implementing new technology in a large organization doesn’t just happen by itself. Old routines tend to stick around for a while. DONG Energy has put a lot of energy, so to say, into training the staff in reengineering business processes around customer master data on-boarding and maintenance, including utilizing the capabilities of the iDQ™ tool.

Acceptance of new tools comes with building up trust in the benefits of doing things in a new way.

Benefits in upstream data quality 

A tool like iDQ™ helps a lot with safeguarding the quality of contact data where data is born and when something happens in the customer data lifecycle. A side effect, which is at least as important, stresses Ole Andres, is that data collection goes much faster.

Right now DONG Energy is looking into further utilizing the rich variety of reference data sources that can be found in the iDQ™ framework.


“Fitness for Use” is Dead

The definition of data quality as being “fitness for use” is challenged. “Real world alignment” or similar expressions are gaining traction.

Back in May Malcolm Chisholm made a tweet about the shortcomings of the “fitness for use” definition reported here on the blog in the post The Problem with Multiple Purposes of Use.

Last week the tweet was elaborated on in the Information Management article called Data Quality is Not Fitness for Use. Today Jim Harris has a follow-up post called Data and its Relationships with Quality.

When working with data quality in the domain with by far the most data quality issues, namely the quality of contact data (customer, supplier, employee and other party master data), I have many times experienced that making data fit for more than a single purpose of use is almost always about better real-world alignment. Having data that actually represents what it purports to represent always helps with making data fit for use, even for more than one purpose of use.

In practice, in the contact data realm, that for example means:

  • Getting a standardized address at contact data entry makes it possible for you to easily link to sources with geo codes, property information and other location data for multiple purposes.
  • Obtaining a company registration number or other legal entity identifier (LEI) at data entry makes it possible to enrich with a wealth of available data held in public and commercial sources making data fit for many use cases.
  • Having a person’s name spelled according to available sources for the country in question helps a lot with typical data quality issues such as uniqueness and consistency.
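As a toy illustration of the first bullet, here is a sketch of standardizing an address at entry so that two entries of the same real-world address share one linkage key; the standardization rules are far simpler than any real service:

```python
# Toy address standardization; real services use far richer rules and
# country-specific reference data.
ABBREVIATIONS = {"st.": "street", "rd.": "road", "ave.": "avenue"}

def standardize_address(raw):
    """Lower-case, drop commas and expand a few common abbreviations
    so equivalent entries yield the same linkage key."""
    words = raw.lower().replace(",", " ").split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)
```

Once two entries share a key like this, linking to sources with geo codes or property information becomes a simple lookup.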

Also, making data real-world aligned from the start is a big help when maintaining data, as the real world will change over time.

Data quality tools will, in my eyes, also have to adapt to this trend, as discussed with Gartner in the post Quality of Data behind the Data Quality Magic Quadrant.


instant Data Quality and Business Value

During the last couple of years I have been working with a cloud service called instant Data Quality (iDQ™).

iDQ™ is basically a very advanced search engine capable of being integrated into business processes in order to get data quality for contact data right the first time and at the same time reduce the time needed for looking up and entering contact data.

With iDQ™ you are able to look up what is known about a given address, company and individual person in external sources (I call these big reference data) and what is already known inside your internal master data.

Orchestrating the contact data entry and maintenance processes this way does create better data quality along with creating business value.

The testimonials from current iDQ™ clients tell that story.

DONG Energy, a leader in providing clean and reliable energy, says:

[Testimonial from DONG Energy]

From the oil and gas industry Kuwait Petroleum, a company with trust as a core value, adds in:

[Testimonial from Kuwait Petroleum (Q8)]

In the non-profit sector the DaneAge Association, an organization supporting and counselling older people to make informed decisions, also gets it:

[Testimonial from the DaneAge Association]

You may learn more about iDQ™ on the instant Data Quality site.
