Rapid and Vanity Addressing – and the Apple Hotel

Mid next month iDQ will move our London office to a new address:

iDQ A/S
2nd Floor
Berkeley Square House
Berkeley Square
London
W1J 6BD
United Kingdom

It’s a good old English address with a lot of lines on the envelope.

The address could be either shorter or longer.

The address below will in fact be enough to have a letter delivered:

iDQ A/S
2nd Floor
W1J 6BD
UK

Due to the granular UK postal code system, a single postcode may be a single address, a part of a long road, or a small street.

This structure is also what is exploited in what is called rapid addressing, where you only type in the needed data and the rest is supplied by a (typically cloud) service.
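Rapid addressing can be sketched in a few lines: the user types only a postcode, and the service supplies the remaining address lines. The lookup table and field names below are purely illustrative assumptions, not the actual iDQ or Royal Mail data model.

```python
# Minimal sketch of rapid addressing. A granular UK postcode may resolve to a
# single building or to several premises; the user picks from the candidates.
# POSTCODE_INDEX is a hypothetical in-memory stand-in for a cloud lookup service.

POSTCODE_INDEX = {
    "W1J 6BD": [
        {"building": "Berkeley Square House", "street": "Berkeley Square", "town": "London"},
    ],
    "SW1A 2AA": [
        {"building": "10 Downing Street", "street": "Downing Street", "town": "London"},
    ],
}

def rapid_address(postcode: str) -> list[dict]:
    """Return the candidate addresses behind a postcode, normalizing the input."""
    return POSTCODE_INDEX.get(postcode.strip().upper(), [])

candidates = rapid_address("w1j 6bd")
print(candidates[0]["building"])  # Berkeley Square House
```

In a real service the index would of course sit behind an API rather than in memory, but the user experience is the same: a few keystrokes in, a full postal address out.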

But sometimes people want their address presented in a different way than the official one. Maybe I want our address to be:

iDQ A/S
2nd Floor
Berkeley Square House
Berkeley Square
Mayfair
London
W1J 6BD
United Kingdom

Mayfair is a nice part of London. Insisting on including this element in the address is an example of vanity addressing.

Here’s the map of the area:

Notice the place in the upper right corner of the Google Map: Apple Store Regent Street, shown with an icon with a bed. This means it’s a hotel. Is the Apple Store really a hotel? No – except for a while ago, when people slept in front of the store waiting for a product with a notable map service, as reported by Richard Northwood (aka The Data Geek) in the post Data Quality Failure – Apple Style.

Well Google, you can’t win them all.

Hierarchical Single Source of Truth

Most data quality and master data management gurus, experts and practitioners agree that a “single source of truth” is a nice term, but not what data quality and master data management are really about, as expressed by Michele Goetz in the post Master Data Management Does Not Equal The Single Source Of Truth.

Even among those people, including me, who think an emphasis on real world alignment could help in getting better data and information quality, as opposed to focusing on fitness for multiple different purposes of use, there is acknowledgement that there is a “digital distance” between real world aligned data and the real world, as explained by Jim Harris in the post Plato’s Data. Also, different publicly available reference data sources that should reflect the real world for the same entity are often in disagreement.

When working with improving data quality in party master data, which is the master data domain most frequently plagued with issues, you encounter the same challenges over and over again, like:

  • Many organizations have a considerable overlap of real world entities that are a customer and a supplier at the same time. Expanding to other party roles, this intersection is even bigger. This calls for a 360° Business Partner View.
  • Most organizations divide activities into business-to-business (B2B) and business-to-consumer (B2C). But the great majority of businesses are small companies where business and private matters are mixed, as told in the post So, how about SOHO homes.
  • When doing B2C, including membership administration in non-profits, you often have a mix of single individuals and households in your core customer database, as reported in the post Household Householding.
  • As examined in the post Happy Uniqueness, there are a lot of good fit-for-purpose reasons why customer and other party master data entities are deliberately duplicated within different applications.
  • Lately, doing social master data management (Social MDM) has emerged as the new leg in mastering data within multi-channel business. Embracing a wealth of digital identities will become yet another challenge in getting a single customer view and reaching for the impossible, and not always desirable, single source of truth.

A way of getting some structure into this possible, and actually very common, mess is to strive for a hierarchical single source of truth, where the concept of a golden record is implemented as a model with golden relations between real world aligned external reference data and internal fit-for-purpose master data.
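The idea above can be sketched as a small data model: instead of collapsing everything into one merged golden record, each internal, fit-for-purpose record keeps a golden relation to a shared, real-world-aligned reference entity. The class names and fields are illustrative assumptions, not any specific product’s model.

```python
# Sketch of a hierarchical single source of truth: internal records stay
# fit for their own purpose, but all point to the same external reference
# entity via a "golden relation".

from dataclasses import dataclass

@dataclass
class ReferenceEntity:
    """Real-world-aligned entity from an external source, e.g. a business registry."""
    registry_id: str
    legal_name: str

@dataclass
class InternalRecord:
    """Fit-for-purpose master data record living in one application."""
    app: str
    local_id: str
    name_as_used: str
    golden_relation: ReferenceEntity  # the link that holds the hierarchy together

# One real-world company, deliberately duplicated across applications:
company = ReferenceEntity("DK-12345678", "iDQ A/S")
crm_row = InternalRecord("CRM", "C-001", "iDQ (UK office)", company)
erp_row = InternalRecord("ERP", "S-987", "iDQ A/S – supplier", company)

# A 360° business partner view is then a simple grouping on the golden relation:
assert crm_row.golden_relation is erp_row.golden_relation
```

The design choice is that deduplication happens at the relation level, not by physically merging records, so each application keeps the representation that fits its purpose.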

Right now I’m having an exciting time doing just that as described in the post Doing MDM in the Cloud.

The Three Big M’s of Data Quality

Most organizations have a lot of data quality issues, and there is a wealth of possible solutions for dealing with these challenges.

What you usually do is categorize the problems into three different types of best resolutions:

Mañana

You could go ahead with solving the data quality problems today, but you probably have better and more important things to do right now.

Your organization may have a global SAP rollout going on or other resource demanding implementations. Therefore it is most wise to deal with the data quality issues when everything is running smoothly.

Mission impossible

Maybe a resolution has been tried before and didn’t work. The chances that alternative people management, different orchestration of processes and developments in available technology will change that are very slim.

May the force be with you

Many problems solve themselves over time or hopefully don’t get noticed by anyone. If things get ugly you always have your lightsaber.

Doing MDM in the Cloud

As reported in the post What to do in 2012, doing Master Data Management (MDM) in the cloud is one of three trends within MDM that, according to Gartner (the analyst firm), will shape the MDM market in the coming years.

Doing MDM in the cloud is an obvious choice if all your operational applications are in the cloud already. Such a solution was presented on Informatica Perspectives in the blog post Power the Social Enterprise with a Complete Customer View. The post includes a video in which the situation with multiple instances of Salesforce.com solutions within the same enterprise is supported by a master data backbone in the cloud.

But even if all your operational applications are on premise, you may start by lifting some master data management functionality up into the cloud. I am currently working with such a solution.

When onboarding customer (and other party) master data, much of the basic information needed is already known in the cloud. Therefore lifting the onboarding functionality up into the cloud makes a lot of sense. This is the premise, so to speak, for the MDM edition of the instant Data Quality (iDQ) solution that we are working on these days.

Cloud services for the other prominent MDM domain, product master data, also make a lot of sense. As told in the post Social PIM, a lot of basic product master data may be shared in the cloud, embracing the supply chain of manufacturers, distributors, retailers and end users.

In both these cases some of the master data management functionality is handled in the cloud, while the data integration stuff takes place where the operational applications reside, be that in the cloud and/or on premise.

Free and Open Public Sector Master Data

Yesterday the Danish Ministry of Finance announced an agreement between local authorities and the central government to improve and link public registers of basic data and to make data available to the private sector.

Once the public authorities have tidied up, merged the data and put a stop to parallel registration, annual savings in public administration could amount to 35 million EUR in 2020.

Basic open data includes private addresses, companies’ business registration numbers, cadastral numbers of real properties and more. These master data are used for multiple purposes by public sector bodies.

Private companies and other organizations can look forward to large savings when they no longer have to buy their basic data from the public authorities.

In my eyes this is a very clever move by the authorities exactly because of the two main opportunities mentioned:

  • The public sector will see savings and related synergies from a centralized master data management approach.
  • The private sector will gain a competitive advantage from better and affordable reference data accessibility and thereby achieve better master data quality.

Denmark has, along with the other Nordic countries, always had a more mature public sector master data approach than we see in most other countries around the world.

I remember working with the committee that prepared a single registry for companies in Denmark back in the ’80s, as mentioned in the post Single Company View.

Today I work with a solution called iDQ (instant Data Quality), which is about mashing up internal master data with a range of external reference data from social networks and, not least, public sector sources. In that realm there is certainly not something rotten in Denmark. Rather, there is a good answer to the question of whether to be free and open or not to be.

Data that is not aligned with the real world usually provides bad information

The shortcomings of data being fit for some purpose of use, compared to data that is aligned with the real world, are a recurring topic on this blog, latest in the post “Fitness for Use” is Dead.

Today I had a reminder of that when waiting for baggage at Copenhagen Airport.

There is an information screen telling you when your baggage will start rolling in. What actually seems to happen is that a fixed time is assigned to every flight, and then the screen counts down the minutes. Most baggage then starts rolling in (and this is shown on the screen) before zero minutes is reached. If it, as with my flight, happens that zero minutes is reached without delivery, the information screen shows that the baggage from this flight is delayed – but not by how long.

So, the information provided is when you could expect your baggage, probably according to some service level goal. OK, fit for that purpose. But in fact that doesn’t help you as a passenger a lot, and doesn’t help at all when that goal isn’t reached.

End of rant.

Customer Management, Data Quality and MDM

Today I am visiting the Call Centre and Customer Management Expo 2012 in London and have a chance to learn about what’s going on in this area – and what happens to data quality and master data management.

Postcode Anywhere

At the Postcode Anywhere stand the talk is about data quality. Postcode Anywhere has become a well known vendor of services for validating addresses in the United Kingdom, based on the unique structure of the UK postal code and addressing system. I had a chat with Marketing Executive Ed Nash about the challenges of delivering similar services for all the other countries on the planet with their particular ways of addressing.

Phone Number Testing

Peter Muswell of “The Phone Number Testing Company” describes his company as the best kept secret in customer management. Indeed, I hadn’t heard of this service before. The trick is a service for testing whether a phone number is alive or not – notably without making any ghost calls. The service works in the UK. It works in some other countries, and it doesn’t work in others. Just like most other data quality services.

Social Customer Service

The Salesforce.com stand is all about Social Customer Service. There is plenty of functionality offered for getting social with CRM (Customer Relationship Management). The tricky part, as confirmed by the Salesforce.com representative, is to manage customer master data embracing all the traditional data, such as addresses and phone numbers, and the new keys to social data: social network profile identifiers. Sure, there will be a huge demand for Social Master Data Management (Social MDM).

Data Quality along the Timeline

When working with data quality improvement it is crucial to be able to monitor how your various ways of getting better data quality are actually working. Are things improving? Which measures are improving, and how fast? Are there things going in the wrong direction?

Recently I had a demonstration by Kasper Sørensen, the founder of the open source data quality tool called DataCleaner. The new version 3.0 of the tool has comprehensive support of monitoring how data quality key performance indicators develop over time.

What you do is take classic data quality assessment features, such as data profiling measurements of completeness and duplication counting. The results from periodically executing these features are then attached to a timeline. You can then visually assess what is improving, at what speed, and whether anything is not developing so well.
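The mechanism can be sketched in a few lines: run the same profiling measurements periodically and attach each result to a dated point on a timeline. This is an illustration of the concept only, assuming hypothetical field names, not DataCleaner’s actual API.

```python
# Sketch of attaching periodic data profiling measurements to a timeline.

from collections import Counter
from datetime import date

def profile(records: list[dict]) -> dict:
    """Classic assessment features: completeness of 'email' and duplicated names."""
    total = len(records)
    filled = sum(1 for r in records if r.get("email"))
    name_counts = Counter(r["name"] for r in records)
    duplicates = sum(count - 1 for count in name_counts.values() if count > 1)
    return {"completeness": filled / total, "duplicates": duplicates}

timeline: dict[date, dict] = {}

# Each periodic execution adds a new point; plotting the series shows the trend.
timeline[date(2012, 10, 1)] = profile([
    {"name": "Ann", "email": ""}, {"name": "Ann", "email": "a@example.dk"},
])
timeline[date(2012, 11, 1)] = profile([
    {"name": "Ann", "email": "a@example.dk"}, {"name": "Bo", "email": "bo@example.dk"},
])

print(timeline[date(2012, 10, 1)])  # {'completeness': 0.5, 'duplicates': 1}
print(timeline[date(2012, 11, 1)])  # {'completeness': 1.0, 'duplicates': 0}
```

Comparing the two points shows exactly the kind of visual trend a monitoring dashboard would chart: completeness up, duplicates down.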

Continuously monitoring how data quality key performance indicators are developing is especially interesting in relation to getting data quality right the first time and following up with ongoing data maintenance through enrichment from external sources.

In a traditional downstream data cleansing project you will typically measure completeness and uniqueness twice: once before and once after the execution.

With upstream data quality prevention and automatic ongoing data maintenance you have to make sure everything is running well all the time. Having a timeline of data quality key performance indicators is a great feature for doing just that.

Hotel Rating Data Quality

Whether you are traveling for business or pleasure, you like to stay in a hotel that suits your expectations.

What is good and what is bad differs between individuals. But we may all belong to some stereotype depending on where in the world we are from. For example, if I walk into an even modestly rated American-managed hotel anywhere in the world, I am pretty sure that there will be a bed much larger than I actually need. In a locally managed hotel I’m not so sure.

The most commonly used hotel rating methodologies are one-to-five-star rating systems. However, the classification criteria are not universal. They differ from country to country. Some countries have a publicly regulated system, in some countries the industry sets the standards, and in some countries there are competing systems.

So, I can’t be sure that three stars in one country mean the same as three stars in another country. One of my foremost personal requirements is that there is WiFi available. In the Swiss criteria that counts for only 2 out of 863 possible points, so I couldn’t be sure even with a five-star hotel. Using the English criteria, I would have to go for a four-star hotel to be sure.

Besides official ratings, social ratings have become more and more popular. Typically guests rate the hotels on the portal where they booked, using a scale from 1 to 10, and may add verbal descriptions of the appealing things and, even more popular, the appalling things.

Killing Keystrokes

Keystrokes are evil. Every keystroke represents a potential root cause of poor data quality: spelling things wrongly, putting the right thing in the wrong place, putting the wrong thing in the right place and so on. Besides that, every keystroke is a cost of work, summing up with all the other keystrokes to gigantic amounts of work costs.

In master data management (MDM) you will be able to get things right, and reduce working costs, by killing keystrokes wherever possible.

Killing keystrokes in Product Information Management (PIM)

I have seen my share of current business processes where product master data are re-entered, or copied and pasted from different sources: extracted from one product master data container and, often via spreadsheets, captured into another product master data container.

This happens inside organizations, and it happens in the ecosystem of business partners in supply chains encompassing manufacturers, distributors and retailers.

As touched upon in the post Social PIM, there might be light at the end of the tunnel with the rise of tools, services and platforms setting up collaboration possibilities for sharing product master data and thus avoiding those evil keystrokes.

Killing keystrokes in Party Master Data Management

With party master data there are good possibilities for exploiting external data from big reference data sources and thus avoiding the evil keystrokes. The post instant Data Quality at Work tells how a large utility company has gained better data quality, and reduced working costs, by using the iDQ™ service in that way within customer onboarding and other business processes related to customer master data maintenance.

The next big thing in this area will be the customer data integration (CDI) part of what I call Social MDM, where you may avoid the evil keystrokes by utilizing the keystrokes already made in social networks by those the master data is about.
