Undertaking in MDM

In the post Last Time Right I touched on the bad consequences of not handling that one of your customers isn't among us anymore.

This sad event is a major trigger in party master data lifecycle management like The Relocation Event I described last week.

In the data quality realm, handling so-called deceased data has mostly been about suppression services in direct marketing. But as we develop more advanced master data services, handling the many aspects of the deceased event turns out to be an important capability.

Like with relocation you may learn about the sad event in several ways:

  • A message from relatives
  • Subscription to external reference data services, which will be different from country to country
  • Investigation upon returned mail via postal services

Apart from Business-to-Consumer (B2C) activities, the deceased event also has relevance in Business-to-Business (B2B), where we may call it the dissolved event.

One benefit of having central master data management functionality is that every party role and the related business processes can be notified about the status change, which may trigger a workflow.
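As a minimal sketch of that idea, assuming a simple in-process publish/subscribe setup (all names and handlers below are hypothetical, not a description of any particular product), a central hub could broadcast the status change to handlers for each party role, each of which may start its own workflow:

```python
# Minimal sketch: broadcasting a party status change from a central MDM hub
# to handlers for each party role. Names and handlers are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class PartyStatusEvent:
    party_id: str
    new_status: str   # e.g. "deceased" or "dissolved"
    source: str       # e.g. "relative", "reference data service", "returned mail"

class MdmHub:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[PartyStatusEvent], None]]] = {}

    def subscribe(self, role: str, handler: Callable[[PartyStatusEvent], None]) -> None:
        self._subscribers.setdefault(role, []).append(handler)

    def publish(self, event: PartyStatusEvent) -> None:
        # Notify every party role holding a relation to this party.
        for role, handlers in self._subscribers.items():
            for handler in handlers:
                handler(event)

hub = MdmHub()
hub.subscribe("customer", lambda e: print(f"Cancel subscriptions for {e.party_id}"))
hub.subscribe("debtor", lambda e: print(f"Pause dunning workflow for {e.party_id}"))
hub.publish(PartyStatusEvent("P-1001", "deceased", "reference data service"))
```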

One area where I have worked with handling this situation is public transit, where subscriptions for public transport are cancelled when a death becomes known, lifting some of the burden from relatives and avoiding refund processes in this situation.

Right now I’m working with data stewardship functionality in the instant Data Quality MDM Edition, where the relocation event, the deceased event and other important events in party master data lifecycle management must be supported by functionality embracing external reference data and internal master data.


Know Your Fan

A variant of the saying “Know Your Customer” for a football club will be “Know Your Fan” and indeed fans are customers when they buy tickets. If they can.


FC Copenhagen cruised into stormy waters when they apparently cancelled all purchases for the upcoming Champions League (European soccer's paramount club tournament) clashes against Real Madrid, Juventus and Galatasaray if the purchasers didn't have a Danish-sounding name. The reason was to prevent mixing fans of the different clubs, but surely this poorly thought-out screening method wasn't received well among the FC Copenhagen fans not called Jensen, Nielsen or Sørensen.

The story is told in English here on Times of India.

Actually, methods of verifying identities are available and cheap in Denmark, so I'm surprised to see FC Copenhagen caught offside in this situation.


Time To Turn Your Customer Master Data Management Social?

The title of a post on the Nimble blog asks this question: Time To Turn Your Sales Team Social? The post has a lot of evidence on why sales teams that embrace social selling are doing better than teams that don't.

We do see new applications supporting social selling, where Nimble is one example from the Customer Relationship Management (CRM) sphere, as explored in the post Sharing Social Master Data. Using social services and exploiting social data in sales-related business processes will over time affect the way we are doing customer master data management.

Apart from having social-aware frontend applications, we also need social-aware data integration services, and we do indeed need social-aware Master Data Management (MDM) solutions for handling data quality issues and ensuring a Single Customer View (SCV) stretching from the old systems of record to the new systems of engagement.

One service capable of doing data integration between the old world and the new world is FlipTop, and some months ago I was interviewed on the FlipTop blog about the links to Social MDM here. Currently I'm working with a social-aware Master Data Management solution, the iDQ™ MDM Edition.

What about you? Are your Customer Master Data Management and related data quality activities becoming social aware?


The Data Enrichment ABC

A popular and indeed valuable method of avoiding decay of data quality in customer master data and other master data entities is setting up data enrichment services based on third-party reference data sources. Examples of such services are:

  • Relocation updates like National Change Of Address services from postal services
  • Change of name, address and a variety of status updates from business directories and in some countries citizen directories too

When using such services you will typically want to consider the following options for how to deal with the updates:

A: Automatic Update

Here your internal master data will be updated automatically when a change is received from the external reference data source.

C: Excluded Update

Here an automated rule will exclude the update, as there may be a range of reasons why you don't want to update certain entity segments under certain circumstances.

B: Interactive Update

Here the update will require some form of manual intervention, either to be fulfilled or excluded based on a human decision.

An example would be a utility supplier receiving a relocation update for the occupier at an installation address. This will trigger and support a complex business process far beyond changing the billing address.
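A minimal sketch of how such routing could look, assuming a simple rules-based classifier (the segments and rules below are purely illustrative, not a description of any particular product):

```python
# Minimal sketch of routing an external reference data update into the
# A (automatic), B (interactive) or C (excluded) pot. Rules are hypothetical.
from dataclasses import dataclass

@dataclass
class ReferenceUpdate:
    entity_id: str
    attribute: str     # e.g. "postal_address"
    segment: str       # e.g. "consumer", "utility_installation"
    update_type: str   # e.g. "relocation", "name_change"

def classify_update(update: ReferenceUpdate) -> str:
    # C: excluded - segments/circumstances where the update is never applied.
    if update.segment == "do_not_update":
        return "C"
    # B: interactive - updates that trigger a wider business process,
    # such as a relocation at a utility installation address.
    if update.update_type == "relocation" and update.segment == "utility_installation":
        return "B"
    # A: automatic - everything else is applied straight to master data.
    return "A"

print(classify_update(ReferenceUpdate("E-1", "postal_address", "utility_installation", "relocation")))  # B
```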


As explained in the post When Computer Says Maybe, we need functionality within data quality tools and Master Data Management (MDM) solutions to support data stewards in cost-effectively handling these situations, and this certainly also applies to the B pot in data enrichment.

Right now I’m working with designing such data stewardship functionality within the instant Data Quality environment.


What’s so special about your party master data?

My last blog post was called Is Managing Master Data a Differentiating Capability? The post is an introduction to a conference session, a case story about managing master data at Philips.

During my years working with data quality and master data management it has always struck me how differently organizations manage the party master data domain, while in fact the issues are almost the same everywhere.

First of all, party master data describe real-world entities that are the same to everyone. Everyone is gathering data about the same individuals and the same companies at the same addresses and with the same digital identities. The real world also comes in hierarchies, such as households, company families and contacts belonging to companies, which are the same to everyone. We may call that the external hierarchy.

Based on that, everyone has some kind of demand for intended duplicates, as a given individual or company may have several accounts for specific purposes and roles. We may call that the internal hierarchy.

A party master data solution will optimally reflect the internal hierarchy, while most of the surrounding business processes are supported by CRM systems, ERP systems and special solutions for each industry.

Reflecting the external hierarchy will be the same task for everyone, and there is no need for anyone to reinvent the wheel here. There are already plenty of data models, data services and data sources out there.
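As a rough illustration of the two hierarchies, here is a deliberately simplified model (all class and field names below are illustrative, not a reference data model):

```python
# Sketch of the distinction between the external hierarchy (real-world
# parties, households, company families) and the internal hierarchy
# (several accounts intentionally kept for one real-world party).
from dataclasses import dataclass
from typing import Optional

@dataclass
class RealWorldParty:                 # external hierarchy node
    party_id: str
    name: str
    address: str
    parent_id: Optional[str] = None   # household or company family parent

@dataclass
class Account:                        # internal hierarchy node
    account_id: str
    party_id: str                     # link to the one real-world party
    purpose: str                      # e.g. "billing", "loyalty programme"

acme = RealWorldParty("P-1", "Acme Ltd", "1 Main Street")
accounts = [
    Account("A-1", "P-1", "billing"),
    Account("A-2", "P-1", "loyalty programme"),
]
# A single customer view groups the intended duplicates by party_id.
print([a.account_id for a in accounts if a.party_id == acme.party_id])
```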

Right now I’m working on a service called instant Data Quality that is capable of embracing and mashing up external reference data sources for addresses, properties, companies and individuals from all over the world.

The iDQ™ service already fits in at several places as told in the post instant Data Quality and Business Value. I bet it fits your party master data too.


Business in the Driver’s Seat for MDM

It has always been a paradox in Master Data Management (MDM), and many other IT enabled disciplines, that while most people agree that the business part of business should take the lead, often it is the IT part of business that is running the projects.

However, at Tetra Pak, a multi-national company of Swedish origin, MDM has been approached as a business problem rather than as an IT problem.

Yesterday I touched base with Program Manager Jesper Persson at Tetra Pak.

A main reason for Tetra Pak to focus on MDM was having a very specific business problem related to master data, not an IT problem. Taking it from there the business has been in the driver’s seat for the MDM journey.

Master data quality and related data quality dimensions are seen as triggers for the essential KPIs related to process performance. The model for getting this right starts with the business requirements, puts the needed data governance in place and gets on with managing master data, which leads to the actual master data maintenance, all as part of business process management.

Jesper will tell a lot more at the Master Data Management Summit Europe 2013 in London in the session Business in the Driver's Seat for MDM – Integrating MDM with BPM.



Tomorrow’s Data Quality Tool

In a blog post called JUDGEMENT DAY FOR DATA QUALITY, published yesterday, Forrester analyst Michele Goetz writes about the future of data quality tools.

Michele says:

“Data quality tools need to expand and support data management beyond the data warehouse, ETL, and point of capture cleansing.”

and continues:

“The real test will be how data quality tools can do what they do best regardless of the data management landscape.”

As described in the post Data Quality Tools Revealed there are two things data quality tools do better than other tools:

  • Data profiling and
  • Data matching

Some of the new challenges I have worked with when designing tomorrow's data quality tools are:

  • Point of capture profiling
  • Searching using data matching techniques
  • Embracing social networks

Point of capture profiling:

The sweet thing about profiling your data while you are entering your data is that analysis and cleansing become part of the on-boarding business process. The emphasis moves from correction to assistance, as explained in the post Avoiding Contact Data Entry Flaws. Exploiting big external reference data sources within point of capture is a core element in getting it right before judgment day.
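A minimal sketch of the idea, with the external reference source reduced to a tiny in-memory list (the lookup below is a stand-in for a real reference data service):

```python
# Sketch of point of capture assistance: profile the record while it is
# being entered and suggest the reference spelling instead of correcting
# it downstream. The reference lookup is a stand-in for a real service.
REFERENCE_STREETS = {"main street", "station road"}   # hypothetical reference data

def assist_address_entry(street: str) -> str:
    candidate = street.strip().lower()
    if candidate in REFERENCE_STREETS:
        return f"OK: '{street}' matches the reference source"
    return f"Assist: '{street}' not found - did you mean one of {sorted(REFERENCE_STREETS)}?"

print(assist_address_entry("Main Street"))
print(assist_address_entry("Maine Streat"))
```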

Searching using data matching techniques:

Error-tolerant searching is often the forgotten capability when core features of Master Data Management solutions and data quality tools are outlined. Applying error-tolerant search to big reference data sources is, as examined in the post The Big Search Opportunity, a necessity for getting it right before judgment day.
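A minimal sketch of error-tolerant searching, here using simple similarity scoring from the Python standard library against a toy reference list (real solutions would of course use indexed match codes and far bigger sources):

```python
# Sketch of error tolerant search against a reference data source,
# here reduced to a small in-memory list and difflib similarity scoring.
from difflib import SequenceMatcher

REFERENCE_COMPANIES = ["Tetra Pak", "FC Copenhagen", "Acme Corporation"]

def fuzzy_search(query: str, min_score: float = 0.6):
    scored = [
        (name, SequenceMatcher(None, query.lower(), name.lower()).ratio())
        for name in REFERENCE_COMPANIES
    ]
    return sorted([s for s in scored if s[1] >= min_score], key=lambda s: -s[1])

print(fuzzy_search("Tetre Pack"))   # still finds "Tetra Pak"
```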

Embracing social networks:

The growth of social networks during recent years has been almost unbelievable. Traditionally, data matching has been about comparing names and addresses. As told in the post Addressing Digital Identity, it will be a must to be able to link the new systems of engagement with the old systems of record in order to get it right before judgment day.
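A rough sketch of what that linking could look like, assuming match keys built from both traditional and digital identity attributes (the attribute names are illustrative):

```python
# Sketch of extending a traditional name-and-address match key with
# digital identity attributes from systems of engagement.
def match_keys(record: dict) -> set:
    keys = set()
    if record.get("name") and record.get("postal_address"):
        keys.add(("name+address", record["name"].lower(), record["postal_address"].lower()))
    if record.get("email"):
        keys.add(("email", record["email"].lower()))
    if record.get("twitter_handle"):
        keys.add(("handle", record["twitter_handle"].lower()))
    return keys

system_of_record = {"name": "John Doe", "postal_address": "1 Main Street", "email": "john@example.com"}
system_of_engagement = {"name": "J. Doe", "email": "John@Example.com", "twitter_handle": "@johndoe"}
# Name and address alone would not link the records; the shared email does:
print(match_keys(system_of_record) & match_keys(system_of_engagement))
```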

How have you prepared for judgment day?


The Real Estate Domain

In the comments on the recent blog post about multidomain MDM (Master Data Management) it was discussed to what degree multidomain MDM is much more than CDI (Customer Data Integration) and PIM (Product Information Management).

While customer (or rather party) and product are important master entity types, there are of course a lot of other master entity types. The location domain is often mentioned as the third domain in MDM, and then there are some entity types most relevant for specific industries like an insurance policy or a vehicle in public transit, and in public transit we also have the calendar as an important master entity type.

One of the entity types that doesn't belong to the party domain, and in many ways is a different thing than a product, is real estate (or real property, or just property if you like).

For a realtor a real estate property looks like a product, of course. And it's all about location, location, location.

Right now I’m working with the instant Data Quality framework. Here we are embracing the party domain by having access to external reference sources about individuals and companies, we are embracing the location domain by having access to external reference sources about addresses and then we are also embracing the real estate domain by having access to external reference sources about properties.

Real properties have addresses in many cases and are therefore close to the location domain. For some business processes a property is a product with a product key, as mentioned for realtors. For some business processes it is a security, often identified by other keys than the postal address. It is related to different party roles like an occupier (or several) and an owner (or several), who may or may not be the same party (or parties).
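A rough sketch of such a real estate entity and its relations to the location and party domains, with purely illustrative keys and field names:

```python
# Sketch of a real estate (property) entity related to the location and
# party domains: a postal address, an alternative key, and owner/occupier roles.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RealProperty:
    cadastral_key: str                 # identifier other than the postal address
    postal_address: Optional[str]      # link to the location domain, when one exists
    owner_party_ids: List[str] = field(default_factory=list)
    occupier_party_ids: List[str] = field(default_factory=list)

home = RealProperty(
    cadastral_key="DK-0101-42a",
    postal_address="1 Main Street, Copenhagen",
    owner_party_ids=["P-1"],
    occupier_party_ids=["P-1", "P-2"],   # owners and occupiers may or may not overlap
)
print(home.cadastral_key, home.owner_party_ids, home.occupier_party_ids)
```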

What about you? Do you feel at home with the real estate entity type?


MDM Summit Europe 2013 Wordle

The Master Data Management Summit Europe 2013, co-located with the Data Governance Conference Europe 2013, takes place in London from the 15th to the 17th of April.

Here is a wordle with the session topics:

MDMDG 2013 wordle

Some of the words catching my eyes are:

Global is part of several headlines. There is no doubt that governing master data on a global scale is a very timely subject. Handling master data in a domestic context can be hard enough, but enterprises are facing a daunting task when embracing party master data, product master data and location master data covering the diversity of languages, script systems, measuring systems, national standards and regulatory requirements. However, there is no way around the challenges when synergies in global enterprises are to be harvested.

RDM (Reference Data Management) is becoming a popular subject as well. Being successful with governing master data requires a steady hand with the reference data layer that sits on top of the master data. Some reference data sets may be small, but the importance of getting them right must not be underestimated.

Business. Oh yes. All the data stuff is there to enable business processes, drive business transformation and create business opportunities.


While we are waiting for the LEI

As told in the post Business Entity Identifiers, a new global numbering system for business entities has been on the way for some time. The wonder is called the LEI (Legal Entity Identifier).

The implementation work has been adopted by the Financial Stability Board. The latest developments are reported in a publication called Fifth progress note on the Global LEI Initiative.

Surely, while the implementation may be in good hands, the set-up doesn't give hope for a speedy process where every legal entity in the world will have a LEI within a short time.

And then the next question will be how long it will take before organizations have enriched their existing databases with the LEI and implemented on-boarding processes where a LEI is captured with every new insertion of party master data describing a legal entity.

A good way to start being prepared is to implement features in on-boarding business processes where available external reference data are captured when new party entities are added to your databases. Having the best available information about names, addresses and business entity identifiers captured today, and a culture of capturing such information, will be a great starting point.
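A minimal sketch of that kind of on-boarding capture, assuming a stubbed external reference lookup and leaving room for the LEI once it becomes available (names and fields are illustrative):

```python
# Sketch of capturing best available external reference data at on-boarding,
# leaving room for the LEI once it can be looked up. The lookup is a stub.
def lookup_external_reference(name: str) -> dict:
    # Stand-in for a call to business directories / registries.
    return {"registered_name": name.title(), "national_reg_no": "12345678", "lei": None}

def onboard_legal_entity(entered_name: str) -> dict:
    reference = lookup_external_reference(entered_name)
    return {
        "name": reference["registered_name"],
        "national_reg_no": reference["national_reg_no"],
        "lei": reference["lei"],        # populated later, when the entity has a LEI
    }

print(onboard_legal_entity("acme ltd"))
```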

And oh, the instant Data Quality concept is precisely all about doing that.
