Royal Data Quality

The intersection of royalty and data quality was touched upon on this blog five years ago in the post Royal Exceptions, when the Queen of Denmark turned 70 years old. Now that Her Majesty has just rounded the 75-year mark, it is time to revisit the subject.

As always when a Royal event comes around, the debate on the raison d'être of a Royal House stirs up. Putting the historical and emotional arguments aside, let us have a look at the economic arguments.

In Denmark there are two main arguments in favor of having a Royal House:

  • Having a president instead of a Royal House will cost the same anyway
  • The costs of the Royal House are less than the gains in brand value when exporting goods and services

Cost of having a president versus a Royal House

The idea of an expensive presidency is probably founded in looking at the amount of money countries like the United States and France put into the safety and glory of their presidencies.

On the other hand, countries may make their own choices about the level of costs for a presidency. If you look at countries like Ireland and Finland, which have populations of similar size to Denmark's, their costs for the presidency are only a fraction of the costs of the Danish Royal House.

Brand Value of the Royal House

Even high-ranking executives in large Danish companies often make the argument that a high brand value is attached to the Royal House. However, I doubt they have checked with their own business intelligence department.

In fact, not a single publicly available study has been made on the matter, and I doubt any business school researcher will risk their career on doing one.

There was a comic situation some years ago when it was touted that there was a correlation between Denmark getting a crown princess from Australia and a sharp rise in Danish exports to Australia. It was called the Mary effect. Sadly, for royalists at least, a sense check revealed that Norway and Sweden saw the same development without importing a crown princess from Australia.
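The sense check itself is simple arithmetic. Here is a minimal sketch, with made-up illustration figures (not actual trade statistics), of comparing export growth across the three countries before crediting any Mary effect:

```python
# Made-up illustration figures - not actual trade statistics.
exports_to_australia = {
    #           (before, after) in arbitrary index values
    "Denmark": (100, 130),
    "Norway":  (100, 128),
    "Sweden":  (100, 131),
}

for country, (before, after) in exports_to_australia.items():
    growth = (after - before) / before * 100
    print(f"{country}: {growth:.0f}% export growth")

# If all three countries grew at about the same rate, a shared regional
# trend is a more plausible explanation than the crown princess.
```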

Conclusion

I hope the above examples are Royal Exceptions and that most other decisions out there are based on carefully considered facts.


CDI, PIM, MDM and Beyond

The TLAs (Three Letter Acronyms) in the title of this blog post stand for:

  • Customer Data Integration
  • Product Information Management
  • Master Data Management

CDI and PIM are commonly seen as predecessors to MDM. For example, the MDM Institute was originally called The Customer Data Integration Institute and still has this website: http://www.tcdii.com/.

Today Multi-Domain MDM is about managing customer, or rather party, master data together with product master data and other master data domains, as visualized in the post A Master Data Mind Map. Some of the most frequent other master data domains are location master data and asset master data; the latter was explored in the post Where is the Asset? A less frequent master data domain is The Calendar MDM Domain.
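As a rough illustration of what multi-domain means in practice, the sketch below models one minimal record per domain mentioned above. The field names are my own illustrative assumptions, not any particular MDM product's data model:

```python
from dataclasses import dataclass

@dataclass
class Party:              # customers, suppliers, employees, ...
    party_id: str
    name: str
    role: str             # e.g. "customer" or "supplier"

@dataclass
class Product:
    product_id: str
    description: str

@dataclass
class Location:
    location_id: str
    address: str

@dataclass
class Asset:
    asset_id: str
    description: str
    location_id: str      # assets link to locations ("Where is the Asset?")

# A multi-domain MDM hub manages and relates all of these together,
# rather than mastering each domain in its own silo.
```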

You may argue that PIM (Product Information Management) is not the same as Product MDM. This question was examined in the post PIM, Product MDM and Multi-Domain MDM. In my eyes the benefits of keeping PIM as part of Multi-Domain MDM are bigger than the benefits of separating PIM and MDM. It is about expanding MDM across the sell-side and the buy-side of the business, eventually enabling wide use of customer self-service and supplier self-service.

The external self-service theme will in my eyes be at the centre of where MDM is going in the future. In going down that path there will be consequences for how we see data governance, as discussed in the post Data Governance in the Self-Service Age. Another aspect of how MDM is going to be seen from the outside in is the increased use of third-party reference data and the link between big data and MDM, as touched upon in the post Adding 180 Degrees to MDM.

Besides Multi-Domain MDM and the links between MDM and big data, a much mentioned future trend in MDM is doing MDM in the cloud. The latter is in my eyes a natural consequence of the external self-service theme and the increased use of third-party reference data, which, together with the general benefits of the SaaS (Software as a Service) and DaaS (Data as a Service) concepts, will make MDM morph into something like MDaaS (Master Data as a Service) – an at least nearly ten-year-old idea, by the way, as seen in this BeyeNetwork article by Dan E Linstedt.
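What MDaaS consumption could look like is sketched below. The endpoint, parameters and response shape are purely hypothetical, just to show master data being fetched as a service rather than mastered in an on-premise hub:

```python
import requests

# Hypothetical MDaaS endpoint - illustrative only, not a real provider's API.
MDAAS_URL = "https://mdaas.example.com/api/v1/parties"

response = requests.get(
    MDAAS_URL,
    params={"name": "Example Corp", "country": "DK"},
    timeout=10,
)
response.raise_for_status()

# Assumed response shape: {"results": [{"party_id": ..., "name": ...}, ...]}
for party in response.json().get("results", []):
    print(party["party_id"], party["name"])
```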


IDQ vs iDQ™

The previous post on this blog was called Informatica without Data Quality? That post dug into the messaging around the recent takeover of Informatica and the future of the data quality components in the Informatica toolbox.

In the comments, Julien Peltier and Richard Branch discuss the cloud emphasis in the messaging from the new Informatica owners and especially the future of Master Data Management (MDM) in the cloud.

My best experience with MDM in the cloud is with a service called iDQ™ – a service that, by the way, shares its TLA (Three Letter Acronym) with Informatica Data Quality. The former stands for instant Data Quality. This is a service that revolves around turning your MDM inside-out, as most recently touched upon on this blog in the post The Pros and Cons of MDM 3.0.

iDQ™ specifically deals with customer (or rather party) master data, how to get this kind of master data right the first time, and how to avoid duplicates, as explored in the post The Good, Better and Best Way of Avoiding Duplicates.
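The "right the first time" idea boils down to searching for likely matches before creating a new record. Below is a minimal sketch, using the Python standard library's difflib as a crude stand-in for a real matching engine; the similarity threshold and sample records are arbitrary illustration values:

```python
import difflib

# A handful of existing party master data records (illustrative).
existing_parties = {
    "P-001": "Marie Jensen",
    "P-002": "Acme Trading ApS",
}

def find_likely_duplicates(new_name: str, threshold: float = 0.85):
    """Return existing parties whose names are suspiciously similar."""
    hits = []
    for party_id, name in existing_parties.items():
        score = difflib.SequenceMatcher(None, new_name.lower(), name.lower()).ratio()
        if score >= threshold:
            hits.append((party_id, name, round(score, 2)))
    return hits

# Check before insert: onboarding "Maria Jensen" should surface P-001
# as a likely duplicate instead of silently creating a new record.
print(find_likely_duplicates("Maria Jensen"))   # [('P-001', 'Marie Jensen', 0.92)]
```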


Informatica without Data Quality?

This week it was announced that Informatica, a large data management tool provider, will be taken over by a London-based private equity firm and a Canadian pension investment management organization.

The first analyst reactions and a line-up of the potential benefits and drawbacks can be found on SearchCIO in an article called Informatica going private could be a good thing for CIOs.

Most quotes in this article are from Ted Friedman, the Gartner analyst who writes the data quality tool Magic Quadrant. Friedman notes that the new owners don't mention data quality as one of the goodies in the Informatica toolbox (as opposed to data security, an area Informatica is not well known for).

So, maybe the new owners just don't know yet what they have bought, or they have a clear vision for the data management market in which data quality is just a natural part of cloud integration, master data management, data integration for next-generation analytics, and data security. The alternative routes could be decommissioning or a split-off, both familiar routes for this kind of takeover.

Splitting off the data quality components should not be too hard, as some of these components came to Informatica through the acquisitions of Similarity Systems from Ireland and Identity Systems, which was once SSA with roots in Australia. I was actually a bit surprised, when watching an Informatica presentation in London last autumn, that the data quality part was the good old SSA Name3 service.
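For readers who have not met key-based name matching: the general idea is to reduce names to search keys so that similar-sounding names become match candidates. The classic Soundex code below is only a loose illustration of that idea; SSA Name3 uses its own, far more sophisticated keying, which is not reproduced here:

```python
def soundex(name: str) -> str:
    """Classic Soundex: a simple phonetic search key."""
    codes = {
        **dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
        **dict.fromkeys("dt", "3"), "l": "4",
        **dict.fromkeys("mn", "5"), "r": "6",
    }
    name = "".join(c for c in name.lower() if c.isalpha())
    if not name:
        return ""
    prev = codes.get(name[0], "")
    digits = []
    for c in name[1:]:
        code = codes.get(c, "")
        if code and code != prev:   # skip vowels and repeated codes
            digits.append(code)
        prev = code
    return (name[0].upper() + "".join(digits) + "000")[:4]

# Names that sound alike share a key and become match candidates:
print(soundex("Smith"), soundex("Smyth"))   # S530 S530
```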


No plan of operations extends with any certainty beyond the first contact with the full load of data

There is a famous saying from the military world: “No plan survives contact with the enemy.” At least one blogger has used the paraphrase: “No plan survives contact with the data.” A good read, by the way.


Helmuth von Moltke the Elder

Like most famous sayings, this phrase is simplified from the original version. The military observation made by Helmuth von Moltke the Elder is, in full length: “No plan of operations extends with any certainty beyond the first contact with the main hostile force.”

Translating the full military lesson into data management makes a lot of sense too. You may plan data management activities using selected examples, and you may test them using nice little samples – like skirmishes before the real battle in warfare. But when your data management solution goes live on the full load of data for the first time, there is most often news for you.

From my data matching days I remember this clearly as explained in the post Seeing is Believing.

The mitigation is to test with a full load of data before going live. In data management we actually have a realistic way of overcoming the observation made by Field Marshal Helmuth Karl Bernhard Graf von Moltke and revisiting our plan of operations before the second and serious contact with the full load of data.
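A minimal sketch of why the sample skirmish lies: profile a small sample and the full load side by side. The data below is synthetic, with a few quirky records planted the way real production data tends to hide them:

```python
import random

# Synthetic full load: mostly well-formed phone numbers, plus a few
# quirks of the kind that real production data always hides.
full_load = [f"+45 {random.randint(10_000_000, 99_999_999)}" for _ in range(100_000)]
full_load += ["N/A", "", "0000000000"]

sample = random.sample(full_load, 100)

def profile(records):
    digits_only = lambda r: r.replace("+", "").replace(" ", "")
    return {
        "rows": len(records),
        "blank": sum(1 for r in records if not r.strip()),
        "non_numeric": sum(1 for r in records if not digits_only(r).isdigit()),
    }

print("sample   :", profile(sample))      # quirks almost certainly absent
print("full load:", profile(full_load))   # quirks guaranteed to show up
```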


Business Agility, Continuous Improvement and MDM

Being able to react to market changes in an agile way is the path to the survival of your business today. As you may not nail it on the first go, the ability to correct with continuous improvement is what keeps your business alive.

Doing business process improvement most often involves master data, as examined in the post Master Data and Business Processes. The people side of this is challenging. The technology side isn't a walkover either.

When looking at Master Data Management (MDM) platforms in sales presentations, it seems very easy to configure a new way of orchestrating a business process: you just drag and drop some states and transitions in a visual workflow manager. In reality, even when looking solely at the technical side, it is much more painful.
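What the sales demo shows is essentially a state machine. A minimal sketch, with illustrative state and action names, makes it clear why the diagram itself is the easy part:

```python
# States and allowed transitions for a master data record - the kind of
# thing the visual workflow managers let you drag and drop.
WORKFLOW = {
    "draft":            {"submit": "pending_approval"},
    "pending_approval": {"approve": "approved", "reject": "draft"},
    "approved":         {"retire": "retired"},
    "retired":          {},
}

def transition(state: str, action: str) -> str:
    try:
        return WORKFLOW[state][action]
    except KeyError:
        raise ValueError(f"cannot '{action}' from state '{state}'") from None

state = "draft"
for action in ("submit", "approve"):
    state = transition(state, action)
print(state)   # approved
```

Redrawing this diagram takes minutes; migrating the records, interfaces and integrations that depend on the old states is where the pain described below comes in.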

MDM solutions can be hard to maneuver. You have to consider existing data and the data models where the data sits. Master data is typically used through various interfaces across many business functions and business units. There are usually many system integrations running around the MDM component in an IT landscape.

A successful MDM implementation does not just cure some pain points in business processes. The solution must also be maneuverable enough to support business agility and continuous improvement. Some of the data quality and data governance aspects of this are explored in the post Be Prepared.


The Data Matching Institute is Here

Within data management we already have “The MDM Institute”, “The Data Governance Institute” and “The Data Warehouse Institute (TDWI)”, and now we also have “The Data Matching Institute”.

The founder of The Matching Institute is Alexandra Duplicado. Aleksandra says: “The reason I founded The Institute of Data Matching is that I am sick and tired of receiving duplicate letters with different spellings of my name and address”. Alex is also pleased that she has now found a nice office within edit distance of her home.
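For readers who want to measure that commute themselves, here is a minimal sketch of the classic Levenshtein edit distance, tried out on a few of the founder's many aliases:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

# The founder's aliases all lie within short edit distance of each other:
for alias in ("Aleksandra", "Alexander", "Alexandre"):
    print("Alexandra ->", alias, "=", levenshtein("Alexandra", alias))
```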

Before founding The Matching of Data Institute, Alexander worked at the Universal Postal Union with responsibility for extra-terrestrial partners. Talking about the future of The Match Institute, Sasha remarks: “It is a matter of not being too false positive. But it is a unique concept”.

One of the first activities for The Data-Matching Institute will be organizing a conference in Brussels. Many tool vendors, such as Statistical Analysis System Inc., Dataflux and SAS Instiute, will sponsor the Brüssel conference. “I hope to join many record linkage friends in Bruxelles”, says Alexandre.

The Institute of Matching of Data also plans to offer a yearly report on the capabilities of the tool vendors. Asked when that is going to happen, Aleksander says: “Without being too deterministic, a probabilistic release date is the next 1st of April”.
