Data Governance, Santa Style

Multi-Domain MDM, Santa Style is the title of a post on this blog made a couple of years ago. That post was about what a multi-domain Master Data Management solution could look like in an organization doing business the way we think Santa Claus does.

Many organizations around the world that have recently embraced Master Data Management (MDM) have added Data Governance as an imperative parallel or integrated initiative. I guess Santa could have followed that path too.

Below are some thoughts about data governance considerations at Santa Claus's place, based on a concept from The Data Governance Institute mentioned in the post Data Governance: Day 2:

Proactive Rules

What does it mean to be naughty or nice? I guess that must be a key question faced by Santa and his team every day. Santa is probably in no better position here than you are in many real-world organizations. Key principles that everyone refers to every day are challenging to document, and it turns out to be really hard when you try to put them into a common, shared business glossary.

Ongoing Services

Is Santa able to implement data governance based on the roles that the elves and the reindeer have had in daily operations around Christmas, or does he have to ask Rudolph to head up a Data Governance Office or even become Chief Data Officer?

Reactive Issue Resolution

What should Santa do when the logistics elves insist on chimney positions in UTM and the reindeer can only use WGS84 coordinates? If Santa's data governance programme does not solve this one, you'd better watch out when Santa Claus is coming to Town.
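If Santa's data stewards do agree on one canonical coordinate system, the conversion itself is the easy part. Below is a minimal sketch of converting a chimney position from UTM to WGS84, assuming the pyproj library; the UTM zone (32N / EPSG:32632) and the sample coordinates are made up purely for illustration.

```python
# A minimal sketch: convert incoming chimney positions from UTM to the
# agreed-upon WGS84 reference system. Assumes the pyproj library; the
# UTM zone (32N / EPSG:32632) and sample coordinates are illustrative.
from pyproj import Transformer

# From UTM zone 32N (what the logistics elves deliver)
# to WGS84 latitude/longitude (what the reindeer navigation expects).
utm_to_wgs84 = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)

def chimney_to_wgs84(easting, northing):
    """Convert a UTM chimney position to (latitude, longitude) in WGS84."""
    lon, lat = utm_to_wgs84.transform(easting, northing)
    return (lat, lon)

print(chimney_to_wgs84(723000.0, 6175000.0))  # made-up chimney in zone 32N
```

The hard part, of course, is the governance decision about which system is authoritative and who owns it, not the transformation code.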


The Countryside Data Quality Journey Through 2015

I guess this is the time for blog posts about the big things that are going to happen in 2015. But you see, we could also take a route away from the motorways and highways and see how the traditional way of life is still unfolding in the data quality landscape.

Lost

While the innovators and early adopters are fighting with big data quality, the late majority are still trying to get their heads around how to manage small data. And that is a good thing, because you cannot utilize big data without solving small data quality problems, not least around master data, as told in the post How important is big data quality?

Shitterton

Solving data quality problems is not just about fixing data. It is very much also about fixing the structures around data, as explained in a post, featuring the pope, called When Bad Data Quality isn't Bad Data.

No Mans Land

A common roadblock on the way to solving data quality issues is that what is everybody's problem tends to be no one's problem. Implementing a data governance programme is evolving as the answer to that conundrum. As with many things in life, data governance is about thinking big and starting small, as told in the post Business Glossary to Full-Blown Metadata Management or Vice Versa.

Ugley

Data governance revolves a lot around people's roles, and there are also some specific roles within data governance. Data owners have been known for a long time, data stewards have been around for some time, and now we also see Chief Data Officers emerge, as examined in the post The Good, the Bad, and the Ugly Data Governance Role.

As experienced recently, somewhere in the countryside, while discussing how to get going with a big and shiny data governance programme, there is however indeed still a lot to do with trivial data quality issues, such as fields being too short to capture the real world, as reported in the post Everyday Year 2000 Problems.

Wales


Be Prepared

Working with data governance and data quality can be a very backward-looking quest. It often revolves around how to avoid repeating a recent data disaster, or around catching up with the organizational issues, the process orchestration and the new technology implementations needed to support current business objectives with current data types in a better way.

This may be hard enough. But you must also be prepared for the future.

The growth of available data to support your business is a challenge today. Your competitors take advantage of new data sources and better exploitation of known data sources while you are sleeping. New competitors emerge with business ideas based on new ways of using data.

The approach to including new data sources, data entities, data attributes and digital assets must be a part of your data governance framework and data quality capability. If you are not prepared for this, your current data quality will be challenged not only by decay of current data elements but also by insufficiently governed new data elements, or by a lack of business agility because you can't include new data sources and elements in a safe way.

Some essentials in being prepared for inclusion of new kinds of data are:

  • A living business glossary that facilitates a shared understanding of new data elements within your organization, including how they relate to or replace current data elements.
  • Configurable data quality measurement facilities, data profiling functionality and data matching tools, so that on-boarding every new data element doesn't require a new data quality project (see the sketch after this list).
  • Self-service and automation being the norm for data capture and data consumption. Self-service must be governed both internally in your organization and externally as explained in the post Data Governance in the Self-Service Age.
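To make the second bullet a bit more concrete, here is a minimal, tool-agnostic sketch of what configurable data quality measurement can look like: checks are expressed as configuration entries, so a newly on-boarded data element only needs a new entry, not a new project. All element names, patterns, thresholds and sample values are made up for illustration.

```python
import re

# A tool-agnostic sketch: data quality checks expressed as configuration,
# so on-boarding a new data element means adding a rule entry, not code.
# All element names, patterns and thresholds are illustrative.
RULES = {
    "customer_email": {
        "required": True,
        "pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$",
    },
    # A newly on-boarded data element is just another entry:
    "chimney_position_lat": {
        "required": True,
        "min": -90.0,
        "max": 90.0,
    },
}

def check(element, value):
    """Return the rule violations for one value of one data element."""
    rule = RULES.get(element, {})
    issues = []
    if rule.get("required") and value in (None, ""):
        return ["missing value"]
    if "pattern" in rule and not re.match(rule["pattern"], str(value)):
        issues.append("format violation")
    if "min" in rule and float(value) < rule["min"]:
        issues.append("below minimum")
    if "max" in rule and float(value) > rule["max"]:
        issues.append("above maximum")
    return issues

print(check("customer_email", "rudolph@northpole"))  # ['format violation']
print(check("chimney_position_lat", 95.2))           # ['above maximum']
```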


Data Governance: Day 2

Much of the talking and writing about data governance these days is about how to start a data governance programme. This includes the roadmap, funding, getting stakeholders interested and that kind of stuff. Focussing on how to get a data governance programme off the ground is natural, as this is the struggle right now in many organizations.

But hopefully when, and not if, the data governance programme has left the ground and is a reality, what does daily life look like then? I think this drawing is a good illustration:

Daily Data Governance

The drawing is taken from the Data Governance Institute Framework provided by Gwen Thomas.

As a fan of agile approaches within most disciplines, including data governance, it is worth remarking that this daily life should not be seen as the end result of a long implementation. Rather, the above concept should be upgraded over time in more and more mature versions, probably starting with a very basic version that addresses the main pain points within your organization.

When starting a data governance programme, there are typically a lot of existing business rules to be documented in a consistent way. That is one thing. Another thing is to establish the process that deals with the data aspects of changing business rules and of taking on new business rules, as touched upon in the post Two Kinds of Business Rules within Data Governance.
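One way, among several, to get that consistency is to capture each business rule as a structured, versioned record from the start, so that changing rules and new rules go through the same process. The field names and the sample rule below are purely illustrative.

```python
from dataclasses import dataclass
from datetime import date

# A sketch of documenting business rules in a consistent, versioned form.
# All field names and the sample rule are illustrative only.
@dataclass
class BusinessRule:
    rule_id: str
    statement: str        # the rule in plain business language
    data_elements: list   # glossary terms the rule depends on
    owner: str            # accountable business role
    version: int          # bumped when the rule changes
    effective_from: date
    status: str           # e.g. "draft", "approved", "retired"

example_rule = BusinessRule(
    rule_id="BR-042",
    statement="A customer record must have a verified postal address "
              "before an order can be shipped.",
    data_elements=["customer", "postal address", "order"],
    owner="Head of Order Management",
    version=2,
    effective_from=date(2015, 1, 1),
    status="approved",
)
```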

The ongoing service part and the issue resolution part rely very much on some kind of organizational structure. This could include one of my favourites, namely collaboration fora between data stewards, maybe a data governance office and usually a data governance council of some name. And perhaps having a Chief Data Officer (CDO), as mentioned in the post The Good, the Bad, and the Ugly Data Governance Role.


The Matrix

The data governance discipline, the Master Data Management (MDM) discipline and the data quality discipline are closely related and happen to be my fields of work, as told in the post Data Governance, Data Quality and MDM.

Every IT-enabled discipline has an element of understanding people, orchestrating business processes and using technology. The mix may vary between disciplines. This is also true for the three above-mentioned disciplines.

But how important are people, process and technology within these three disciplines? Are the disciplines very different in that perspective? I think so.

When assigning a value from 1 (less important) to 5 (very important) for Data Governance (DG), Master Data Management (MDM) and Data Quality (DQ) I came to this result:

The Matrix

A few words about the reasoning for the highs and lows:

Data governance is in my experience a lot about understanding people and less about using technology as told in the post Data Governance Tools: The New Snake Oil?

I often see arguments that data quality is all about people too. But:

  • I think you are really talking about data governance when putting the people argument forward in the quest for achieving adequate data quality.
  • I see little room for having the personal opinions of different people dictate what adequate data quality is. This should really be as objective as possible, as sketched below.
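To illustrate what objective can mean in practice, here is a small sketch that measures completeness and validity of one attribute as percentages rather than as opinions. The records, the field name and the pattern are made up for illustration.

```python
import re

# A small illustration of objective data quality measurement: completeness
# and validity are computed as percentages, not judged by opinion.
# The records, field name and pattern are made up.
records = [
    {"postal_code": "2100"},
    {"postal_code": ""},
    {"postal_code": "21OO"},  # letter O instead of zero
    {"postal_code": "8000"},
]

def completeness(rows, field):
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return 100.0 * filled / len(rows)

def validity(rows, field, pattern):
    valid = sum(1 for r in rows if re.fullmatch(pattern, r.get(field) or ""))
    return 100.0 * valid / len(rows)

print(completeness(records, "postal_code"))        # 75.0
print(validity(records, "postal_code", r"\d{4}"))  # 50.0
```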

Now I am ready for your relentless criticism.


Data Quality X-mas Stories

Today is the 2nd of December and time for the second x-mas themed post on this blog this year, following up on the early yuletide post about The Shortcut to Lapland.

In a way it is not in line with one of the main subjects on this blog, diversity, to focus too much on Christmas, as I know that many readers may have, for example, Eid, Diwali or Chinese New Year as their main days of celebration during the year.

To me The Holidays are much about having light at a time of year that, up north, would otherwise be very dark and even depressing. When I am in Copenhagen I live on a cosy square called Gråbrødretorv (Grey Friars Market). In summertime the square is filled with outdoor seating. Not so much in the winter. But then there is a fir tree with lights on.

Gråbrødretorv

Anyway, there is lots of stuff in the x-mas theme that you can relate to data quality. Some of the older posts on this blog were:


The Multi-Domain Data Quality Tool Magic Quadrant 2014 is out

Gartner, the analyst firm, has a different view of the data quality tool market than of the Master Data Management (MDM) market. The MDM market has two quadrants (customer MDM and product MDM), as reported in the post The Second part of the Multi-Domain MDM Magic Quadrant is out. There is only one quadrant for data quality tools.

Well, actually it is difficult to see a quadrant for product data quality tools. Most data quality tools revolve around the customer (or rather party) domain, with data matching and postal address verification as the main features.
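For readers who have never seen data matching up close, here is a minimal sketch of the idea using only the Python standard library and made-up party records; real data quality tools add standardisation, phonetic keys and probabilistic scoring, but the principle of flagging likely duplicates for review is the same.

```python
from difflib import SequenceMatcher

# A minimal sketch of party data matching with the standard library only.
# The records and the similarity threshold are made up for illustration.
parties = [
    {"id": 1, "name": "Santa Claus", "city": "North Pole"},
    {"id": 2, "name": "S. Claus",    "city": "North Pole"},
    {"id": 3, "name": "Rudolph",     "city": "North Pole"},
]

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_duplicates(rows, threshold=0.6):
    """Return pairs of records whose names look similar enough to review."""
    pairs = []
    for i, left in enumerate(rows):
        for right in rows[i + 1:]:
            score = similarity(left["name"], right["name"])
            if score >= threshold and left["city"] == right["city"]:
                pairs.append((left["id"], right["id"], round(score, 2)))
    return pairs

print(candidate_duplicates(parties))  # flags (1, 2) as a likely duplicate
```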

For the party domain it makes sense to have these capabilities deployed outside the MDM solution in some cases, as examined in the post The place for Data Matching in and around MDM. And of course data quality tools are used in heaps of organizations that don't have an MDM solution.

For the product domain it is hard to see a separate data quality tool if you have a Product Information Management (PIM) / Product MDM solution. Well, maybe if you are an Informatica fan. Here you may end up with a same-branded PIM (Heiler), Product MDM (Siperian) and data quality tool (SSA Name3) environment as a consequence of the matters discussed in the post PIM, Product MDM and Multi-Domain MDM.

What should a data quality tool do in the product domain then? Address verification would be exotic (and ultimately belongs to the location domain). Data matching is a use case, but not usually something that eliminates the main pain points with product data.

Some issues that have been touched upon on this blog are:

Anyway, the first vendor tweets about the data quality tools quadrant 2014 are turning up, and I guess some of the vendors will share the report for free soon.

Magic Quadrant for Data Quality Tools 2014

Update 3rd December: I received 3 emails from Trillium Software today with a link to the report here.
