The question of whether you can successfully run a data quality program without doing data governance is a recurring subject in the data management realm. Rachel Haines recently revisited it in an article called Is the Data Governance Value Message Getting Lost?
I think we have used the term data quality much longer than we have used the term data governance. Before data governance became a popular term, organizations did run data quality programs without doing anything called data governance. However, doing something about data quality is in itself an act of data governance, just perhaps without some of the formalized practices we have only recently put under the umbrella called data governance.
As I remember, we have always worked with assigning responsibilities, understanding and documenting business rules, and the other good practices now seen as embraced by data governance. Doing data quality improvement without such considerations has always been pointless.
Today we have good frameworks available for data governance. You should of course take advantage of the maturing data governance discipline to achieve and sustain better data quality and thereby deliver better business outcomes.
This is an excellent question. However, the fact that it needs to be asked is indicative of the fragmentation that has occurred in the field of data quality.
The first, and biggest, separation occurred when the quality of data got separated from the quality of function and process. True data quality is a fully integrated part of everything an enterprise does, with quality data being created and transformed automatically in the execution of every function.
This separation created the pseudo industry of data quality. Once one pseudo industry was created, another was sure to follow. The next was data governance (which used to be adequately covered by having data standards). After that came data management and then master data management.
All of these self-perpetuating pseudo data industries are a huge and costly millstone for every enterprise.
The irony is that their existence, contrary to what enterprises are led to believe, is a major barrier to automatically creating quality data at every step in everything an enterprise does.
Hello Henrik –
The quick answer to your question is “Yes,” and I’ve seen it happen in far too many organizations. Data quality operations such as validation, standardization, and match-merge occur in various systems, but there is no coordination between the efforts, and no thought given to the higher-level concept the data is meant to represent, such as Customer or Product. After a while, you have a situation where different efforts apply different rules to the same concept and are difficult to maintain and extend, since they live in many different places.
Data governance places a structure into the organization that standardizes the disparate quality efforts with the larger data concept in mind, places the definition of quality rules in the business where it belongs, and coordinates the execution of quality rules to minimize maintenance efforts and cost.
So, you can do Quality without Governance, but you foster an environment that encourages “urban legend” and is neither scalable nor effective for business needs.
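Stephen’s point about different rules for the same concept can be sketched in a few lines. The two email rules and the system names below are hypothetical illustrations, not taken from any particular product: two systems each validate a Customer email locally, disagree on the same record, and a single governed rule removes the divergence.

```python
import re

# Hypothetical System A (a CRM): accepts any string containing "@".
def crm_email_is_valid(email: str) -> bool:
    return "@" in email

# Hypothetical System B (billing): also requires a dot after the "@".
def billing_email_is_valid(email: str) -> bool:
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email))

# The same Customer record passes in one system and fails in the other.
record = "jane.doe@example"
rules_diverge = crm_email_is_valid(record) != billing_email_is_valid(record)

# Under governance, one business-owned rule is defined once and reused
# by every system, so "valid Customer email" means the same thing everywhere.
GOVERNED_EMAIL_RULE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def email_is_valid(email: str) -> bool:
    return bool(GOVERNED_EMAIL_RULE.fullmatch(email))
```

The point is not the regex itself but where the rule lives: one definition owned by the business, executed everywhere, instead of per-system copies that drift apart.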
SAS Data Management Consulting
Thanks for commenting, John and Stephen.
The reason data quality tools emerged long ago was that there was very little data quality thinking in existing applications and among system integrators. MDM platforms and their related methodology are an answer to a situation in many organizations where master data cannot be administered and utilized effectively in the current IT landscape. Data governance as a discipline is indeed needed to align these efforts with the people and process side. It would have been nice if all of this had been handled in parallel by the whole community of users and vendors. But that is not the way of the world.