The idiom “turning a blind eye” originates from the Battle of Copenhagen, where Admiral Nelson ignored a signal giving him permission to withdraw by raising his telescope to his blind eye and saying, “I really do not see the signal”.
Nelson went on to win the battle.
As a data quality practitioner you are often amazed by how enterprises turn a blind eye to data quality challenges and, despite horrible data quality conditions, keep going and win the battle by growing as successful businesses.
The evidence that poor data quality is costing enterprises huge sums has been out there for a long time. But business successes are achieved over and again despite bad data. There may be casualties, but the business goals are met anyway. So poor data quality is just something that makes the fight harder, not impossible.
I guess we have to change the messaging about data quality improvement away from the doomsday prophecies, which make decision makers turn a blind eye to data quality challenges, and be more specific about maybe smaller but tangible wins where data quality improvement and business efficiency go hand in hand.
Great post, Henrik! As a data quality practitioner I must admit that I sometimes have trouble seeing the “Forest for the chlorophyll”. The worst days may very well be when it seems obvious to me that data quality should be “job one”, whereas the organization is winning strategically by fighting all sorts of tactical battles that don’t include DQ. And they ignore me like I am saying the sky is falling.
So thanks, Henrik. It really does help explain the situation. It doesn’t make the data quality issues go away, but I understand better now why organizations “turn a blind eye” to DQ. I will just have to dig a bit deeper and show them that their tactical battles will be won more frequently if they sometimes improve DQ at the same time.
Thanks Gordon. I like the blindness term with chlorophyll 🙂
Nice analogy, Henrik!
Obviously, there are business branches which are more resistant to data quality issues than others. Neglecting data quality will not make them fail, but it will definitely make them less successful than they could be. However, as long as a business is “sufficiently” successful, executive management will rarely raise the question: “Could we be more successful if we implemented a data quality program?”
Therefore, I share your conclusion that data quality consultants should suggest an incremental approach to prove the ROI.
A few observations on “There may be casualties, but business goals are met anyway”: I think the main problem in communicating the issues and cost of data consistency problems (data quality) to a business is the ease of conceptualization. A business typically overcomes large generalized problems: late-paying customers, poor employees, bad vendors, etc. All of these they usually overcome with the passage of time more than anything else, through delayed implementation and acceptance; they just get built into the planning. As with everything else, in order to convince an executive to “fix” something, it has to be really easy to do, not involve a lot of collaboration, and be cheap.
Great post, Henrik, raising vital questions, the biggest of these being, ‘Does poor data quality really have a negative impact on the business?’
All DQ practitioners will shout out in unison, ‘Of course it does!’ But that raises more questions, like:
o How do they know that it does?
o Can they quantify the costs?
o Can they demonstrate the tangible financial benefits of all of the expensive DQ initiatives?
This may point DQ practitioners in a different direction. Instead of their conversations with the business being about ‘deduplication’, ‘golden records’ and advocating the purchase of expensive and ‘miraculous’ DQ software, they will change to conversations about:
o Identifying where data quality issues are costing the business money
o Quantifying the losses involved
o Costing initiatives to correct the issues
o Calculating the ROI for each initiative
o Only embarking on those that show a sufficient ROI
o Demonstrating that real savings have been made as the result of each initiative
This would turn Data Quality from an abstract, intellectual and ever-growing technical exercise into a targeted set of key initiatives, with a beginning, middle and END, that bring about real business benefits.
The DQ department would ultimately prove to the business that its initiatives had all worked by making itself redundant and closing down!!
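The ROI bookkeeping in the list above can be sketched in a few lines. This is a minimal illustration only: the initiative names and every figure below are hypothetical, invented purely to show the shape of the calculation.

```python
# Minimal sketch of per-initiative ROI screening.
# All initiative names and figures are hypothetical.

def roi(annual_loss_from_issue, fix_cost, years=1):
    """Return ROI as a ratio: (savings - cost) / cost."""
    savings = annual_loss_from_issue * years
    return (savings - fix_cost) / fix_cost

# Hypothetical initiatives: (name, yearly loss caused, cost to fix)
initiatives = [
    ("duplicate customer records", 120_000, 40_000),
    ("invalid shipping addresses", 30_000, 50_000),
]

# Only embark on initiatives that show a sufficient ROI.
for name, loss, cost in initiatives:
    r = roi(loss, cost)
    verdict = "do it" if r > 0 else "skip"
    print(f"{name}: ROI {r:.0%} -> {verdict}")
```

With these made-up numbers, the first initiative returns 200% and would go ahead, while the second loses money and would be dropped, which is exactly the filtering step the list describes.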
Thanks for joining Axel, Ira and John.
We can’t avoid talking Return on Investment (ROI) for sure.
Calculating ROI has always been hard and always will be within data quality and its most closely related discipline, Master Data Management (MDM). As said in the MDM book by Dalton Cervo and Mark Allen: “Attempts to try to calculate and project ROI will be swag at best and probably miss the central point that MDM is really an evolving business practice that is necessary to better manage your data, and not a specific project with a specific expectation and time-based outcome that can be calculated up front”.
Returns are often seen in conjunction with the business initiatives that data quality improvement activities are supporting.
On the investment side we can of course try to use less costly resources, whether in terms of manpower or the price tag on the tools used.
Data Quality has just as much chance to excite business people as any other technology, but we have to remind ourselves that business people think quite differently from technologists, and the subject matter doesn’t change that. The basic philosophy of a business person is: “show me the money”. Cash straight to the bottom line is the most persuasive argument. But we will also settle in many cases for indirect routes such as customer satisfaction, productivity, cost savings and even brand reputation. Whatever the ROI is, though, get to the point and “show me the money”.
To reinforce the point re demonstrating ROI – if data quality is a journey not a destination, it follows that there are different routes that an organization can take, based on differing priorities in relation to DQ. Some DQ projects are essential and have to be tackled up front so that users have sufficient confidence in the data e.g. matching to create a customer-centric database prior to implementing a new CRM system. Others can be delayed, but with benefits also being delayed – often, establishing a single customer view across multiple systems will fall into this camp. Part of the job of the DQ specialist is to make it clear what the downside of delay to DQ projects is and to quantify it.
@Steve I think you list the major reasons why the enterprises might not trust Data Quality practitioners and why the current approach to DQ is doomed to fail.
The first is that DQ should definitely be a destination. Making it a journey is giving DQ practitioners, and their grandchildren, a job for life. Data Quality needs to be embedded in everything that the business does. Quality data should be created as an integral part of doing business day-to-day.
Having a ‘Customer centric’ database and single view of ‘Customer’ is a totally flawed approach that DQ persists with. This needs to be a Party Centric view. Customer is merely a role. It is NOT a Master Data Entity.
Huzzah!! Let the games begin! John, I think you are spot on!! To further “pile on”, consider some of the emerging approaches to “Big Data” such as Magnetic, Agile, Deep (MAD): http://mdavey.wordpress.com/2011/04/28/magnetic-agile-deep-mad/.
The focus will be more on adaptive models and data-driven approaches, leveraging current hardware costs and Hadoop processing capabilities, and less on traditional EDW approaches.
Traditional Data Quality processes will have to be timely and immediate and definitely an embedded part of the solution, not a process.
I love this post Henrik. You could have easily written hundreds of words on this topic but I think by keeping it short it is far more purposeful and direct.
I agree that a big problem we face with data quality is too often using stories such as DQ trainwrecks to emphasize the need for action. Whilst crashing space probes and huge international financial scandals obviously add weight to the cause, most execs simply can’t relate to these issues because they’re not happening in their organisations and are probably unlikely to.
Like you say, the real issue is demonstrating the value of smaller, more tangible initiatives whose benefits everyone can see.
I’ve been writing about this recently on Data Quality Metaphors: Data Quality = Sick Days (http://www.dataqualitypro.com/members/blog_view.asp?id=703684&post=138805).
My point in that post, and I think it ties in here, is this: imagine if you could reduce the annual sick days within a company, even by a small percentage. It would have a major impact on the bottom line, customer service, project lead times, etc.
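The sick-days metaphor is easy to put in back-of-the-envelope numbers. Every figure below is hypothetical, chosen only to show the shape of the arithmetic:

```python
# Back-of-the-envelope sketch of the sick-days metaphor.
# All numbers are hypothetical.

employees = 500
avg_sick_days = 8        # per employee per year
cost_per_day = 300       # assumed fully loaded daily cost

annual_cost = employees * avg_sick_days * cost_per_day
print(f"current annual cost: {annual_cost:,}")        # 1,200,000

# Even a modest 5% reduction shows up on the bottom line:
saving = annual_cost * 0.05
print(f"saving from a 5% reduction: {saving:,.0f}")   # 60,000
```

The point carries over directly: a small percentage improvement in data quality, applied to a large recurring cost, yields a tangible and easily communicated saving.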
Great post, love the metaphor you use.
Thanks Andy, Steve, John, Ira and Dylan for adding in with good points around future messaging and positioning of data quality tools and services.
Great conversation you have started, Henrik. It inspired me to dive deeper than just a comment here on your blog. In short, I now pose the question “Data Quality – How Much is Enough”?
Thanks for the inspiration.
Bryan, thanks a lot for your comment and a great follow up on your site.
Henrik – a great post that is very much in line with our approach.
This is one of the key excuses for poor data quality as discussed in my post http://dataqualitymatters.wordpress.com/2011/08/02/top-5-excuses-for-bad-data/.
The same principle – start small and prove value – is just as critical for data governance initiatives. It is highly optimistic to expect business to fork out budget for data governance without seeing any kind of return.
Thanks a lot Gary. Excellent post on top 5 bad excuses for not buying our tools and services 🙂
Great discussion guys and I find the debate around whether data quality is a journey or a destination fascinating. I actually agree with both sides of the debate so far (nothing like sitting on the fence!), let me explain. I think John is absolutely correct in his point that “Quality data should be created as an integral part of doing business day-to-day”, however I would class this as just one destination of many on a long and somewhat recursive journey. In fact there are numerous journeys each with their own destinations (maybe destination is the wrong word as it does suggest a terminus). I think the point should be that there is no “final destination”. I also believe that this is where the concept of a maturity model can come into play (perhaps a blog post coming up).
I actually disagree with the problem of creating “a job for life”. Data quality *is* a job for life, just because it transitions into a BAU process does not mean that that focus should terminate. The baton passes from out-and-out DQ practitioners to BAU staff but the journey continues.
Garry, thanks for joining the discussion.
There are, in my eyes, two things that will make renewed data quality efforts necessary:
• Business objectives change over time, which will affect data quality benchmarks and prioritization.
• Technology evolves, which will make things previously not possible an option in the future. Sticking to old standards will eventually put you out of business.
(On the latter note if it were today I guess Nelson wouldn’t even have made it across the North Sea with his ships of the line before being driftwood).
Great post. I am experiencing this firsthand in the work I do, so it’s encouraging to know others have the same issue.
Really a nice article. It really helped explain the situation around data quality problems. Thanks for sharing this post!
Suzy, thanks for the comment and the link to the services from Experian QAS. I visited QAS in London only a month ago. Great people.