No, I am not going to continue some of the recent fine debates on who within a given company is the data owner, accountable and responsible for data quality.
My point today is that many views on data ownership, the importance of upstream prevention and fitness for the purpose of use in a business context are based on the assumption that the data in a given company is entered by that company, maintained by that company and consumed by that company.
In today's business world this is often not true.
Direct marketing campaigns
Running a direct marketing campaign and sending out catalogues is often an eye opener regarding the quality of data in your customer and prospect master files. But such activities are very often outsourced.
Your company extracts a file with, say, 100,000 names and addresses from your databases, and you pay a professional service provider a fee per row for doing the rest of the job.
Now the service provider could do you the kind favour of carefully deduplicating the file, eliminating the 5,000 purge candidates and bringing you the pleasant message that the bill will be reduced by 5 %.
Yes I know, some service providers actually include deduplication in their offerings. And yes, I know, they are not always that interested in using an advanced solution for that.
I see the business context here – but unfortunately it’s not your business.
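The billing maths above can be sketched in a few lines. This is a minimal illustration only – the exact-match deduplication shown here is far cruder than what a professional service provider would (or should) use, and the field names and figures are assumptions taken from the example in the post:

```python
# Naive deduplication: exact match on a normalised name + address key.
# Real matching engines use fuzzy/probabilistic techniques; this is only
# an illustration of the per-row billing effect described above.

def normalise(record: dict) -> tuple:
    """Crude match key: lowercased, whitespace-stripped name and address."""
    return (record["name"].strip().lower(), record["address"].strip().lower())

def purge_candidates(records: list) -> int:
    """Count the rows a deduplication pass would eliminate."""
    return len(records) - len({normalise(r) for r in records})

def saving_pct(total_rows: int, purged: int) -> float:
    """Percentage knocked off a per-row bill by removing purged rows."""
    return 100 * purged / total_rows

# The figures from the post: 5,000 purge candidates in a 100,000 row file.
print(saving_pct(100_000, 5_000))  # 5.0
```

The point stands either way: every purge candidate the provider eliminates before mailing is a row they cannot bill for.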
Sending out invoices is often a good test of how well customer master data is entered and maintained. But again, using an outsourced service for that, such as factoring, is becoming more common.
Your company hands over the name and address, receives most of the money, and the data is out of sight.
Now the factoring service provider has a strong interest in assuring the quality of the data and aligning the data with a real-world entity.
Unfortunately this cannot be done upstream; it is a downstream batch process, probably with no signalling back to the source.
Customer self service
Today data entry clerks are rapidly being replaced, as customers do all the work themselves on the internet. Maybe the form is provided by you; maybe – as is often the case with hotel reservations – it is provided by a service provider.
So here you basically either have to extend your data governance all the way to your customer's living room or office, or to some degree (fortunately?) accept that the customer owns the data.
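When the customer is the data entry clerk, the only "upstream prevention" left to you is validation at the point of entry. A minimal sketch of such a check, assuming a hypothetical reservation form – the field names and rules are illustrative, not any standard:

```python
import re

# Very loose email shape check: something@something.something.
# A real form would also verify postal codes against reference data, etc.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_reservation(form: dict) -> list:
    """Return a list of data quality issues found in a self-service form."""
    issues = []
    for field in ("name", "email", "postal_code"):
        if not form.get(field, "").strip():
            issues.append(f"missing {field}")
    email = form.get("email", "")
    if email and not EMAIL_RE.match(email):
        issues.append("malformed email")
    return issues

print(validate_reservation({"name": "Ann", "email": "ann@example",
                            "postal_code": "1000"}))
```

Of course, when the form belongs to a service provider rather than to you, even this modest control sits outside your governance reach – which is exactly the point.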
I liked the emphasis on billing operations! I’m putting together a similar post and billing ops is the centrepiece.
I think what this post advocates is that in order to ensure successful operations, data governance and data quality solutions either need to be extended to third parties, or those operations previously outsourced need to be brought back in-house. Is that indeed the subtle message you are trying to convey?
Nice job again. You’re now my daily fix for DQ info! 🙂
Thanks William for your comments.
The question about how to govern data that is outsourced is indeed essential.
I remember that some years ago I took part in the Data Quality course presented by Larry English. It was an excellent course, but during it I asked Larry how to deal with outsourced data. Larry’s answer was that you should never outsource if it meant letting outsiders manage your core data.
Perhaps a good answer, but unfortunately outsourcing decisions are taken based on many criteria other than data quality. That means we have to face the fact that core data maintenance is outsourced, and as data quality professionals we have to have solutions for that whether we like it or not.
Then on top of that we also have to face consequences of having customer driven databases.
Henrik good post.
You’ve raised another interesting aspect to the whole data ‘ownership’/governance issue.
With the increased use of externally managed services, the governance of third parties is likely to become increasingly important.
As for insourcing core data – as you mentioned, this is not always possible and sometimes would not be effective.
For example, many organisations use external sources for core data about other organisations, to use when carrying out risk and compliance checks, e.g. company ownership – ‘who owns whom’.
Just a thought – is how we manage external organisations likely to be very different from how we manage internal independent/siloed business units?
Michael, thanks for the comment and for asking whether data governance is different with outsourced data as opposed to data maintained by internal business units.
Maybe someone out there has first hand experience?
From what I have seen it seems that technology plays a larger role with outsourced data and customer driven databases.
This is an environment I would be fairly familiar with – the data governance is not necessarily different. I think you need to separate the issues of ‘data governance with respect to external data sources’ and ‘data governance with external data management service providers’, who could in theory be dealing with internal, external or a mixture of both data sources.
Thanks for starting this debate – which has led to an excellent question regarding governance of outsourced vs internal data.
I suggest we step back for a moment. The same question arises when any process is outsourced, not just data handling.
I spent some time working on Sarbanes Oxley (SOX), advising how to achieve compliance. As you know, SOX requires Senior Management to ensure that appropriate processes, and critically “process controls” are in place to ensure compliance with accounting practices. Failure to do so can result in a jail sentence.
While organisations are free to outsource processes, and process parts, accountability for the process results, and critically accountability for “process controls” always remains with the parent organisation.
Thus, to comply with SOX, accountability for “data governance” remains with the organisation, regardless of whether it outsources parts of its data processing, or handles it all internally.
SOX has a concept of a “Master Control”. If one has a “Master Control” process that can verify the “correctness” of the output from a series of process steps (some internal, some outsourced), then one does not require visibility of the individual controls within those process steps.
Unfortunately, it may not be possible to have a “Master Control” to verify the outputs from many outsourced data processes. In such instances, the outsourcing organisation must satisfy itself that the process controls used by the outsourced organisation meet the standard it requires. The outsourcing organisation must also ensure that it is informed of process exceptions, and of how such exceptions are handled on an ongoing basis.
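The “Master Control” idea can be made concrete as a reconciliation check: rather than inspecting every control inside the outsourced steps, verify invariants on the final output against the file that was handed over. This is a hypothetical sketch – the field names (`customer_id`, `amount`) and the particular checks are illustrative assumptions, not SOX requirements:

```python
def master_control(input_rows: list, output_rows: list) -> list:
    """Reconcile outsourced output with the input handed over.

    Returns a list of exception messages; an empty list means the
    output passed this (illustrative) master control.
    """
    exceptions = []
    # Record-count reconciliation: the provider cannot return more rows
    # than were handed over.
    if len(output_rows) > len(input_rows):
        exceptions.append("output contains more rows than were handed over")
    # Key reconciliation: no row may appear that we never sent.
    sent_ids = {r["customer_id"] for r in input_rows}
    for r in output_rows:
        if r["customer_id"] not in sent_ids:
            exceptions.append(f"unknown customer_id {r['customer_id']}")
    # Amount reconciliation: totals must match within rounding tolerance.
    total_in = sum(r["amount"] for r in input_rows)
    total_out = sum(r["amount"] for r in output_rows)
    if abs(total_in - total_out) > 0.01:
        exceptions.append("total amount mismatch")
    return exceptions
```

The design point is that the check needs only the inputs and outputs of the whole chain, which is exactly what makes it workable when the intermediate steps are outside your organisation.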
I hope the above helps,
Thanks for commenting Ken and Ronan.
I have a feeling that Data Governance may also, in the future, be a lot about balancing the use of internal (private) data and external (public) data.