There is an issue I have come across over and over again when creating a master data hub, making a golden copy, establishing a single version of the truth or whatever we like to call it. The issue is the scope of data sources.
Basically you take (practically) all the master data sources from within your organization and consolidate these data. Often you also match against external sources such as business directories. But what you often miss is the master data operated by your partners. These are partners like:
- Your suppliers of products, be that raw materials or finished products for resale
- Your sales agents and distributors
- Your service providers, such as direct marketing agencies and factoring partners
These partners are part of your business processes and they often create and consume master data which are only shared with you in a limited way via some form of interface.
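As a rough illustration of what such consolidation involves, here is a minimal sketch of grouping party records from internal systems and a partner feed into golden records. The source names, fields, and the matching key (normalised name plus country) are my own illustrative assumptions, not a prescribed design.

```python
# Sketch: consolidating party master data from several sources into
# "golden" records with source lineage. Sources, fields, and the match
# key are illustrative assumptions.
from collections import defaultdict

def normalise(name: str) -> str:
    """Crude normalisation: lowercase, drop punctuation and common legal forms."""
    cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    tokens = [t for t in cleaned.split() if t not in {"inc", "ltd", "gmbh"}]
    return " ".join(tokens)

def consolidate(sources: dict) -> list:
    """Group records from all sources on a match key, keeping lineage."""
    buckets = defaultdict(list)
    for source, records in sources.items():
        for rec in records:
            key = (normalise(rec["name"]), rec["country"])
            buckets[key].append({**rec, "source": source})
    golden = []
    for (name, country), recs in buckets.items():
        golden.append({
            "name": name,
            "country": country,
            "sources": sorted({r["source"] for r in recs}),
        })
    return golden

sources = {
    "crm": [{"name": "Acme Inc.", "country": "US"}],
    "erp": [{"name": "ACME", "country": "US"}],
    "partner_feed": [{"name": "Acme, Inc", "country": "US"}],
}
print(consolidate(sources))
```

The point of the sketch is the lineage field: once partner sources are in scope, each golden record should record which internal and external systems contributed to it.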
I know that even handling master data from within most organizations is a complex issue. Integrating with external reference data doesn’t add simplicity. But without embracing the life of master data at your partners, the hub isn’t complete; the copy is only made of plated gold and the single version of the truth isn’t the only truth.
My guess is that many master data programs in the future will extend to embrace internal (private) data, as well as external (public) data and bilateral data as described on the page about Data Quality 3.0.
My organization’s technology allows you to do just what you are talking about via private web connections, e.g., partner portals or public web sites. Internet Data Management is a growing part of the MDM stack and many companies are using it to update internal data resources after they discover the limitations of MDM with data only inside the firewall.
You are absolutely correct. We live in a connected world, and more and more data used by organisations comes from external sources. Regardless of whether data comes from within or outside an organisation, it is critical that the organisation has controls in place to verify that the data is “fit for purpose”.
This is now a regulatory requirement for any insurance company selling services within Europe. I quote from the Solvency II CEIOPS guidance paper 43:
“Appropriateness, completeness and accuracy of data
3.61 The assessment of the quality of data used in the calculation of technical provisions… The assessment shall take into account the set of available data which is necessary and relevant to carry out the intended analysis. This includes both internal and external information to the undertaking.”
And here’s a quote from CEIOPS advice paper 56:
“5.133. Furthermore, CEIOPS is not aware of any reasons that justify treating external data differently from internal data as regards data quality. From a practical point of view, there will be differences in the type of assessment that can be made (e.g. the assessment of accuracy for external data will necessarily need to follow a different route, as the data has not been collected and compiled by the undertaking), but this does not justify setting different requirements for external data. Therefore, data quality requirements should apply to data irrespective of the source.”
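In the simplest case, the “fit for purpose” controls discussed above amount to applying one set of validation rules uniformly, whatever the source of the data. The rule set and field names below are illustrative assumptions, not anything taken from the CEIOPS papers.

```python
# Sketch: the same data quality rules applied to a record regardless of
# whether it came from an internal system or an external partner.
# Rules and field names are illustrative assumptions.

def assess(record: dict) -> list:
    """Return a list of data quality issues; empty means 'fit for purpose'."""
    issues = []
    for field in ("name", "country", "policy_value"):  # completeness checks
        if not record.get(field):
            issues.append(f"missing {field}")
    value = record.get("policy_value")
    if isinstance(value, (int, float)) and value < 0:  # plausibility check
        issues.append("negative policy_value")
    return issues

internal = {"name": "Acme", "country": "US", "policy_value": 1000}
external = {"name": "Acme", "country": "", "policy_value": -5}
print(assess(internal))  # []
print(assess(external))  # ['missing country', 'negative policy_value']
```

The design choice mirrors the quoted advice: the assessment route may differ for external data, but the requirements themselves are source-independent.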
Thanks, Ken, for sharing these quotes from the compliance world. As we discussed when we met in Dublin earlier this year, the two main drivers for better data quality are the need for better business intelligence and the demand for better compliance.