This year I will be joining FIMA: Europe’s Premier Financial Reference Data Management Conference for Data Management Professionals. The conference is held in London from 8th to 10th November.
I will present “Diversities In Using External Registries In A Globalised World” and take part in the panel discussion “Overcoming Key Challenges In Managing Client On-Boarding Data: Opportunities & Efficiency Ideas”.
As the panel discussion introduction says: the industry clearly needs to normalise (or is it normalize?) regional differences and establish global standards.
The concept of using external reference data to improve data quality within master data management has been a favorite topic of mine for a long time.
I’m not saying that external reference data is a single source of truth. Clearly external reference data may have data quality issues as exemplified in my previous blog post called Troubled Bridge Over Water.
However, I think there is a clear trend towards embracing external sources, increasingly found in the cloud, as a shortcut to keeping up with data quality. I call this Data Quality 3.0.
The Achilles heel, though, has always been how to smoothly integrate external data into data entry functionality and other data capture processes, and, not to forget, how to ensure ongoing maintenance in order to avoid the otherwise inevitable erosion of data quality.
Lately I have been working on a concept called instant Data Quality. The idea is to provide simple yet powerful functionality that hooks up with many external sources at the same time when on-boarding clients and makes continuous maintenance possible.
One aspect of such a concept is how to exploit the different opportunities available in each country, as public administrative practices and privacy norms vary a lot around the world.
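The flavour of such instant Data Quality functionality can be sketched in a few lines. This is a minimal illustration only: the registry contents, field names and the `instant_data_quality` function below are hypothetical placeholders, not a real implementation or any particular country's registry API.

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder registries standing in for real external sources, such as a
# national business register and an address directory (illustrative data only).
BUSINESS_REGISTER = {"DK12345678": {"name": "Example ApS", "status": "active"}}
ADDRESS_DIRECTORY = {"DK12345678": {"city": "Copenhagen", "postcode": "2100"}}

def lookup(source, key):
    """Query one external source; return an empty dict when the key is unknown."""
    return source.get(key, {})

def instant_data_quality(party_id):
    """Fan out to several external sources in parallel and merge the answers,
    so a data entry screen can be pre-filled at on-boarding time."""
    sources = [BUSINESS_REGISTER, ADDRESS_DIRECTORY]
    merged = {}
    with ThreadPoolExecutor(max_workers=len(sources)) as pool:
        for result in pool.map(lambda s: lookup(s, party_id), sources):
            merged.update(result)
    return merged
```

Querying the sources in parallel matters here: a data entry screen has to feel instant, and the slowest registry should set the latency, not the sum of all of them.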
I’m looking forward to presenting and discussing these challenges and getting a lot of feedback.
One of the greatest all-pervading problems of databases is maintaining the occurrences of what is seen as ‘static’, ‘reference’ or ‘master’ data.
Databases have always been seen as wonderful places to go and get data from, but how do you get quality data into them?
Well, the only truly effective way of doing this is to make the creation and transformation of all data in a database an integrated part of doing the business of the enterprise, that is, executing Business Functions.
Data cannot be effectively maintained by pseudo functions such as “Maintain the Postcodes File”. This type of information will only be adequately maintained by an enterprise in which there is a Business Function such as “Define Valid Postal Addresses for Germany (or France, or Italy….)”
If an enterprise needs such valid addresses then it should purchase them from an enterprise whose business it is to define them, and then have internal Business Functions such as a) Define Valid Countries for Trading (our) Enterprise and b) Define Valid Postal Addresses for (our) Valid Trading Countries.
As all Business Functions are the responsibility of a business manager (as opposed to an IT manager), it is the responsibility of the business to ensure that all these functions are carried out in a timely manner and as an integrated part of doing business day-to-day.
Thanks for adding in, John. I agree, with one little twist. We must see business managers and IT managers not as opposed to each other but as collaborating with each other. Some data are used by many business functions, and therefore responsibility is sometimes placed with someone whose employee record is assigned to the IT department.
For Security and Price Masters you cannot assume that a vendor is 100% correct. The best you can do is:
1- Use multiple vendors, where some may be better at one subsection of data, and drive the associated data from that vendor’s feed.
1.A- As an example, use Thomson Reuters for RICs and listing-level coverage, and Bloomberg for the BB-ID and pricing, including historical ticks.
2- Compare vendors against one another to identify agreement and outliers.
3- Have a really efficient research team that covers global corporate actions.
* Is the data for trading, mid-office, back-office or custodial processing? This will define the need and requirements for the manual efforts.
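Points 1 and 2 above, field-level vendor precedence plus cross-vendor outlier detection, can be sketched roughly as follows. The vendor names, field names and the relative-tolerance rule here are illustrative assumptions, not any specific vendor's data model or API:

```python
from statistics import median

# Hypothetical field-level precedence: drive each attribute from the vendor
# that is strongest for it (e.g. identifiers from one feed, prices from another).
PRECEDENCE = {
    "ric": ["vendor_a", "vendor_b"],
    "price": ["vendor_b", "vendor_a"],
}

def golden_record(records, precedence=PRECEDENCE):
    """Build one record per security, taking each field from the first
    vendor in its precedence list that supplied a value."""
    out = {}
    for field, vendors in precedence.items():
        for vendor in vendors:
            value = records.get(vendor, {}).get(field)
            if value is not None:
                out[field] = value
                break
    return out

def price_outliers(prices, tolerance=0.01):
    """Flag vendors whose price deviates from the cross-vendor median
    by more than the given relative tolerance."""
    mid = median(prices.values())
    return {v: p for v, p in prices.items() if abs(p - mid) / mid > tolerance}
```

Using the median rather than the mean as the reference point keeps one badly wrong vendor from dragging the agreement point towards itself; flagged records then go to the research team from point 3.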
My associates and I have built, and continue to build, this processing for the financial community.
Reference data such as ISO currency codes and ISO country codes can be purchased or pulled off the web; however, it must be vetted and enhanced, which requires a good team.
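A minimal sketch of that vetting step, assuming a hand-maintained whitelist (only a tiny illustrative subset of the ISO 4217 and ISO 3166 code lists is shown, and the record field names are made up for the example):

```python
# Tiny illustrative subsets; a real implementation would load the full,
# regularly maintained ISO 4217 currency and ISO 3166 country code lists.
ISO_4217_CURRENCIES = {"USD", "EUR", "GBP", "DKK"}
ISO_3166_COUNTRIES = {"US", "GB", "DE", "DK"}

def vet_reference_codes(records):
    """Return the records whose currency or country code fails validation,
    so the team can review and enhance them before they reach the master."""
    return [
        rec for rec in records
        if rec.get("ccy") not in ISO_4217_CURRENCIES
        or rec.get("country") not in ISO_3166_COUNTRIES
    ]
```

The point of the human team is exactly the rejects list: codes scraped off the web drift out of date, so someone has to own refreshing the whitelists and resolving what the automation flags.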
This Data Quality Czar position appears to be another layer of politics; it really should be inspired by quality management with a good and well-trained team.