Quality of Data Behind the Data Quality Magic Quadrant

Last week the Gartner Magic Quadrant for Data Quality Tools was published. You can get a free look via some of the vendors’ sites; for example, SAP has a link here.

I’m not going into who the leaders, visionaries, challengers or niche players are. I’m a bit puzzled about who is in there at all.

We may look at two UK based vendors:

  • Datactics has a good position among the niche players
  • Experian QAS is not in the quadrant, but is mentioned among the vendors not meeting the inclusion criteria

If you look up Datactics on LinkedIn, it has 14 employees. If you look up Experian QAS UK, it has 369 employees (and QAS has subsidiaries around the world too). This balance of strength resembles what I know from business directories.

Now, the inclusion criteria set up by Gartner may make a lot of sense, but I find it strange that they so obviously fail to reflect market reality.

Please find more information about how another analyst includes players (compared to Gartner) in the post The Data Quality Tool Vendor Difference.


Searching for Data Quality (and Decency)

As I have mentioned here on the blog (maybe even too often), I am currently involved in making the roadmap for, and promoting, a tool for achieving better data quality by searching and mashing up available external information in the cloud and in internal master databases.

The tool is called iDQ (instant Data Quality).

In promoting such a solution we are interested in engaging in a dialogue with people who are searching for data quality.

So are a lot of other vendors in the data quality tool market of course.

In that quest vendors are looking to get a better ranking in search engines when people search for data quality, data cleansing and similar terms.

An often used technique for that is link building. Here you (over)use the terms data quality, data cleansing and so on, and every time make a link from the term to your home page.

Examples are the blog posts from DQ Global and an endless stream of data quality news from Experian QAS.

However, some vendors’ link building is done not only on their own blogs and news lists but also on other sites, for example by making comments on this blog.

Examples are this one linking to Experian QAS and this one linking to HelpIT.

It is my impression that these comments are made by SEO agencies hired by the vendors. The agencies make comments under a random name, like in these cases “Smith” (ah, John Smith, I know him) and “Peter Parker” (or is it Spider-Man?).

Methinks: This may help promote tools when people are searching for data quality. But it doesn’t help with finding decency.


Hot and Magic Medal Counting

In the ongoing Olympic Games, one frequently displayed list is the medal count per nation.

The list reminds me of the occasional analyst report ranking data quality tools and Master Data Management (MDM) solutions. The latest one is freshly pressed, as told in the post Product Information Management is HOT for Business by Ventana Research, where the PIM vendors are ranked with Stibo Systems being the most HOT.

The counting of medals in the Olympic Games in London this afternoon looks like this:

As expected the top race is between the big teams from the United States and China, just as the mega vendors of tools always receive good rankings from analysts, though with a few exceptions as reported in the post The Data Quality Tool Vendor Difference, where the Gartner Magic Quadrant is compared with the ranking from Information Difference.

As often seen, the home team, Great Britain and Northern Ireland, is also doing very well. With tools we also see that Most Times the Home Team Wins, despite analyst rankings, when a local client selects a tool.

Other big teams such as Russia, Japan and Australia are currently struggling to get more gold medals and climb the list when ranked by gold (instead of total number of medals). Perhaps we will see a closer race with more teams in the last week, just as expected with MDM tools as reported in the post Photo Finish in MDM Vendor Race.

The smaller nations often do better in a small range of disciplines, like Ethiopia in running and Denmark in rowing and sailing, resembling the situation described in the post Who is not Using Data Quality MAGIC, as there are plenty of data quality tools out there that are very capable in certain tasks and local circumstances.


Return on Investment in Big Reference Data

Currently I’m working with a cloud based service where we are exploiting available data about addresses, business entities and consumers/citizens from all over the world.

The cost of such data varies a lot around the world.

In Denmark, where the product was born, the costs of such data are relatively low. The joys of the welfare state also apply to access to open public sector data, as reported in the post The Value of Free Address Data. You are also able to check the identity of an individual in the citizen hub. Doing it online on a green screen you will be charged (what resembles) 50 cents, but doing it with cloud service brokerage, as in iDQ™, will only cost you 5 cents.
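To make the price difference concrete, here is a back-of-the-envelope calculation using the two per-lookup figures mentioned above (the lookup volume is an illustrative assumption, not a figure from any actual customer):

```python
# Cost comparison for citizen hub identity lookups, using the per-lookup
# prices mentioned above: ~50 cents on the manual green screen versus
# ~5 cents via a cloud service brokerage.

GREEN_SCREEN_PRICE = 0.50   # per lookup, manual online check
BROKERAGE_PRICE = 0.05      # per lookup, via cloud service brokerage

def annual_cost(lookups_per_year: int, price_per_lookup: float) -> float:
    """Total yearly spend for a given lookup volume and unit price."""
    return lookups_per_year * price_per_lookup

lookups = 100_000  # hypothetical yearly volume for a mid-sized enterprise
manual = annual_cost(lookups, GREEN_SCREEN_PRICE)
brokered = annual_cost(lookups, BROKERAGE_PRICE)
print(f"Manual: {manual:.2f}, brokered: {brokered:.2f}, saved: {manual - brokered:.2f}")
# -> Manual: 50000.00, brokered: 5000.00, saved: 45000.00
```

At that volume the brokered route is a tenfold saving, which is the kind of return on investment in big reference data this post is about.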

In the United Kingdom the prices for public sector data about addresses, business entities and citizens are still relatively high. Royal Mail puts a licence fee on the PAF file even for government bodies. Ordnance Survey has made AddressBase free for the public sector, but there is a big price tag for the rest of society. The electoral roll has a price tag too, even though the data quality is questionable for uses other than the immediate intended purpose, as told in the post Inaccurately Accurate.

At the moment I’m looking into similar services for the United States and a lot of other countries. Generally speaking you can get your hands on most data for a price, and prices have come down since I last checked. There is also a tendency to lower or abandon the price for the most basic data, such as names, addresses and other identification data.

As poor data quality in contact data is a big cost for most enterprises around the world, the news of decreasing prices for big reference data is good news.

However, if you are doing business internationally, it is a daunting task to keep up with where to find the best and most cost effective big reference data sources for contact data, and not least how to use those sources in business processes.

On Wednesday the 25th of July I’m giving a presentation, in the cloud, on how iDQ™ comes to the rescue. More information at DataQualityPro.


Beyond Address Validation

The quality of contact master data is the number one data quality issue around.

Lately there has been a lot of momentum among data quality tool providers in offering services for getting at least the postal address in contact data right. The new services are improved by:

  • Being cloud based, offering validation services that are implemented at data entry and based on fresh reference data.
  • Being international and thus providing address validation for customer and other party data embracing a globalized world.
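To give a feel for what such a data entry validation service does, here is a deliberately tiny, offline sketch: a typed address is matched, ignoring case and extra whitespace, against a small in-memory slice of postal reference data. A real cloud service would of course do fuzzy matching against the full, freshly updated reference file; the sample addresses are just illustrations.

```python
# Offline sketch of address standardization at data entry.
# Keys are (normalized street, postal code); values are the standardized form.
REFERENCE_ADDRESSES = {
    ("10 downing street", "SW1A 2AA"): "10 Downing Street, SW1A 2AA London",
    ("buckingham palace road", "SW1W 9SR"): "Buckingham Palace Road, SW1W 9SR London",
}

def standardize(street, postal_code):
    """Return the standardized address, or None if it is not in the reference data."""
    key = (" ".join(street.lower().split()), postal_code.strip().upper())
    return REFERENCE_ADDRESSES.get(key)

print(standardize("10  DOWNING  Street", "sw1a 2aa"))
# -> 10 Downing Street, SW1A 2AA London
```

The point is that the correction happens at entry time, against fresh reference data, instead of in a downstream cleansing batch.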

Capturing an address that is aligned with the real world may have a significant effect on business outcomes as reported by the tool vendor WorldAddresses in a recent blog post.

However, a valid address based on address reference data only tells you that the address exists, not whether the addressee is (still) at the address, and you cannot be sure that the name and other master data elements are accurate and complete. Therefore you often need to combine address reference data with other big reference data sources, such as business directories and consumer/citizen reference sources.

Using business directories is not new at all. Big reference sources such as the D&B WorldBase and many other directories have been around for many years and have been a core element in many data quality initiatives with customer data in business-to-business (B2B) environments and with supplier master data.

Combining address reference data and business entity reference data makes things even better, also because business directories don’t always come with a valid address.

Using publicly available reference data when registering private consumers, employees and people in other citizen roles has until now been practiced in some industries and for special reasons. So the big reference data and the services are out there, being used today in some business processes.

Mashing up address reference data, business entity reference data and consumer/citizen reference data is a big opportunity for many organizations in the quest for high quality contact master data, as most organizations actually interact with both companies and private persons if we look at the total mix of business processes.
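A minimal sketch of what such a mash-up looks like in code: three lookup functions stand in for the three kinds of reference sources (their return values are placeholders, not real service responses), and the merge simply layers whatever each source can confirm onto the raw record.

```python
# Hedged sketch of mashing up three reference sources for one contact record.
# Each lookup is a stand-in for a real service; the returned values are
# illustrative placeholders, not actual directory content.

def lookup_address(raw: dict) -> dict:
    # a real service would return a standardized, validated address
    return {"address": raw.get("address", "").title(), "address_valid": True}

def lookup_business(raw: dict) -> dict:
    # a real directory would return the legal name and registration number
    return {"legal_name": raw.get("name", "").title(), "reg_no": "DK12345678"}

def lookup_consumer(raw: dict) -> dict:
    # a real citizen source would confirm the person at the address
    return {"person_confirmed": True}

def mash_up(raw: dict) -> dict:
    """Layer the confirmations from each reference source onto the raw record."""
    enriched = dict(raw)
    for source in (lookup_address, lookup_business, lookup_consumer):
        enriched.update(source(raw))
    return enriched
```

The value lies in the combination: no single source covers both companies and private persons, but the merged record can.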

The next big source is going to be exploiting social network profiles. As told in the post Social Master Data Management, social media will be an additional source of knowledge about our business partners. Again, you won’t find the full truth here either; you have to mash up all the sources.


Olympic Moments

The London 2012 Olympic Games are approaching. You feel that very well in London; for example, my usual walking path through Hyde Park is closed because of the upcoming sporting event.

I’m sure these games are going to produce some great moments. Some of the moments I remember from previous games have a data quality technology lesson attached.

The Fosbury Flop

In 1968 the American athlete Dick Fosbury introduced a better way of doing the high jump. What I find interesting about the Fosbury Flop is that this technique hasn’t always been possible. In the old days the jumpers landed in a sandpit. If you had done the flop then, it would certainly have been a flop, most probably getting you injured on the first attempt. But after deep foam matting was put in place, the flop became the better choice.

It’s the same with data quality technology. Some techniques for improvement that you previously found to be a flop may, because of new circumstances, be a good choice today. The highly esteemed scissors jump didn’t prevail forever.

Eddie the Eagle

In 1988, at the winter games, a Brit made a lot of headlines by being totally bad at ski jumping. Eddie the Eagle unsurprisingly finished far behind natural born Finnish, Norwegian and Czech ski jumpers, who come from countries where the first sign of the white fluffy stuff from above isn’t considered a severe weather condition. But Eddie set a new British record.

It’s the same with data quality technology. Some tools and services are leading in some countries, but have a hard time when challenged internationally.

Sailing under Wrong Flag

In the 2008 games something spectacular happened in the sailing competitions. The Danish 49er boat was in first place but broke the mast when leaving the harbor for the last race. The Croatian team offered their boat. The Danes sailed into the race long after the other boats had started, but managed to get a result just good enough to secure the gold. The other teams might have been confused by the wrong flag.

As told in the post Most Times the Home Team Wins flags are important – in sports, in data quality and other data management disciplines too.

2012

What do you guess will make a difference in this year’s Olympic Games? – And in Data Quality improvement?


State of this Data Quality Blog

Today is a big day on this blog as it has been live for 3 years.

Success versus Failure

The first entry, called Qualities of Data Architecture, was a promise to talk about data quality success stories. The reason for emphasizing success stories related to data quality is a feeling that data quality improvement is too often promoted with horror stories about how badly your business may fare if you don’t pay attention to data quality.

The problem is that stories about failure usually aren’t taken too seriously. Jim Harris recently had a very good take on that in the post Data Quality and Chicken Little Syndrome.

So, I plan to tell even more success stories along with the inevitable stories about failure that so easily and obviously could have been avoided.

Getting Social

Using social networks to promote your blogging is quite natural.

At the same time social networks have emerged as a new source for doing master data management (I call this Social MDM).

Exploring this new discipline over the hype peak, down through the valley of disappointment and up to the plateau of productivity will for sure be a recurring subject on this blog.

People, Processes and Technology

Sometimes you see a statement like “Data Quality is not about technology, it’s all about people”.

Well, most things we can’t solve easily are not just about one thing. In my eyes the old cliché about addressing people, processes and technology surely also applies to getting data quality right.

There are many good blogs around about people and processes. On this blog I’ll try to write from my comfort zone, which is technology, without forgetting people and processes.

The Hidden Agenda

Most people blogging do so to promote their (employers’) expertise, services and tools, and I am no different.

Lately I have written a lot about a second to none cloud based service for upstream data quality prevention. The wonder is called instant Data Quality.

While upstream prevention is the best approach to data quality, a lot of work must still be done every day in downstream cleansing, as told in the post Top 5 Reasons for Downstream Cleansing.

As I’m also working with a new stellar cloud based platform for data quality improvement productivity I will for sure share some props for that in the near future.


Data Driven Data Quality

In a recent article Loraine Lawson examines how a vast majority of executives describe their business as “data driven” and how the changing world of data must change our approach to data quality.

As said in the article the world has changed since many data quality tools were created. One aspect is that “there’s a growing business hunger for external, third-party data, which can be used to improve data quality”.

Embedding third-party data into data quality improvement especially in the party master data domain has been a big part of my data quality work for many years.

Some of the interesting new scenarios are:

Ongoing Data Maintenance from Many Sources

As explained in the Wikipedia article about data quality, services such as the US National Change of Address (NCOA) service and similar services around the world have been around for many years as a basic use of external data for data quality improvement.

Using updates from business directories like the Dun & Bradstreet WorldBase and other national or industry specific directories is another example.

In the post Business Contact Reference Data I have a prediction saying that professional social networks may be a new source of ongoing data maintenance in the business-to-business (B2B) realm.

Using social data in business-to-consumer (B2C) activities is another option, though one haunted by complex privacy considerations.

Near-Real-Time Data Enrichment

Besides providing updates to basic master data, business directories typically also contain a lot of other data of value for business processes and analytics.

Address directories may also hold further information like demographic stereotype profiles, geo codes and property data elements.

Appending phone numbers from phone books and checking national suppression lists for mailing and phoning preferences are other forms of data enrichment used extensively in direct marketing.

Traditionally these services have been implemented by sending database extracts to a service provider and receiving enriched files for uploading back from the service provider.

Lately I have worked with a new breed of self-service data enrichment tools placed in the cloud. They make it possible for end users to easily configure what to enrich from a palette of address, business entity and consumer/citizen related third-party data, and to execute the request as close to real time as the volume allows.

Such services also include the good old duplicate check, now much better informed by the inclusion of third-party reference data.
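Here is a small sketch of what “better informed” means in practice. When two records resolve to the same external identifier (here an illustrative company registration number), they match with certainty; only when no external key is available does the check fall back to fuzzy name comparison. The threshold value is an assumption for illustration.

```python
# Duplicate check informed by third-party reference data: a shared external
# key beats fuzzy matching every time.
from difflib import SequenceMatcher

def is_duplicate(a: dict, b: dict, name_threshold: float = 0.85) -> bool:
    reg_a, reg_b = a.get("reg_no"), b.get("reg_no")
    if reg_a and reg_b:
        return reg_a == reg_b  # deterministic match on the external key
    # probabilistic fallback on name similarity
    similarity = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return similarity >= name_threshold

a = {"name": "Acme Ltd", "reg_no": "GB0001"}
b = {"name": "ACME Limited", "reg_no": "GB0001"}
print(is_duplicate(a, b))  # -> True (same registration number)
```

Without the registration numbers, “Acme Ltd” versus “ACME Limited” would sit uncomfortably close to any name-similarity threshold; with them, the decision is trivial.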

Instant Data Quality in Data Entry

As discussed in the post Avoiding Contact Data Entry Flaws, third-party reference data such as address directories, business directories and consumer/citizen directories placed in the cloud may be used very efficiently in data entry functionality in order to get data quality right the first time and, at the same time, reduce the time spent on data entry work.

Not least in a globalized world, where people’s names reflect the diversity of almost any nation, where business names become more and more creative, and where data entry is done at shared service centers staffed with people from cultures with different address formatting rules, there is an increased need for data entry assistance based on external reference data.

When you mash up advanced search in third-party data and internal master data during data entry, you will solve most of the common data quality issues around avoiding duplicates and getting data as complete and timely as needed from day one.
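The mash-up at data entry can be sketched as a single search that looks in both the internal customer master and an external directory, so the person entering data can pick an existing or reference-confirmed record instead of typing a new, possibly duplicate one. The records below are made-up stand-ins for live data.

```python
# One search across the internal master and an (illustrative) external
# business directory, surfacing candidates before a new record is created.

INTERNAL_MASTER = [
    {"id": 1, "name": "Nordic Widgets A/S", "source": "internal"},
]
EXTERNAL_DIRECTORY = [  # stand-in for a live business directory service
    {"reg_no": "DK556677", "name": "Nordic Widgets A/S", "source": "directory"},
    {"reg_no": "DK998877", "name": "Widget World ApS", "source": "directory"},
]

def entry_search(query: str) -> list:
    """Return matching records from both sources, internal hits first."""
    q = query.lower()
    hits = [r for r in INTERNAL_MASTER if q in r["name"].lower()]
    hits += [r for r in EXTERNAL_DIRECTORY if q in r["name"].lower()]
    return hits

for hit in entry_search("widget"):
    print(hit["source"], hit["name"])
```

If the internal master already holds the party, a duplicate is avoided; if only the directory does, the new record starts life complete and reference-confirmed.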


Pulling Data Quality from the Cloud

In a recent post here on the blog, the benefits of instant data enrichment were discussed.

In the contact data capture context these are some examples:

  • Getting a standardized address at contact data entry makes it possible for you to easily link to sources with geo codes, property information and other location data.
  • Obtaining a company registration number or other legal entity identifier (LEI) at data entry makes it possible to enrich with a wealth of available data held in public and commercial sources.
  • Having a person’s name spelled according to available sources for the country in question helps a lot with typical data quality issues such as uniqueness and consistency.

However, if you are doing business in many countries, it is a daunting task to connect with the best of breed sources of big reference data. Add to that that many enterprises do both business-to-business (B2B) and business-to-consumer (B2C) activities, including interacting with small business owners. This means you have to link to the best sources available for addresses, companies and individuals.

A solution to this challenge is using Cloud Service Brokerage (CSB).

An example of a Cloud Service Brokerage suite for contact data quality is the instant Data Quality (iDQ™) service I’m working with right now.

This service can connect to big reference data cloud services from all over the world. Some are open data services in the contact data realm, some are international commercial directories, some are the wealth of national reference data services for addresses, companies and individuals, and even social network profiles are on the radar.
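The Cloud Service Brokerage idea can be sketched as one interface that routes each lookup to the right country-specific reference data provider. The provider functions below are illustrative placeholders, not actual iDQ™ internals.

```python
# Minimal sketch of brokering: one entry point, many country-specific
# reference data providers behind it.

def dk_lookup(name: str) -> str:
    return f"DK service result for {name}"   # placeholder provider

def uk_lookup(name: str) -> str:
    return f"UK service result for {name}"   # placeholder provider

PROVIDERS = {"DK": dk_lookup, "UK": uk_lookup}

def brokered_lookup(country: str, name: str) -> str:
    """Route the request to the provider registered for the country."""
    provider = PROVIDERS.get(country)
    if provider is None:
        raise ValueError(f"No reference data provider registered for {country}")
    return provider(name)

print(brokered_lookup("DK", "Acme ApS"))  # -> DK service result for Acme ApS
```

The consuming application only ever talks to the broker, so adding a new country is a matter of registering one more provider rather than rewiring every business process.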


Social MDM, Privacy and Data Quality

The term “Social MDM” has been promoted quite well this week, not least as part of the social media information stream from the ongoing user conference of the tool vendor Informatica.

In a blog post called Informatica 9.5 for Big Data Challenge #2: Social Jody Ko of Informatica introduces the opportunities and challenges.

In the closing remarks Jody says: “There’s still a long way to go to bring social data into the mainstream enterprise, in part due to concerns over privacy and the potential “creepiness” factor of mining social data.”

As I understand it the spearhead Social MDM part of the tool release is a Facebook App that provides connectivity between Facebook and the MDM solution.

Industry analyst R “Ray” Wang examines this in the blog post News Analysis: Informatica Launches MDM 9.5. The analysis states that it is now time to “drive data out of Facebook and not into Facebook”.

The opportunities and challenges of driving data out of Facebook were discussed in a post called, exactly, Out of Facebook here on the blog some years ago.

Balancing privacy with data hoarding is still for sure a subject that in no way is settled and probably never will be.

Connecting systems of record in traditional MDM solutions with social network profiles is in no way a walkover either. The classic data quality challenges with uniqueness of records and completeness of data only get more difficult, but there are also great opportunities for getting a better picture of your customers and other business partners.

If you are interested in Social MDM and the related challenges and opportunities there is a LinkedIn group for Social MDM.

The group is new, less than a month old at the present time, but there is already a lot of content to dip into.
