Rapid and Vanity Addressing – and the Apple Hotel

Mid next month iDQ will move our London office to a new address:

iDQ A/S
2nd Floor
Berkeley Square House
Berkeley Square
London
W1J 6BD
United Kingdom

It’s a good old English address, taking up a lot of lines on an envelope.

The address could, however, be either shorter or longer.

The address below will in fact be enough to have a letter delivered:

iDQ A/S
2nd Floor
W1J 6BD
UK

Due to the granular UK postal code system, a single postcode may cover a single address, a part of a long road, or a small street.

This structure is also what is exploited in so-called rapid addressing, where you type in only the needed data and the rest is supplied by a (typically cloud) service.
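
As a rough illustration, here is a minimal sketch of the idea in Python, assuming a tiny in-memory stand-in for the reference data service. The POSTCODE_INDEX dictionary and the expand_address function are hypothetical; a real rapid addressing service would of course cover far more than one postcode.

```python
# Hypothetical reference data keyed by postcode. In the granular UK system
# a single postcode often resolves to one building or a short stretch of road.
POSTCODE_INDEX = {
    "W1J 6BD": {
        "building": "Berkeley Square House",
        "street": "Berkeley Square",
        "town": "London",
        "country": "United Kingdom",
    },
}

def expand_address(postcode: str, addressee: str, premise: str) -> list[str]:
    """Expand the minimal typed input into full envelope lines."""
    hit = POSTCODE_INDEX.get(postcode.strip().upper())
    if hit is None:
        raise LookupError(f"Unknown postcode: {postcode}")
    # Only addressee, premise and postcode were typed; the rest is supplied.
    return [addressee, premise, hit["building"], hit["street"],
            hit["town"], postcode, hit["country"]]

print("\n".join(expand_address("W1J 6BD", "iDQ A/S", "2nd Floor")))
```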

But sometimes people want their addresses presented in a different way than the official one. Maybe I want our address to be:

iDQ A/S
2nd Floor
Berkeley Square House
Berkeley Square
Mayfair
London
W1J 6BD
United Kingdom

Mayfair is a nice part of London. Insisting on including this element in the address is an example of vanity addressing.

Here’s the map of the area:

Notice the place in the upper right corner of the Google Map: Apple Store Regent Street, shown with a bed icon, which means it’s a hotel. Is the Apple Store really a hotel? No – except for a while ago, when people slept in front of the store waiting for a product with a notable map service, as reported by Richard Northwood (aka The Data Geek) in the post Data Quality Failure – Apple Style.

Well, Google, you can’t win them all.

Doing MDM in the Cloud

As reported in the post What to do in 2012, doing Master Data Management (MDM) in the cloud is one of three trends within MDM that, according to Gartner (the analyst firm), will shape the MDM market in the coming years.

Doing MDM in the cloud is an obvious choice if all your operational applications are in the cloud already. Such a solution was presented on Informatica Perspectives in the blog post Power the Social Enterprise with a Complete Customer View. The post includes a video showing how a situation with multiple SalesForce.com instances within the same enterprise is supported by a master data backbone in the cloud.

But even if all your operational applications are on premise, you may start by lifting some master data management functionality up into the cloud. I am currently working with such a solution.

When onboarding customer (and other party) master data, much of the basic information needed is already known in the cloud. Therefore, lifting the onboarding functionality up into the cloud makes a lot of sense. This is the premise, so to speak, for the MDM edition of the instant Data Quality (iDQ) solution that we are working on these days.

Cloud services for the other prominent MDM domain, product master data, also make a lot of sense. As told in the post Social PIM, a lot of basic product master data may be shared in the cloud, embracing the supply chain of manufacturers, distributors, retailers and end users.

In both these cases some of the master data management functionality is handled in the cloud, while the data integration takes place where the operational applications reside, be that in the cloud and/or on premise.
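
A minimal sketch of that split, with the cloud MDM service mocked as a plain Python function (in real life it would be a call to the service over the network). All names here, like cloud_mdm_match and onboard_customer, are hypothetical, not an actual product API.

```python
def cloud_mdm_match(candidate: dict) -> dict:
    """Stand-in for the cloud MDM service: find a golden record id."""
    known = {"ACME LTD": "party-0001"}   # the master data backbone, mocked
    key = candidate["name"].upper()
    return {"golden_id": known.get(key), "matched": key in known}

def onboard_customer(local_db: dict, candidate: dict) -> str:
    """On-premise integration: consult the cloud, then write locally."""
    result = cloud_mdm_match(candidate)
    golden_id = result["golden_id"] or f"party-{len(local_db) + 1:04d}"
    local_db[golden_id] = candidate      # the integration stays on premise
    return golden_id

db: dict = {}
print(onboard_customer(db, {"name": "Acme Ltd", "city": "London"}))  # party-0001
```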

Free and Open Public Sector Master Data

Yesterday the Danish Ministry of Finance announced an agreement between local authorities and the central government to improve and link public registers of basic data and to make data available to the private sector.

Once the public authorities have tidied up, merged the data and put a stop to parallel registration, annual savings in public administration could amount to 35 million EUR in 2020.

Basic open data includes private addresses, companies’ business registration numbers, cadastral numbers of real properties and more. These master data are used for multiple purposes by public sector bodies.

Private companies and other organizations can look forward to large savings when they no longer have to buy their basic data from the public authorities.

In my eyes this is a very clever move by the authorities exactly because of the two main opportunities mentioned:

  • The public sector will see savings and related synergies from a centralized master data management approach
  • The private sector will gain a competitive advantage from better and affordable reference data accessibility and thereby achieve better master data quality.

Denmark has, along with the other Nordic countries, always had a more mature public sector master data approach than we see in most other countries around the world.

I remember working with the committee that prepared a single registry for companies in Denmark back in the 80’s, as mentioned in the post Single Company View.

Today I work with a solution called iDQ (instant Data Quality), which is about mashing up internal master data and a range of external reference data from social networks and, not least, public sector sources. In that realm there is certainly not something rotten in the state of Denmark. Rather, there is a good answer to the question of whether to be free and open or not to be.

Killing Keystrokes

Keystrokes are evil. Every keystroke represents a potential root cause of poor data quality: spelling things wrongly, putting the right thing in the wrong place, putting the wrong thing in the right place and so on. Besides that, every keystroke carries a labour cost, and all those keystrokes sum up to gigantic amounts of work.

In master data management (MDM) you will be able to get things right, and reduce working costs, by killing keystrokes wherever possible.

Killing keystrokes in Product Information Management (PIM)

I have seen my share of business processes where product master data are re-entered, or copied and pasted, from different sources: extracted from one product master data container and, often via spreadsheets, captured into another product master data container.

This happens inside organizations, and it happens in the ecosystem of business partners in supply chains encompassing manufacturers, distributors and retailers.

As touched upon in the post Social PIM, there might be light at the end of the tunnel with the rise of tools, services and platforms setting up collaboration possibilities for sharing product master data and thus avoiding those evil keystrokes.

Killing keystrokes in Party Master Data Management

With party master data there are good possibilities for exploiting external data from big reference data sources and thus avoiding the evil keystrokes. The post instant Data Quality at Work tells how a large utility company has gained better data quality, and reduced working costs, by using the iDQ™ service that way within customer on-boarding and other business processes related to customer master data maintenance.
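
As a simple sketch of the principle, assume a Python lookup against an external business registry (here a tiny in-memory dict; the names are hypothetical, not the actual iDQ™ API): one typed identifier in, a full prefilled record out.

```python
# Hypothetical external reference data source, e.g. a business registry.
BUSINESS_REGISTRY = {
    "12345678": {"name": "Example Energy A/S",
                 "address": "Sample Street 1, 2100 Copenhagen"},
}

def onboard_by_registration_number(reg_no: str) -> dict:
    """Prefill a customer record from reference data: one field typed in."""
    entry = BUSINESS_REGISTRY.get(reg_no)
    if entry is None:
        raise LookupError(f"No registry entry for {reg_no}")
    # Only reg_no was typed; name and address arrive keystroke-free.
    return {"registration_number": reg_no, **entry}

print(onboard_by_registration_number("12345678"))
```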

The next big thing in this area will be the customer data integration (CDI) part of what I call Social MDM, where you may avoid the evil keystrokes by utilizing the keystrokes already made in social networks by the people the master data is about.

instant Data Quality at Work

DONG Energy is one of the leading energy groups in Northern Europe with approximately 6,400 employees and EUR 7.6 billion in revenue in 2011.

The other day I sat down with Ole Andres, project manager at DONG Energy, and talked about how they have utilized a new tool called iDQ™ (instant Data Quality) to keep up the data quality of their customer master data.

iDQ™ is basically a very advanced search engine capable of being integrated into business processes in order to get data quality for contact data right the first time and, at the same time, reduce the time needed for looking up and entering contact data.

Fit for multiple business processes

Customer master data is used within many different business processes. DONG Energy has successfully implemented iDQ™ within several business processes, namely:

  • Assigning new customers to, and ending old customers on, installation addresses
  • Handling returned mail
  • Debt collection

Managing customer master data in the utility sector has many challenges, as there are different kinds of addresses to manage, such as installation addresses, billing addresses and correspondence addresses, as well as different approaches to private customers and business customers, including considering the grey zone between who is a private account and who is a business account.
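
To illustrate the modelling challenge, here is a minimal Python sketch of a party record carrying several address roles and the private/business grey zone. The dataclasses are illustrative only, not DONG Energy’s actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Address:
    role: str            # "installation", "billing" or "correspondence"
    lines: list[str]

@dataclass
class Customer:
    name: str
    kind: str            # "private", "business" or "mixed" (the grey zone)
    addresses: list[Address] = field(default_factory=list)

    def address_for(self, role: str) -> Address | None:
        """Pick the address playing a given role, if any."""
        return next((a for a in self.addresses if a.role == role), None)

c = Customer("Jane Doe", "mixed")
c.addresses.append(Address("installation", ["Main Road 1", "Anytown"]))
c.addresses.append(Address("billing", ["PO Box 7", "Anytown"]))
print(c.address_for("billing"))
```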

New technology requires change management

Implementing new technology in a large organization doesn’t just happen by itself. Old routines tend to stick around for a while. DONG Energy has put a lot of energy, so to say, into training the staff in reengineering business processes around customer master data on-boarding and maintenance, including utilizing the capabilities of the iDQ™ tool.

Acceptance of new tools comes with building up trust in the benefits of doing things in a new way.

Benefits in upstream data quality 

A tool like iDQ™ helps a lot with safeguarding the quality of contact data where data is born and when something happens in the customer data lifecycle. A side effect, which is at least as important, stresses Ole Andres, is that data collection goes much faster.

Right now DONG Energy is looking into further utilizing the rich variety of reference data sources that can be found in the iDQ™ framework.

instant Data Quality and Business Value

During the last couple of years I have been working with a cloud service called instant Data Quality (iDQ™).

iDQ™ is basically a very advanced search engine capable of being integrated into business processes in order to get data quality for contact data right the first time and, at the same time, reduce the time needed for looking up and entering contact data.

With iDQ™ you are able to look up what is known about a given address, company and individual person in external sources (I call these big reference data) and what is already known inside your internal master data.

Orchestrating the contact data entry and maintenance processes this way creates better data quality along with business value.
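
A minimal sketch of that mash-up in Python, with two tiny dicts standing in for big reference data and internal master data respectively; the function name and data are hypothetical, not the actual iDQ™ interface.

```python
EXTERNAL_SOURCES = {      # hypothetical big reference data
    "berkeley square house": {"postcode": "W1J 6BD", "town": "London"},
}
INTERNAL_MASTER_DATA = {  # hypothetical internal master data records
    "berkeley square house": {"customer_id": "C-042", "status": "active"},
}

def instant_lookup(term: str) -> dict:
    """One search over what is known externally and internally."""
    key = term.strip().lower()
    return {
        "external": EXTERNAL_SOURCES.get(key, {}),
        "internal": INTERNAL_MASTER_DATA.get(key, {}),
        # Flagging an existing record at entry time prevents duplicates.
        "already_known": key in INTERNAL_MASTER_DATA,
    }

print(instant_lookup("Berkeley Square House"))
```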

The testimonials from current iDQ™ clients tell that story.

DONG Energy, a leader in providing clean and reliable energy, says:

[Testimonial from DONG Energy]

From the oil and gas industry, Kuwait Petroleum, a company with trust as a core value, adds in:

[Testimonial from Kuwait Petroleum (Q8)]

In the non-profit sector, the DaneAge Association, an organization supporting and counselling older people to make informed decisions, also gets it:

[Testimonial from the DaneAge Association]

You may learn more about iDQ™ on the instant Data Quality site.

Developing LEGO® bricks and SOA components

These days the Lego company is celebrating 80 years in business. The celebration includes a YouTube video telling The LEGO® Story.

As I was born close to the Lego home in Billund, Denmark, I also remember having a considerable amount of Lego bricks to play with as a child in the 60’s.

In computer software, Lego bricks are often used as a metaphor for building systems with Service Oriented Architecture (SOA) components, as discussed for example in the article Can SOA and architecture really be described with ‘Lego blocks’?

Today using SOA components in order to achieve data quality improvement with master data is a playground for me.

As described in the post Service Oriented Data Quality, SOA components have a lot to offer:

• Reuse is one of the core principles of SOA. Having the same data quality rules applied to every entry point of the same sort of data will help with consistency.

• Interoperability will make it possible to deploy data quality prevention as close to the root as possible.

• Composability makes it possible to combine functionality with different advantages – e.g. combining internal checks with external reference data, as sketched below.
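
Here is a minimal Python sketch of those principles, assuming two small, independently reusable checks composed into one validation service; the rules and names are illustrative, not a specific product’s components.

```python
import re
from typing import Callable

Check = Callable[[str], list[str]]   # a service takes a value, returns issues

def internal_format_check(postcode: str) -> list[str]:
    """Reusable rule applied at every entry point (reuse)."""
    pattern = r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$"  # rough UK postcode shape
    return [] if re.match(pattern, postcode) else ["malformed postcode"]

def external_reference_check(postcode: str) -> list[str]:
    """Would call an external reference data service; stubbed here."""
    known = {"W1J 6BD"}
    return [] if postcode in known else ["postcode not in reference data"]

def compose(*checks: Check) -> Check:
    """Combine independent services into one pipeline (composability)."""
    return lambda value: [issue for check in checks for issue in check(value)]

validate = compose(internal_format_check, external_reference_check)
print(validate("W1J 6BD"))   # []
print(validate("XYZ 123"))   # issues from both checks
```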

Searching for Data Quality (and Decency)

As I have mentioned here on the blog (maybe even too often), I am right now involved in making the roadmap for, and promoting, a tool for getting better data quality by searching and mashing up available external information in the cloud and in internal master databases.

The tool is called iDQ (instant Data Quality).

In promoting such a solution we are interested in engaging in a dialogue with people who are searching for data quality.

So are a lot of other vendors in the data quality tool market of course.

In that quest, vendors are looking to achieve a better ranking in search engines when people search for data quality, data cleansing and similar terms.

An often used technique for that is link building. Here you (over)use the terms data quality, data cleansing and so on, and every time you make a link from the term to your home page.

Examples are the blog posts from DQ Global and an endless stream of data quality news from Experian QAS.

However, some vendors’ link building is done not only on their own blogs and news lists but also on other sites, for example by making comments on this blog.

Examples are this one linking to Experian QAS and this one linking to HelpIT.

It is my impression that these comments are made by SEO agencies hired by the vendors. The agencies make comments under a random name, like in these cases “Smith” (ah, John Smith, I know him) and “Peter Parker” (or is it Spider-Man?).

Methinks: this may help promote tools when people are searching for data quality. But it doesn’t help with finding decency.

Return on Investment in Big Reference Data

Currently I’m working with a cloud based service where we are exploiting available data about addresses, business entities and consumers/citizens from all over the world.

The cost of such data varies a lot around the world.

In Denmark, where the product was born, the costs of such data are relatively low. The joys of the welfare state also apply to access to open public sector data, as reported in the post The Value of Free Address Data. You are also able to check the identity of an individual in the citizen hub. Doing it online on a green screen you will be charged (what resembles) 50 cents, but doing it via cloud service brokerage, as in iDQ™, will only cost you 5 cents.

In the United Kingdom the prices for public sector data about addresses, business entities and citizens are still relatively high. The Royal Mail has a license tag on the PAF file even for government bodies. Ordnance Survey gives the rest of AddressBase free to the public sector, but there is a big price tag for the rest of society. The electoral roll has a price tag too, even if the data quality isn’t fit for other uses than the intended immediate purpose, as told in the post Inaccurately Accurate.

At the moment I’m looking into similar services for the United States and a lot of other countries. Generally speaking, you can get your hands on most data for a price, and the prices have come down since I last checked. There is also a tendency toward lowering or abandoning the price for the most basic data, such as names, addresses and other identification data.

As poor data quality in contact data is a big cost for most enterprises around the world, the news of decreasing prices for big reference data is good news.

However, if you are doing business internationally, it is a daunting task to keep up with where to find the best and most cost effective big reference data sources for contact data, and not least how to use those sources in business processes.

On Wednesday the 25th of July I’m giving a presentation, in the cloud, on how iDQ™ comes to the rescue. More information on DataQualityPro.

The Big Tower of Babel

Three years ago one of the first posts on this blog was called The Tower of Babel.

This post was the first of many about multi-cultural challenges in data quality improvement. These challenges include not only language variations but also different character sets reflecting different alphabets and script systems, naming traditions, address formats, measurement units, privacy norms and government registration practices, to name some of the ones I have experienced.

When organizations work internationally, it may be tempting to build a new Tower of Babel by imposing the same language for metadata (probably English) and the same standards for names, addresses and other master data (probably those of the country where the headquarters is).

However, building such a high tower may end up the same way as the Tower of Babel known from the old religious tales.

Alternatively, a mapping approach may be technically a bit more complex but much easier when it comes to change management.

The mapping approach is used in the Universal Postal Union’s (UPU) attempt to make a “standard” for worldwide addresses. The UPU S42 standard is mentioned in the post Down the Street. The S42 standard does not impose the same way of writing on envelopes all over the world, but facilitates mapping the existing ways into a common tagging mapped onto a common structure.
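
As a rough Python sketch of the mapping idea, in the spirit of S42 (the tag names and rendering orders below are simplified illustrations, not the actual S42 specification): addresses are stored once in a common tagged structure and rendered in each country’s local convention.

```python
# Per-country rendering order over common tags (hypothetical simplification).
RENDER_ORDER = {
    "GB": ["addressee", "premise", "thoroughfare", "town", "postcode"],
    "DK": ["addressee", "thoroughfare", "postcode", "town"],
}

def render(address: dict, country: str) -> str:
    """Render one common tagged structure in the local envelope convention."""
    return "\n".join(address[tag] for tag in RENDER_ORDER[country]
                     if address.get(tag))

addr = {"addressee": "iDQ A/S", "premise": "2nd Floor",
        "thoroughfare": "Berkeley Square", "town": "London",
        "postcode": "W1J 6BD", "country": "United Kingdom"}
print(render(addr, "GB"))
```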

Building such a mapping based “standard” for addresses, and other master data with international diversity, in your organization may be a very good way to balance the need for standardization against the risks in change management, including having trusted and actionable master data.

The principle of embracing and mapping international diversity is a core element in the service I’m currently working with. It’s not that the instant Data Quality service doesn’t stretch into the clouds: certainly it is a cloud service pulling data quality from the cloud. And it’s not that it isn’t big: certainly it is based on big reference data.
