The digital age has many consequences for our lives, and the next big reform is the end of time zones.
As most shops nowadays are web shops open 24/7, and many people work around the clock from home, while traveling, and anywhere else, we really don’t need time zones around the world anymore.
Therefore, the United Nations has decided that everyone will be on UTC from 1st January 2020.
There will only be a few exceptions:
- The US Midwest will g.. d.. it stay on their usual time zone.
- Switzerland will have its own separate time zone, the so-called cuckoo clock time.
- The UK prime minister has decided that there will first be a referendum about this in the UK if he wins the next three general elections.
Within data management we already have “The MDM Institute”, “The Data Governance Institute” and “The Data Warehouse Institute (TDWI)” and now we also have “The Data Matching Institute”.
The founder of The Matching Institute is Alexandra Duplicado. Aleksandra says: “The reason I founded The Institute of Data Matching is that I am sick and tired of receiving duplicate letters with different spellings of my name and address.” Alex is also pleased that she has now found a nice office within edit distance of her home.
Before founding The Matching of Data Institute, Alexander worked at the Universal Postal Union with responsibility for extra-terrestrial partners. When talking about the future of The Match Institute, Sasha remarks: “It is a matter of not being too false positive. But it is a unique concept.”
One of the first activities for The Data-Matching Institute will be organizing a conference in Brussels. Many tool vendors such as Statistical Analysis System Inc., Dataflux and SAS Instiute will sponsor the Brüssel conference. “I hope to join many record linkage friends in Bruxelles,” says Alexandre.
The Institute of Matching of Data also plans to offer a yearly report on the capabilities of the tool vendors. Asked about when that is going to happen, Aleksander says: “Without being too deterministic, a probabilistic release date is the next 1st of April.”
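Since the institute’s founder keeps appearing under different spellings, it is worth showing what “edit distance” actually computes. As an assumption, the sketch below uses the Levenshtein variant: the minimum number of single-character insertions, deletions and substitutions needed to turn one string into another, which is a common building block in data matching tools.

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))  # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        curr = [i]  # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,                  # deletion of ca
                curr[j - 1] + 1,              # insertion of cb
                prev[j - 1] + (ca != cb),     # substitution (free if equal)
            ))
        prev = curr
    return prev[-1]

print(levenshtein("Alexandra", "Aleksandra"))  # → 2 (substitute x→k, insert s)
```

So an office “within edit distance” of home would be a very short commute indeed.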
11th of November, and it’s time for the first Xmas post on this blog this year. My London gym is to blame for this early start.
Santa’s residence is disputed. As told in the post Multi-Domain MDM, Santa Style, one option is Lapland.
Yesterday this yuletide challenge was included in an email in my inbox:
Nice. Lapland is in Northern Scandinavia. Scandinavia belongs to the half of the world where the comma is used as the decimal mark, as shown in the post Your Point, My Comma.
So while the UK-born gym members will be near fainting doing several thousand kilometers, I will claim the prize after an easy 3 kilometers and 546 meters on the cross trainer.
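The ambiguity is easy to reproduce: the same string “3,546” means three-and-a-bit kilometers under a comma-decimal convention and three-and-a-half thousand under a point-decimal one. A minimal sketch (the parsing rule is a simplifying assumption, not a full locale implementation):

```python
def parse_distance_km(text: str, decimal_mark: str) -> float:
    """Parse a distance string under a given decimal-mark convention.

    decimal_mark is "," in most of continental Europe and "." in the
    UK/US; the other symbol is treated as a thousands separator and
    simply dropped.
    """
    thousands = "." if decimal_mark == "," else ","
    return float(text.replace(thousands, "").replace(decimal_mark, "."))

challenge = "3,546"
print(parse_distance_km(challenge, decimal_mark=","))  # → 3.546  (Scandinavian reading)
print(parse_distance_km(challenge, decimal_mark="."))  # → 3546.0 (UK reading)
```

Same data, two valid real-world interpretations, one gym prize.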
How do you select a Master Data Management (MDM) vendor? There is of course the RFP way of scoring vendors against a bunch of carefully specified requirements within data model, user interface, architecture and so on. But as I have seen it, the multi-domain way may be used much more often.
The multi-domain MDM vendor selection process has three basic parameters:
- Distance between locations
- Chemistry between parties
- Price of products
Distance between locations:
Here you measure four numbers:
- N1 = Northern UTM geocode of the buyer’s headquarters
- E1 = Eastern UTM geocode of the buyer’s headquarters
- N2 = Northern UTM geocode of the vendor’s headquarters or major regional office
- E2 = Eastern UTM geocode of the vendor’s headquarters or major regional office
Then, using the Pythagorean Theorem, you get:
Distance = √((N1 − N2)² + (E1 − E2)²)
(You could look up the distance on Google Maps as well, but that doesn’t look very scientific.)
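Under the simplifying assumption that both headquarters sit in the same UTM zone (so the northing/easting values share one grid), the distance step is plain Pythagoras. A sketch with made-up geocodes (not real office locations):

```python
import math

def hq_distance_m(n1: float, e1: float, n2: float, e2: float) -> float:
    """Straight-line distance in meters between two UTM points
    in the same zone, via the Pythagorean theorem."""
    return math.hypot(n1 - n2, e1 - e2)

# Hypothetical UTM geocodes: buyer HQ vs vendor HQ.
buyer_n, buyer_e = 6_175_000.0, 708_000.0
vendor_n, vendor_e = 6_180_000.0, 720_000.0
print(round(hq_distance_m(buyer_n, buyer_e, vendor_n, vendor_e)))  # → 13000
```

For parties in different UTM zones (or on different continents) you would need a proper geodesic calculation, but then again, so would the scoring model.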
Chemistry between parties:
Here, at the meetings between the buying team and the vendor team, you measure the occurrence of these sentences:
- Could you repeat that question please?
- Could you repeat that answer please?
(Observe that there may be a correlation with distance in cases where distance calls for the use of a WebEx for the meeting.)
Price of products:
I guess everyone knows how to sum up euros/dollars/pounds/whatever.
Magic Quadrants from Gartner are the leading analyst report sources within many IT-enabled disciplines. This is also true in the data management realm, and one of the quadrants here is the Gartner Magic Quadrant for Master Data Management of Product Data Solutions.
The latest version of this quadrant was out in November last year, as reported in the post MDM for Product Data Quadrant: No challengers. A half visionary.
Most quotations after a quadrant release are vendors bragging about their position in the quadrant, and this habit will possibly repeat itself when the next quadrant for product MDM is out.
But I think Gartner has got it all wrong here all these years. As I have seen it, Microsoft is the true leader and the rest of the flock are minor niche players.
Today it has been announced that the European Union will regulate the use of the term “big data”.
“The volume of misuse of the term big data has gone way over what is acceptable,” says an EU spokesperson. Therefore the Commission will initiate a snap roadmap for legislation under which every use of the term big data must be approved by the authorities beforehand.
A variety of ways to declare that your use of the term big data has been approved will be put into force for the different languages used within the Union. So far, France has announced that “big data appellation d’originalité contrôlée” will be used there.
Velocity is the word that best describes the planned process for clamping down on the misuse of the term big data. By 2020 every member state must have started the legislation process, and not later than 2025 the rules must be implemented in national laws. However, there is a great deal of skepticism over whether things could move that fast.
This blog has earlier had some December blog posts about how Santa Claus deals with data quality (Santa Quality) and master data management (Multi-Domain MDM Santa Style).
As I like to be at the top of the hype curve, I was preparing a post about how Santa digs into big data, including social data streams, to be better at finding out who is nice and who is naughty and what they really want for Christmas. But then I suddenly had a light bulb moment: Wait, why don’t you take your own medicine and look up who that Santa guy really is?
Starting in social media, checking Twitter accounts was shocking: all profiles are fake. Facebook, LinkedIn and other social networks all turned out to have no real Santa Claus. Going to commercial third-party directories and open government data had the same result: no real Santa Claus there. Some address directories had a related postal code, like the postcode “H0 H0 H0” in Canada and “SAN TA1” in the UK, but those seem to be kind of fake too.
So, shifting from relying on the purpose of use to real-world alignment, I have concluded that Santa Claus doesn’t exist and therefore can’t have a data store looking like a toy elephant or any other big data operations going on.
Also, based on the above instant data quality mash-up, I won’t register Santa Claus (Inc.) as a prospective customer in my CRM system. Sorry.