Facebook is set to fight fake news by using artificial intelligence. A good way to practice might be to play a bit more with their geolocation intelligence first.
Today I am, as far as I know, in the Canary Islands. This is a part of Spain, though a little bit away from the motherland, in the Atlantic Ocean off the North African coast. One of the main towns on the islands is called Las Palmas.
However, according to Facebook I seem to be in a place called Las Palmas Subdivision in Hawaii, in the Pacific Ocean on the other side of the globe, with Hawaii being quite a bit away from where I was the last time I looked at a map.
Welcome to the classroom, Rick Buijserd from The Netherlands, as the next guest blog post author:
As a child you were happy when the bell rang and the school day ended. It was time to play with your friends and not think about learning anymore, just play! Most of us look back at this time as the best time of our lives: a time without any worries, enjoying every moment of it. Even though it wasn't the main focus as a child, it was also the time in which we learned new ideas and things every day. Are we still learning every day? Are you learning new things about data management every day? You should, and here is why…
Data is the new oil, and many of us make a decent living by advising or consulting companies in this area of expertise. But as time goes by, so do the developments, and in the technology world this goes fast, very fast. In the last couple of years the data environment has become bigger and bigger. First there was just data in companies; now you have to combine sources of data to get a clear view, and the sources keep on changing. Big data used to be a word that was undefined and unusable. For many it still is, but others use big data to enrich their companies and enable growth. Just summing this up shows the changes that have happened in the last couple of years, and you have to keep up to stay relevant. Learning and gaining knowledge is the only key to success in the long term. Artificial Intelligence and Machine Learning, powered by optimal use of data and data management, will take over many tasks, but in the end human creativity and the ability to learn will provide success and the power to make the difference.
Data Management is never finished and neither is learning about it
If you have been in the world of data management for a while, you know that data management is never finished, and neither is the opportunity to gain knowledge. New books about data management are published regularly, and research firms keep on researching and making new discoveries. Many companies use the evolution of technology to grow, and communities are built around these topics on many different platforms. The possibility to learn is everywhere! Use it to your benefit; data management is never finished…
Rick Buijserd is author and owner of the platform Data Management Experts and a young professional with experience in the world of data. He started his career at a well-known software vendor as channel manager, where he learned the skills of indirect sales and managing partners. Finance, HR, logistics, warehousing and PSA were the main elements of his software sales. Building relationships with experts and other vendors is part of his DNA.
After a couple of years he decided to make a switch and landed in the world of accountancy firms. In this period he established himself as a trusted advisor to many accountancy firms in The Netherlands. The areas of finance, financial reporting, tax, auditing and other accountancy-related activities are no secret to him. Together with his clients he developed many solutions to solve their challenges, and in this period his love for data management surfaced. Accountancy firms are the ultimate example of being data driven. It is all they know.
In the most recent period of his career he stepped into the world of multinationals, and as of today he is still active in this world, advising on data management and selling software solutions to multinationals that have challenges in this area. He is also an expert in social selling via LinkedIn and has brought this knowledge into practice via a LinkedIn Group for Dutch Data Management Experts, in which he gathers the top data management experts from the largest companies in The Netherlands to discuss all kinds of data-related topics.
The figure below shows the cross-border data flows on this planet. There are intra-regional data flows and there are flows between the worldwide regions:
Now, a small part of this data will be product data exchanged between trading partners participating in global business ecosystems. While I have no data on whether product data is distributed in the same proportions as data in general, it is a qualified guess that the picture looks somewhat the same.
Exchanging product data across borders has some challenges:
Language is an issue. Product data will eventually have to be translated into the language of the end buyer, if this is not the language in which the product data is originally provided. The definitions (metadata) of product data will also be subject to translation. Even the language of the transmission tools will not be English everywhere.
Regulations around product data are different from country to country.
The culturally optimal content of the data describing a product, in structured data elements and related digital assets, differs between countries and regions.
Our company Product Data Lake has relocated again. Our new address, in local language and format, is:
1058 København K
If our address were spelled and formatted as in England, where the business plan was drafted, the address would have looked like this:
The Old Seed Office
39 Harbour Street
Copenhagen, 1058 K
Across the pond, a sunny address could look like this:
39 Harbor Drive
Copenhagen, CR 1058
U.S. Virgin Islands
Now, the focal point of Product Data Lake is not the exciting world of address data quality, but product data quality.
However, the local and global linguistic and standardization – or should I say standardisation – issues are the same.
Our lovely city Copenhagen has many names. København in Danish. Köpenhamn in Swedish. Kopenhagen in German. Copenhague in French.
So do all the nice products in the world. Their classifications and related taxonomies are in many languages too. Their features can be spelled in many languages or depend on the country where the product is to be sold. The documents that must accompany a product by regulation are subject to diversity too.
Handling all this diversity stuff is a core capability for product data exchange between trading partners in Product Data Lake.
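The multilingual names, classifications and country-dependent features described above can be sketched in code. This is a minimal illustration of the idea only; the attribute names, language codes and classification codes are made up for the example and do not reflect the actual Product Data Lake data model.

```python
# Illustrative multilingual product record (not the actual
# Product Data Lake data model).

product = {
    "id": "SKU-1058",
    "names": {            # the same product name per language
        "da": "København-krus",
        "en": "Copenhagen mug",
        "de": "Kopenhagen-Becher",
        "fr": "Mug Copenhague",
    },
    "classifications": {  # the same product under different standards
        "UNSPSC": "52151504",    # hypothetical codes for illustration
        "eClass": "16-02-01-01",
    },
}

def name_for(product, language, fallback="en"):
    """Pick the product name in the buyer's language, with a fallback."""
    return product["names"].get(language, product["names"][fallback])

print(name_for(product, "da"))  # the Danish name
print(name_for(product, "sv"))  # no Swedish name yet, falls back to English
```

The point of the sketch is that neither the names nor the classifications can be collapsed into a single "master" language or standard; the record has to carry the variance.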
In this blog post Gautam examines the challenges, the key questions and the concept options an organization has when embarking on a journey from a national (or regional) scale to an international scale in Product Information Management.
In his post Gautam states: “A Global PIM is not a consolidation exercise. Variance is the reality, and it has to be supported.”
This resonates very well with my findings. In very practical terms, this means that you will not win by translating all product descriptions into English. Even the metadata has to be multilingual, as you will interact with trading partners using different languages. While one public standard for product information may be king in one region, this will most likely not be the case in another region, which again affects how you collaborate with trading partners in different geographies.
In my eyes the global PIM journey does not end with consensus and a common platform based on either concept inside your organization. You have to embrace your business ecosystem of trading partners. How to do that is explained in the post What a PIM-2-PIM Solution Looks Like.
One of the most intriguing sides of data quality and Master Data Management (MDM) is, in my eyes, how you can extend a national solution to an international solution.
Many implementations start with a national scope, and we also see many tools and services built for a national scope. Success on a national scale unfortunately does not always guarantee success on an international scale.
Besides all the important stuff around different culture challenges and how to drive change management in an international environment, there are also some things about the master data itself that are challenging.
Location Master Data is probably the most obvious domain where we face issues when going international. Postal addresses are formatted differently around the world: approximately half of the world puts the house number in front of the street name, approximately half puts it after the street name, and in some places addresses are not numbered along a street but within blocks. City and postal code have the same issue. The worst solutions here try to squeeze the rest of the world into the first implemented national solution, as told in the post Nationally International.
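The house-number-before-or-after-the-street issue can be made concrete with country-specific templates. This is a simplified sketch, reusing the address examples from earlier in this post; real postal rules hold many more variations, and the Danish street name here is purely illustrative.

```python
# Simplified country-specific address templates (illustrative only;
# real postal formatting rules are far richer than this).

ADDRESS_TEMPLATES = {
    "DK": "{street} {number}\n{postal_code} {city}",            # number after street
    "US": "{number} {street}\n{city}, {region} {postal_code}",  # number first
    "GB": "{number} {street}\n{city}\n{postal_code}",
}

def format_address(country, **parts):
    """Render address parts using the convention of the given country."""
    return ADDRESS_TEMPLATES[country].format(**parts)

# Danish convention: street name first ("Havnegade" is a made-up street).
print(format_address("DK", street="Havnegade", number="39",
                     postal_code="1058", city="København K"))

# US convention: house number first, region between city and postal code.
print(format_address("US", number="39", street="Harbor Drive",
                     city="Copenhagen", region="CR", postal_code="1058"))
```

A solution that hard-codes the first template for every country is exactly the "nationally international" trap the post warns about.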
Party Master Data, also when looking beyond postal addresses, must encompass many national constraints and opportunities, not least when it comes to exploiting third-party data:
Utilizing business directories is one common way. Here you have to balance using many different best-of-breed national providers against taking data from a more harmonized provider of an international directory. Where I (also) work right now, we have chosen the latter solution, as reported in the post Using a Business Entity Identifier from Day One.
If you, like me, come from Scandinavia, you may also be amazed by the difficulties that arise around the world in healthcare, elections and other areas when there is no publicly available national identifier for citizens, as examined in the post Counting Citizens.
Product Master Data does in many ways look the same across countries. However, standards for product data are often still specific to a single country or a specific range of countries. Also, if the national implementation was not in a country with multiple languages and the international scope includes more languages, you must encompass multilingual capabilities for product information management.
What have you experienced when going from national to international?
Within Product Information Management (PIM) there is a growing awareness that sharing product information between trading partners is a very important issue.
So, how do we do that? We could do that, on a global scale, by using:
2,345,678 customer data portals
901,234 supplier data portals
Spreadsheets are the most common means of exchanging product information between trading partners today. The typical scenario is that a receiver of product information, being a downstream distributor, retailer or large end user, will have a spreadsheet for each product group that is sent to each supplier to be filled in each time a new range of products is to be on-boarded (and potentially each time a new piece of information is needed). As a provider of product information, being a manufacturer or upstream distributor, you will receive a different spreadsheet to be filled in from each trading partner each time you are to deliver a new range of products (and potentially each time they need a new piece of information).
A customer data portal is a concept a provider of product information may have, plan to have or dream about. The idea is that each downstream trading partner can go to your customer data portal, structured in your way and your format, whenever they need product information from you. Your trading partner will then only have to deal with your customer data portal – and the 1,234 other customer data portals in their supplier range.
A supplier data portal is a concept a receiver of product information may have, plan to have or dream about. The idea is that each upstream trading partner can go to your supplier data portal, structured in your way and your format, whenever they have to deliver product information to you. Your trading partner will then only have to deal with your supplier data portal – and the 567 other supplier data portals in their business-to-business customer range.
Product Data Lake is the sound alternative to the above options. Hailstorms of spreadsheets do not work, and if everyone has either a passive customer data portal or a passive supplier data portal, no one will exchange anything. The solution is that you, as a provider of product information, push your data in your structure and format into Product Data Lake each time you have a new product or a new piece of product information. As a receiver, you set up pull requests that give you data in your structure and format each time you have a new range of products, need a new piece of information, or your trading partner has a new piece of information.
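The push/pull idea can be sketched in a few lines: the provider pushes records in its own structure, and the receiver pulls them through a mapping into its own structure. The shared store and all attribute names below are illustrative stand-ins, not Product Data Lake's actual API.

```python
# Illustrative push/pull exchange between trading partners
# (a stand-in sketch, not Product Data Lake's actual API).

lake = []  # stand-in for the shared data store

def push(record):
    """Provider side: push a product record as-is, in the provider's structure."""
    lake.append(record)

def pull(mapping):
    """Receiver side: pull all records, renamed into the receiver's structure."""
    return [
        {target: rec[source] for target, source in mapping.items() if source in rec}
        for rec in lake
    ]

# The provider uses its own attribute names ...
push({"ItemNo": "SKU-1058", "Desc_EN": "Copenhagen mug", "WeightKg": 0.35})

# ... while the receiver maps them onto the names its own PIM expects.
print(pull({"sku": "ItemNo", "name": "Desc_EN", "weight_kg": "WeightKg"}))
```

The design point is that neither side has to adopt the other's structure: each trading partner keeps its own format, and the mapping does the translation in between.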