Liliendahl.com

At Liliendahl.com we have two main business areas:

Starting up the Product Data Lake by Liliendahl.com: A cloud service for sharing product data in the business ecosystems of manufacturers, distributors, retailers and end users of product information.

Advisory services: Data Governance, Data Quality and Master Data Management consultancy for tool and service vendors as well as end users.

Specialist in Data Matching, International aspects of data quality, Multi-domain Master Data Management (MDM), Product Information Management (PIM), Exploiting Third Party Data and Big Data Quality.

Current and previous end clients include: Pandora, Nordea, Toyota Material Handling, Sanistål, Takeda, Wolseley, University of Bristol, Marks & Spencer, Stibo, Dun & Bradstreet, Experian, Thomas Cook, Bestseller, CP Kelco.

Registered in Denmark under combined registration and VAT number: 21235539. DUNS Number: 305630377. Company page on LinkedIn here.

Product Data Lake by Liliendahl.com


The Product Data Lake will be released in September 2016.

Why are we launching the Product Data Lake?

Sharing product data within business ecosystems has grown dramatically in recent years, driven by the increased use of ecommerce and other customer self-service sales approaches.

Most initiatives around handling product data have focused on internal processes and technology, and there are many viable solutions for that today. However, we have not seen many solutions that solve the problems in the exchange zones between trading partners.

Most companies participating in cross-company supply chains use spreadsheets for exchanging product data. Doing so is cumbersome and error-prone, and in most cases it does not provide the data quality needed for self-service-ready product data.

The Product Data Lake is the solution to end the hailstorms of spreadsheets and automate the sharing of product data.

What is the Product Data Lake?

The Product Data Lake is a cloud service for sharing product data in the business ecosystems of manufacturers, distributors, retailers and end users of product information.

The Product Data Lake ensures:

  • Completeness of product information by enabling trading partners to exchange product data in a uniform way
  • Timeliness of product information by connecting trading partners in a process driven way
  • Conformity of product information by encompassing various international standards for product information
  • Consistency of product information by allowing upstream and downstream trading partners to interact using their own in-house structures of product information
  • Accuracy of product information by ensuring transparency of product information across the supply chain


Take A Quick Tour around the Product Data Lake.

Get involved

Join the intersection of Big Data and Product Information Management (PIM). Become a:

Meet us in London on the 28th and 29th of September 2016 at the combined Customer Contact, eCommerce and Technology for Marketing exhibition.



12 thoughts on “Product Data Lake by Liliendahl.com”

    • Hi Kate. I think the Product Data Lake and Actualog complement each other very well in the space where PIM solutions don’t go, namely the exchange zones between trading partners. The Product Data Lake is not so social, I have to admit 😉 Actualog is more geared towards complex products, I think. There will be a lot of synergy.

    • Thanks for commenting James. That is true on a high level. The difference is equal to the difference between a traditional data warehouse and the data lake concept, which is growing in popularity within big data. In a data warehouse, you do the ETL before putting data in there. In a data lake you do the linking and transforming when data is consumed.

      From recent experience working with product data outside highly regulated industries such as pharmaceuticals and, to some degree, food, there is a great need for an agile and process-driven approach to sharing product data.
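As an editorial aside, the warehouse-versus-lake distinction described above can be shown in a few lines of Python. Everything here (the records, the weight field, the `to_grams` helper) is invented for illustration: the warehouse transforms data once at ingest, while the lake keeps the raw records and lets each consumer transform them when the data is consumed.

```python
# Illustrative records, as a trading partner might supply them (invented data).
raw_records = [
    {"sku": "A-100", "weight": "2,5 kg"},
    {"sku": "B-200", "weight": "500 g"},
]

def to_grams(text):
    # Hypothetical helper: normalise "2,5 kg" / "500 g" style values to grams.
    value, unit = text.replace(",", ".").split()
    return float(value) * (1000 if unit == "kg" else 1)

# Warehouse style: Extract-Transform-Load, so only cleansed data is stored.
warehouse = [{"sku": r["sku"], "weight_g": to_grams(r["weight"])}
             for r in raw_records]

# Lake style: store the raw records as-is; each consumer links and
# transforms at the moment the data is consumed.
lake = list(raw_records)
consumed = [to_grams(r["weight"]) for r in lake]
```

The trade-off is the one described in the comment: the warehouse pays the transformation cost up front and once, while the lake defers it to every consumer, in exchange for agility on the providing side.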

  1. PS: From very recent experience, there is a need in pharmaceuticals as well, when we start looking at other things than active ingredients, such as the buy side of excipients, supplies, and spare parts.

  2. Pingback: Excellence vs Excel | Liliendahl on Data Quality

    • Hi Caspar. Yes and no. We are on the same mission but have different starting points. The Product Data Lake resembles a social network where trading partners connect and exchange data between two ways of seeing product data that might use the same standards, but probably do not, or only do so partly.

  3. Trading partners could subscribe to an iPaaS like Talend Integration Cloud. Assuming you have built this on a Hadoop distribution or Amazon Redshift or something like that, this would be a great option for them to ensure that data integration standards are compatible.

    • Hi Paul. Thanks for commenting. I surely see that users of Talend data integration and, not least, Talend MDM services will benefit hugely from using the Product Data Lake when dealing with product master data. We do use Amazon Elastic Cloud for hosting. As data store we use MongoDB, but subscribers to the Product Data Lake can use any data store of their own and interact with trading partners regardless of their choice of data store.

  4. My concern with data lakes per se is that they feel like a surrender to an easy option for the people putting data into the lake, at the expense of those trying to take information out of the lake. The information wranglers face an ever-growing and thankless task.

    Certain industries have managed to standardise their information. The air traffic industry has IATA and ICAO standards. In Britain, insurance companies have the ABI with a set of standard reference data. It has to be said that not all insurers comply with the ABI reference set.

    I have noticed that companies that share data do so on a best-endeavours basis, often using resources outside of their IT department, and it appears to be this that drives the route of using Excel.

    A healthy company always has more ideas than it has resources to execute those ideas. The IT departments that could drive disciplined information exchange are consumed by the internal priorities of their business, and information exchange is often quite low on the agenda. If we had a means of monetising data exchange, it would help emphasise the need for standardisation.

    It feels as if there needs to be an open standard defined for product exchange, possibly aligned to the open data movement.

    • Thanks Dave. The reason for being of the Product Data Lake is exactly to be there in the process of going from a world with scattered standards for product information exchange to a state with a commonly accepted standard framework, very much welcomed as being part of the open standard movement. That process will be long and will proceed at a different pace for each geography, industry and individual organization. In that space we need something that can work with and without a standard on the providing and receiving end. And that is the Product Data Lake.


Big Data Quality

The rise of big data naturally brings up questions about the quality of big data. Surely we can’t manage big data the way we manage traditional data, as discussed in the post Extreme Data Quality.

The two predominant kinds of big data are:

  • Social data and
  • Sensor data

Read more about the data quality implications for these two kinds of big data in the post Social Data vs Sensor Data.

Not least, the quality of social data is questionable. Read about this in the post Crap, Damned Crap and Big Data.

Besides dealing with the quality of big data, we are also increasingly learning that data quality for small data is going to be more important with the rise of big data. This is because analyzing big data makes the most sense when the big data is matched with small data (first and foremost Master Data). This challenge is examined in the post Small Data with Big Impact.

A trend in ensuring data quality for big data via master data quality is exploiting the increasing number of big reference data sources, as explained in the post The Big ABC of Reference data. New forms of identities urge us to be able to mash up many kinds of identities, as told in the post Future Identities.

I will be speaking about big data quality and big reference data at the following events:

TDWI UK and IRM UK complimentary London meet-up on big data on the 19th February 2014. Link here.

You may join the LinkedIn Group for big data quality by clicking the icon below.



My Been Done List

This is a list of selected projects I have been involved in and have blogged about, or intend to blog about soon:

  • Was a postman while attending high school as described in the post The Right Mail Order
  • Worked at the Danish Tax Authorities on implementing a solution for pension fund taxation, as told in the post Big Business.
  • Was secretary for the committee that prepared a joint registry of companies in Denmark. Mentioned in the post Single Company View.
  • Helped with implementing a modernized tax collection system on the Faroe Islands. Mentioned in the blog post Grandpa’s Story.
  • Was CIO in a midsize insurance company (now part of Codan). Mentioned in the blog post A Really Bad Address.
  • Designed a Land Registry solution for Ships for the Danish Maritime Authorities. Mentioned in Big Trouble with Big Names. Later helped with migrating to a new system.
  • Managed a Patent Registration and Search solution for the Danish Patent and Trademark Agency – plus later implemented my first data matching solution there.
  • Customized an Enterprise Resource Planning system for a food ingredient manufacturer, today called CP Kelco. Mentioned in the blog post What is Multi-Domain MDM.
  • Later helped with migrating the data in the above system to SAP.
  • Helped with an application in the Swedish Healthcare sector.
  • Made a ready-made data matching tool. Mentioned in the post: When computer says maybe.
  • Customized that tool for an international fund raising organization. Mentioned in the post Feasible Names and Addresses.
  • Tuned that tool heavily for Dun & Bradstreet Nordic and Switzerland (now Bisnode). Mentioned in the post the GlobalMatchBox.
  • Have worked on and off over the last 10 years with a solution for public transit. Mentioned in the posts Multi-Entity Master Data Quality, Real World Alignment and Valuable Accuracy.
  • Managed the introduction of a new data matching tool (Omikron) on the Nordic market. Mentioned in the posts Algorithm Envy and The Worst Best Sale.
  • Worked with Omikron Nordic clients including local branches of international brands such as Thomas Cook, Wyndham, Toyota and Avis. The latter is mentioned in the posts Mixed Identities and Golden Copy Musings.
  • Engaged in the Master Data Management part of a blueprint for a multichannel and internationalization program at a large UK retailer.
  • Worked as interim data quality and multi-domain MDM specialist at a Master Data Management platform vendor.
  • Involved in the making and promotion of a tool for upstream data quality mentioned in the posts instant Data Quality and Reference Data at Work in the Cloud.
  • Joined the Fliptop advisory board. Fliptop was a pioneer in Social MDM, now acquired by LinkedIn.
  • Helped with setting up a data governance and MDM programme at a top-50 University.
  • Advised in a data governance initiative at a US based financial service tool provider.
  • Helped a Danish government agency with how to register foreign addresses as told in post called Foreign Addresses.
  • Worked as product data development manager at a UK based distributor in the construction sector as told in the post Toilet Seats and Data Quality.
  • Starting up the Product Data Lake, a service for exchanging product data as reported in the post Chinese Whispers and Data Quality.
  • Was engaged as interim MDM specialist at a global pharmaceutical company as mentioned in the post MDM and SCM: Inside and outside the corporate walls.
  • Been interim MDM consultant at a DK based distributor in the construction sector.
  • Adviser in a global customer and counterparty MDM program at a large Nordic bank.
  • Works as interim MDM specialist at an international jewelry – or is it jewellery? – firm, as touched upon in the post Cultured Freshwater Pearls of Wisdom.


Data Quality 3.0

Paraphrasing Tim Berners-Lee:

“People may ask what Data Quality 3.0 is. I think what is looking misty on Web 2.0 and Data Quality 2.0 will eventually melt into a semantic Web integrated across a huge space of data where you’ll have access to an unbelievable data resource.”

Another way of putting it will be in a micro-manifesto like:

“While we value that data are of high quality if they are fit for the intended use, we value more that data correctly represent the real-world construct to which they refer, in order to be fit for current and future multiple purposes.”

My thesis is that there is a break-even point when including more and more purposes, where it becomes less cumbersome to reflect the real-world object than to try to align with all known purposes.

You may divide the data held by an enterprise into three pots:

  • Global data that is not unique to operations in your enterprise but shared with other enterprises in the same industry (e.g. product reference data) and eventually the whole world (e.g. business partner data and location data). Here “shared data in the cloud” will make your “single version of the truth” easier and closer to the real world.
  • Bilateral data concerning business partner transactions and related master data. If you for example buy a spare part then also “share the describing data” making your “single version of the truth” easier and more accurate.
  • Private data that is unique to operations in your enterprise. This may be a “single version of the truth” that you find superior to what others have found, data supporting internal business rules that make your company more competitive and data referring to internal events.

Data Management in the near future will in my eyes be closely related to the emerging web 3.0:

  • Business Intelligence will embrace internal (private) data and external (public) data in the cloud
  • Data Warehouses – and data lakes – will link internal (private) data and external (public) data in the cloud
  • Master Data Management will align internal (private) data with external (public) data in the cloud
  • Data Quality Tools will profile internal (private) data and match internal (private) data with external (public) data in the cloud
  • Data Governance may be a lot about balancing the use of internal (private) data and external (public) data – and internal and external business rules

Learn about some Data Quality 3.0 services here:

  • The iDQ(tm) (instant Data Quality) service for sharing big global data for the benefit of customer and other party master data.
  • The Product Data Lake for sharing public and bilateral data within business ecosystems for the benefit of product master data.



10 thoughts on “Data Quality 3.0”

  1. Henrik,

    I agree with your vision – and look forward to seeing it realised, soon.

    To be honest, I’m tired of the “fit for purpose” argument, an argument that excuses poor quality data by saying it is “fit for the unique purpose for which it was originally designed”.

    I like the concept of data that is “fit for current and future multiple purposes”.

    The current “bespoke” model is unsustainable. It is similar to requiring a bricklayer to bake his own bricks before he can build a wall.

    Roll on Data Quality 3.0

    Rgds Ken

  2. Excuse my naivety but precisely how CAN data be fit for a future purpose when we do not know the future purpose?

    I ask innocently and came to this blog from google so there may be context I’m not appreciating.

    Naturally, we must define data – I can imagine how we can use a currency amount for multiple future purposes. If we’re talking about a data set at a given level of granularity which a future purpose requires at a lower level of granularity, aren’t we stuck?

    Sorry for what might seem to be a stupid question and thanks for all the effort you people take to write on subjects I’m interested in!

    Carl

    • Thanks for joining Carl.

      What I often see is that one organization won’t go for a certain level of data quality, such as uniqueness, granularity or other dimensions, that another similar organization would. This is naturally due to the business cases that the current data management efforts are based on. But sooner or later the same kinds of organizations will need the same uniqueness, granularity and so on, because the business challenges are the same and it’s the same real world these organizations are operating in.

      Therefore, looking at the real world will often be a good way to fit those future purposes we know will arise but don’t have on the radar yet.

      One example: say your business is currently mostly domestic with a few foreign business partners. Therefore you don’t require any accuracy in storing the country of your foreign business partners and the applicable address format. But if your business grows internationally, which is the way to grow today in many cases, you will regret that later.

  3. I’d divide data into the following pots
    1. International standards compliant (global)
    2. Industry standard compliant (tends to be global but strays to multi-lateral)
    3. National standard compliant (sub-global, multi-lateral)
    4. Partner agreed (bi-lateral)
    5. Proprietary

    • Dave, thanks a lot for commenting.

      Very good breakdown and actually very, very close to what I am working with these days.

  4. I think that lineage is going to be the big challenge in the near future with data.
    By lowering the thresholds to accessing, processing and publishing data, I expect an explosion of transformed data.
    The challenge will be to understand what is relevant and what is not. How can you trust what you see? Where did it come from and how was it transformed?
    The food industry has an equivalent problem at the moment. How do you know that your beef is beef?

    • Thanks for commenting Mark. Indeed data provenance/lineage will be more and more important as we share more and more data.

  5. Hi Henrik,

    Interesting concept, letting cloud providers or other 3rd parties enrich your data. Some of the data quality providers are moving into this space already. I recently spoke to Experian reps about their acquiring the x88 platform. You can see, with their existing work with data, they are perhaps positioning themselves to do something similar. But I think they are a while off yet. Good insight.

    Rich


The Liliendahl.com 101 on MDM

MDM (Master Data Management) is the new bookkeeping.

Any enterprise is described much better by having a look into the Master Data than by reading the Financial Statements made up from the Transaction Data.

It’s about the numbers and nature of the Master Data and how they are managed.

Master Data are the core entities that describe the ongoing activities in an organisation, such as:

  • Parties (business partners) in the roles as contacts, prospects, customers, suppliers, members, citizens and any other parties with whom we interact
  • Products (and assets) that we buy, produce, have and sell
  • Places (locations) being production plants, stores and warehouses, delivery points, visit places and so on
  • Periods (calendar) for past, current and future activities

Most MDM initiatives are focused on a single type of Master Data. The most common areas are Customer Data Integration (CDI) and Product Information Management (PIM).

Multi-Domain (or Multi-Entity) MDM is when you include several of these types of Master Data in a single solution. Such a project is described here.

When working with data quality within master data management you may of course encounter some similarities between the different master data types, but you will certainly also meet a range of differences. Read more in the post Same Same But Different.

If you are interested in Multidomain Master Data Management you may join the Multi-Domain MDM group on LinkedIn.

Reference Data is a term often used either instead of Master Data or as related to Master Data. In my eyes, reference data are those data defined and (initially) maintained outside the organisation, such as country lists, postal code tables and product classification systems. Big Reference Data may be address, business and citizen directories, as explained in Big Reference Data musings.
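As a minimal sketch of putting such externally maintained reference data to work, the following Python checks party records against a country-code table. The tiny table and the field names are assumptions for illustration; a real implementation would load the full ISO 3166 country list or a postal code directory.

```python
# A stand-in for externally maintained reference data (a full list would
# come from ISO 3166 or a similar source).
COUNTRIES = {"DK": "Denmark", "GB": "United Kingdom", "SE": "Sweden"}

def check_country(record):
    # Return a data quality issue description, or None if the code is known.
    code = record.get("country_code", "").upper()
    if code not in COUNTRIES:
        return f"Unknown country code: {code or '<missing>'}"
    return None

records = [{"country_code": "dk"}, {"country_code": "XX"}]
issues = [msg for r in records if (msg := check_country(r)) is not None]
```

The point of leaning on reference data is that the list of valid values is defined once, outside the organisation, instead of being re-invented in every system.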

Social MDM (Social master data management) is about exploiting big reference data in social media as a supplementary source of external data and supporting social collaboration with a MDM solution. Read more about the benefits of Social MDM here.

Data Governance is the essential discipline you need to have in place to manage your Master Data (and Transaction Data). As Wikipedia says: Data Governance is an emerging discipline with an evolving definition. In my eyes Data Governance is practiced in the space between business rules and common sense.

Data quality dimensions apply to Master Data Management as follows:

  • Completeness, conformity, consistency, timeliness and accuracy are related to the concept of “fit for purpose”. As Master Data are often used for multiple purposes in an organisation, the resolution to this is often to reflect real-world objects, done by data matching with external reference data where possible.
  • Uniqueness is related to the concept of “a single version of the truth”. Deduplication by data matching is a core activity in the reach for this state. As Master Data are often used for multiple purposes in an organisation, hierarchy management is an essential destination of deduplication and consolidation.
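A hedged sketch of what deduplication by data matching can look like, using nothing but Python's standard difflib; the party names and the 0.85 threshold are invented for illustration, and real MDM matching engines use far more sophisticated standardisation and comparison:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    # Case-insensitive similarity ratio between two name strings (0.0 to 1.0).
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicates(names, threshold=0.85):
    # Compare every pair of names and flag those above the threshold
    # as duplicate candidates.
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if similarity(names[i], names[j]) >= threshold:
                pairs.append((names[i], names[j]))
    return pairs

parties = ["Liliendahl ApS", "LILIENDAHL APS", "Nordic Insurance Ltd"]
```

Here `find_duplicates(parties)` flags the first two names as duplicate candidates, which would then feed into consolidation and hierarchy management.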

Hierarchy Management in Party Master Data comes with a great many challenges. Some of them are explained in the blog posts:

A key concept in transformation of data from operational sources into hierarchies is Master Data Survivorship.

Once the hierarchies are built, it’s essential to maintain them on an ongoing basis, as hierarchies in the real world are slowly changing.

Handling address information is a hierarchy of its own kind within party master data, as explained here.

Within Product Information Management (PIM) Hierarchy Management is king as explained in the post What’s a Six Pack.

Customer-driven databases are becoming more and more common as eCommerce continues to grow and either takes over or supplements the business processes related to offline sales. Here Master Data meets the customer.

When building enterprise-wide master data hubs, the email address becomes a more and more important element in matching party master data.
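A small sketch of that idea, with the normalisation rule (trim and lowercase) and the sample records as assumptions; real matching would combine the email key with name and address comparison:

```python
from collections import defaultdict

def normalise_email(raw):
    # Trim surrounding whitespace and lowercase; treating local parts as
    # case-insensitive is an assumption that holds for most providers.
    return raw.strip().lower()

def match_by_email(records):
    # Group party records sharing the same normalised email address;
    # keep only the groups that actually contain potential duplicates.
    groups = defaultdict(list)
    for record in records:
        groups[normalise_email(record["email"])].append(record["name"])
    return {email: names for email, names in groups.items() if len(names) > 1}

records = [
    {"name": "J. Doe", "email": "John.Doe@example.com"},
    {"name": "John Doe", "email": " john.doe@EXAMPLE.com "},
    {"name": "A. Smith", "email": "a.smith@example.org"},
]
```

With these invented records, the first two parties collapse into one match group on the shared email key.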

Multi-Commerce Master Data Management is about combining data management efforts in online and offline channels. Read more in the post Multi-Commerce Data Quality.


iDQ(tm) MDM Edition

I have made two MDM book reviews:


3 thoughts on “The Liliendahl.com 101 on MDM”

  1. Agree!! I have had the same experience, and the expression “Data Governance is practiced in the space between business rules and common sense” is something I have been preaching for years in my company…

  2. Henrik,
    I’ve been looking for a source to point friends to for general information regarding MDM and the associated world of Master Data. Looks like I have found it. A comment regarding your expression of Data Governance is interesting to note. I’m finding in the US large-enterprise market that there is a clear need for a Data Governance Organization as a means of adjudicating issues and having an identified path of “to whom do I raise a conflict” when multiple departments have competing values and desires for a particular field. As a Data professional, I am seeing a heightened need for DGOs and help on getting there. I’m seeing Big Data as an accelerator of the need. What are your thoughts? Regards,


The Emperor’s new clothes

By Hans Christian Andersen

Many years ago there lived an Emperor who was so exceedingly fond of fine new clothes that he spent vast sums of money on dress. To him clothes meant more than anything else in the world. He took no interest in his army, nor did he care to go to the theatre, or to drive about in his state coach, unless it was to display his new clothes. He had different robes for every single hour of the day.

In the great city where he lived life was gay and strangers were always coming and going. Everyone knew about the Emperor’s passion for clothes.

Now one fine day two swindlers, calling themselves weavers, arrived. They declared that they could make the most magnificent cloth that one could imagine; cloth of most beautiful colours and elaborate patterns. Not only was the material so beautiful, but the clothes made from it had the special power of being invisible to everyone who was stupid or not fit for his post.

“What a splendid idea,” thought the Emperor. “What useful clothes to have. If I had such a suit of clothes I could know at once which of my people is stupid or unfit for his post.”

So the Emperor gave the swindlers large sums of money and the two weavers set up their looms in the palace. They demanded the finest thread of the best silk and the finest gold and they pretended to work at their looms. But they put nothing on the looms. The frames stood empty. The silk and gold thread they stuffed into their bags. So they sat pretending to weave, and continued to work at the empty loom till late into the night. Night after night they went home with their money and their bags full of the finest silk and gold thread. Day after day they pretended to work.

Now the Emperor was eager to know how much of the cloth was finished, and would have loved to see for himself. He was, however, somewhat uneasy. “Suppose,” he thought secretly, “suppose I am unable to see the cloth. That would mean I am either stupid or unfit for my post. That cannot be,” he thought, but all the same he decided to send for his faithful old minister to go and see. “He will best be able to see how the cloth looks. He is far from stupid and splendid at his work.”

So the faithful old minister went into the hall where the two weavers sat beside the empty looms pretending to work with all their might.

The Emperor’s minister opened his eyes wide. “Upon my life!” he thought. “I see nothing at all, nothing.” But he did not say so.

The two swindlers begged him to come nearer and asked him how he liked it. “Are not the colors exquisite, and see how intricate are the patterns,” they said. The poor old minister stared and stared. Still he could see nothing, for there was nothing. But he did not dare to say he saw nothing. “Nobody must find out,” thought he. “I must never confess that I could not see the stuff.”

“Well,” said one of the rascals. “You do not say whether it pleases you.”

“Oh, it is beautiful, most excellent, to be sure. Such a beautiful design, such exquisite colors. I shall tell the Emperor how enchanted I am with the cloth.”

“We are very glad to hear that,” said the weavers, and they started to describe the colors and patterns in great detail. The old minister listened very carefully so that he could repeat the description to the Emperor. They also demanded more money and more gold thread, saying that they needed it to finish the cloth. But, of course, they put all they were given into their bags and pockets and kept on working at their empty looms.

Soon after this the Emperor sent another official to see how the men were getting on and to ask whether the cloth would soon be ready. Exactly the same happened with him as with the minister. He stood and stared, but as there was nothing to be seen, he could see nothing.

“Is not the material beautiful?” said the swindlers, and again they talked of the patterns and the exquisite colors. “Stupid I certainly am not,” thought the official. “Then I must be unfit for my post. But nobody shall know that I could not see the material.” Then he praised the material he did not see and declared that he was delighted with the colors and the marvelous patterns.

To the Emperor he said when he returned, “The cloth the weavers are preparing is truly magnificent.”

Everybody in the city had heard of the secret cloth and was talking about the splendid material.

And now the Emperor was curious to see the costly stuff for himself while it was still upon the looms. Accompanied by a number of selected ministers, among whom were the two poor ministers who had already been before, the Emperor went to the weavers. There they sat in front of the empty looms, weaving more diligently than ever, yet without a single thread upon the looms.

“Is not the cloth magnificent?” said the two ministers. “See here, the splendid pattern, the glorious colors.” Each pointed to the empty loom. Each thought that the other could see the material.

“What can this mean?” said the Emperor to himself. “This is terrible. Am I so stupid? Am I not fit to be Emperor? This is disastrous,” he thought. But aloud he said, “Oh, the cloth is perfectly wonderful. It has a splendid pattern and such charming colors.” And he nodded his approval and smiled appreciatively and stared at the empty looms. He would not, he could not, admit he saw nothing, when his two ministers had praised the material so highly. And all his men looked and looked at the empty looms. Not one of them saw anything there at all. Nevertheless, they all said, “Oh, the cloth is magnificent.”

They advised the Emperor to have some new clothes made from this splendid material to wear in the great procession the following day.

“Magnificent.” “Excellent.” “Exquisite,” went from mouth to mouth and everyone was pleased. Each of the swindlers was given a decoration to wear in his button-hole and the title of “Knight of the Loom”.

The rascals sat up all that night and worked, burning more than sixteen candles, so that everyone could see how busy they were making the suit of clothes ready for the procession. Each of them had a great big pair of scissors and they cut in the air, pretending to cut the cloth with them, and sewed with needles without any thread.

There was great excitement in the palace and the Emperor’s clothes were the talk of the town. At last the weavers declared that the clothes were ready. Then the Emperor, with the most distinguished gentlemen of the court, came to the weavers. Each of the swindlers lifted up an arm as if he were holding something. “Here are Your Majesty’s trousers,” said one. “This is Your Majesty’s mantle,” said the other. “The whole suit is as light as a spider’s web. Why, you might almost feel as if you had nothing on, but that is just the beauty of it.”

“Magnificent,” cried the ministers, but they could see nothing at all. Indeed there was nothing to be seen.

“Now if Your Imperial Majesty would graciously consent to take off your clothes,” said the weavers, “we could fit on the new ones.” So the Emperor laid aside his clothes and the swindlers pretended to help him piece by piece into the new ones they were supposed to have made.

The Emperor turned from side to side in front of the long glass as if admiring himself.

“How well they fit! How splendid Your Majesty’s robes look! What gorgeous colors!” they all said.

“The canopy which is to be held over Your Majesty in the procession is waiting,” announced the Lord High Chamberlain.

“I am quite ready,” announced the Emperor, and he looked at himself again in the mirror, turning from side to side as if carefully examining his handsome attire.

The courtiers who were to carry the train felt about on the ground pretending to lift it; they walked on solemnly pretending to be carrying it. Nothing would have persuaded them to admit they could not see the clothes, for fear they would be thought stupid or unfit for their posts.

And so the Emperor set off under the high canopy, at the head of the great procession. It was a great success. All the people standing by and at the windows cheered and cried, “Oh, how splendid are the Emperor’s new clothes. What a magnificent train! How well the clothes fit!” No one dared to admit that he couldn’t see anything, for who would want it to be known that he was either stupid or unfit for his post?

None of the Emperor’s clothes had ever met with such success.

But among the crowds a little child suddenly gasped out, “But he hasn’t got anything on.” And the people began to whisper to one another what the child had said. “He hasn’t got anything on.” “There’s a little child saying he hasn’t got anything on.” Till everyone was saying, “But he hasn’t got anything on.” The Emperor himself had the uncomfortable feeling that what they were whispering was only too true. “But I will have to go through with the procession,” he said to himself.

So he drew himself up and walked boldly on holding his head higher than before, and the courtiers held on to the train that wasn’t there at all.

Hans Christian Andersen was born on 2 April 1805 in Odense, Denmark. He was the son of a poor shoemaker and could hardly attend school. His father died when he was 11 years old. When H. C. Andersen was 14 years old he ran away to Copenhagen. In 1822 he went to the Latin school in Slagelse. He died in Copenhagen on 4 August 1875 at the age of 70.

One thought on “The Emperor’s new clothes”

  1. Pingback: Data Quality and common sense « Liliendahl on Data Quality
