Oh yes, in my crazy berserkergang of presenting stupid buzzword suggestions, it’s time for “Guerrilla Data Quality”. And this time there are no previous hits on Google to point at as the original source.
But I noticed that “Guerrilla Data Governance” is already in use, and as Data Governance and Data Quality are closely related disciplines, I think there could be something to “Guerrilla Data Quality”.
Also, an article called “How to set data quality goals any business can achieve” was recently published by Dylan Jones on DataQualityPro. It emphasises the need for setting short-term, realistic goals in contrast to launching a massive, enterprise-wide, all-domain initiative. This article focuses on the people and process side of what may be “Guerrilla Data Quality”.
Recently I wrote a blog post called “Driving Data Quality in 2 Lanes” focusing on the tool selection for what may be “Guerrilla Data Quality” and the enterprise-wide follow-up.
Actually, I guess most Data Quality activity going on is in fact “Guerrilla Data Quality”. The problem then is that most literature and teaching on Data Quality is aimed at massive enterprise-wide implementations.
I think I would prefer “Berserkergang Data Quality” – somehow it has a nicer ring to it.
Risking an attack from the Buzzword Police, I would agree that most data quality activity, for better or worse, is not an enterprise-wide implementation.
I think that this reflects the reality that I have blogged about as “Hyperactive Data Quality” – the Buzzword Police are now banging on my door – where most data quality work is reactive:
As I said in that post, proactive data quality is the best practice – as is making data quality an enterprise-wide initiative via a data governance program – but proactive data quality requires a strategic organizational transformation that will not happen easily or quickly (or cheaply).
The reality is that reactive data quality (or “Guerrilla Data Quality” or “Berserkergang Data Quality”) will occasionally be a necessary evil that is used to correct today’s problems.
As you and Dylan have said, setting short-term, realistic goals is a laudable approach for many organizations and may be the best they are capable of doing right now to take that important first step on the road to improving the quality of their data.
Great post Henrik, cheers for the links.
When I talk about short-term, achievable goals, I’m certainly not advocating a short-term view; this is all part of a long-term strategy. But I find that if you pitch the mother-of-all initiatives to the business, it can sometimes backfire.
I was really trying to emphasize the need for caution when reading all these (excellent) text books we now have in our profession. Take Data Quality Assessment by Arkady Maydanchik: I love that book, and it’s a perfect framework, but implementing that many rules on the business takes time.
The point I was trying to make is that the process is really as important as the result. People have to learn a new way of thinking, a new way of communicating about data. So it pays to keep it simple first, then build out from there. All the techniques I was talking about (information chains and so on) are enterprise techniques; I’m just advocating a light simmer of the ocean instead of a rolling boil!
I’m not for Guerrilla as in localized anarchy, reactive cleansing and so on; this is all strategy-focused. But the point is that you can’t set measures for completeness, consistency and timeliness in some governance committee and expect them to be obeyed at the grass roots; you’ll never get buy-in.
But hey, that’s just my opinion, perhaps some organisations are rolling out the big enterprise frameworks – what’s other people’s take on all this?
A number of years ago I was part of a small group that worked on activity-based costing (ABC) as a skunk works exercise. We ‘liquified’ a lot of the public-facing data for a university college, and produced the first faceted ABC reports for post-secondary institutions in the Province of Alberta. We had a lot of DQ issues, but the context of the big picture not only helped us solve these, but also sell the solutions to the responsible business units after we were done.
If I could change the connotation of guerrilla from its negative one to something more akin to ‘stealth’, I think Henrik’s on to something.
The key – just like our skunk works project – is to walk softly (under the radar and in small projects) and carry a big stick (a proven enterprise-level model). Then it becomes neither expensive nor risky, and you could wind up with regular Data Quality war heroes on your hands.
Jim, Dylan and John, thanks for sharing your meaningful thoughts and adding to the vocabulary.
I have continued this search for the missing link between Guerrilla / Berserkergang / Hyperactive / Stealth Data Quality and the available literature’s teaching on how to make the mother-of-all initiatives in a new post, “Gorilla Data Quality”.
Let’s follow through on the equivalences. You need a goal – freedom from the tyranny of bad data. You need a figurehead or leader – the DQ-sympathising executive or project lead (either will suffice, but they are very different) – to get shared vision. You need to gather a cache of easily-accessible weapons (tools). You need to gather popular support for your endeavours. Sometimes, you need to destroy old constructs to prove a point. Sometimes you will need to hide your activities from misguided authorities. The freedom fighters should be anonymous (no egos), tough, seemingly innumerable, passionate, & locally knowledgeable.
Sometimes you need to take a bullet & die for the cause … OK, you’ve got to set limits to the metaphor.
I think I would only partially agree with Henrik’s approach, but more fully agree with Dylan’s suggestions.
At times of limited resources and increasing pressures on businesses, there needs to be strategic agreement on which areas to improve. Guerrilla data quality risks an anarchic approach where the improvements most needed by a business are ignored in favour of improvements that are easier to achieve. So a strategy is essential to ensure the correct areas are targeted for improvement.
With big strategies, it can be difficult for individual employees to ‘see’ the relevance to them and their role. The skilful part is breaking the strategy into small parts that are meaningful for local staff to implement. A standard process for improvement (and for recording improvements) will help ensure consistency. If this is backed up with support from local ‘disciples’ who can guide staff and help resolve difficulties, then the chance of success increases.
Some of the most important areas to improve data quality may require significant cost and resources to deliver. This would require either budgetary provision or specific centrally funded projects to resolve these areas.
Such an approach will combine local involvement with strategic direction.
Thanks Jax and Julian,
Julian, I actually agree with Dylan too. I am not advocating Guerrilla approaches everywhere, but noticing that they exist. As a tool vendor and practice manager, I am engaged in such projects, and these projects are usually very well justified, often as a success factor in CRM, MDM and BI initiatives.