Using data quality to fight fires in IT
You would think that the unending scramble to resolve IT emergencies and put out fires would get old, but for some reason, it seems to be standard operating procedure at most organizations. What’s even worse is that you routinely hear people say they are so busy that other aspects of their jobs simply aren’t getting done. In some cases, they even acknowledge that there would be significant gains in efficiency and effectiveness if they could just go back and correct data issues, or what my kid refers to as a “do-over.” I pointed out to him that in the real world no one gets do-overs, and IT is no exception. Given that little reality check, what is the closest we can come to a legitimate do-over?
In an age where budgets and resources are constantly being stretched or cut, an immense strain is put on employees to get things done, and only the highest priority items ever get addressed (not unlike Agile software development). In the heat of firefighting, people naturally do whatever seems easiest at the time to address the immediate issue. There is no time to think about fixing things for the long term, so they focus on the current burning issue (and let’s be fair, no one is going to think about designing a fire sprinkler system while their house is on fire). Because most IT fire drills are by nature intended to address whatever is in the enterprise’s collective face, everything else keeps smoldering and eventually flares into another fire elsewhere in the organization. The collective waste of energy and resources spent repeatedly fighting data fires is staggering.
A secondary issue is that while all this time is being spent firefighting, taking time away from normal day-to-day activities, things naturally slip through the cracks or drop down the priority list. For example, when bad data goes uncorrected, it gives bad actors who want to harm the organization an opportunity to mask their efforts within that bad data. They could easily modify IDs, account numbers, and maybe even credentials with some confidence that the organization isn’t analyzing or correcting poor quality data unless it is causing an outage.
A firefighting mentality is not a sustainable framework for the enterprise. Unfortunately, most businesses simply can’t seem to take a big enough step back to understand the overall dynamic of the situation. Many times, what feels like a small, localized data quality issue is in fact a systemic issue across the organization. When we consider the efforts of individuals trying to resolve high-priority outages, it is understandable that asking them to dig deeper into potential data quality issues is asking too much. The issue, however, is that there never seems to be any follow-up or follow-on action to investigate what happened afterwards, because it is incredibly easy for IT to become interrupt-driven. It is this mentality that fuels future fires.
Eliminating, or at least minimizing, the need to firefight has a multitude of benefits for both the organization and the firefighters, your employees. Constantly putting out fires and running from emergency to emergency is exhausting, and employees don’t want to work that way. They want to permanently fix things rather than put patches over them, because they know the same issue will hit them again. Organizations cannot continue on this track of inefficient and ineffective activity rooted in poor data quality. They need to acknowledge the road they’re on and step up to the challenge with strong leadership, enhanced processes, and reliable technologies. It’s actually not that difficult to minimize the need for firefighting when you focus on and fix the bad data that is causing the fires.
So what do you do? Take a BIG step back, grab the people who understand what’s going on (and not just management, get the grunts who are on the front lines as well) and look at the longer-term patterns that are driving your organization and the functions within it. What you’re seeing are the effects that define your day-to-day. Now you need to determine the cause, and we can guarantee you’ll find bad data is a core enabler. Bad data can trigger a nearly endless stream of problems, but data can also be cleaned up, aligned, remediated, normalized, contextualized, and so on. Pushing data through a stringent Data Quality filter will make an enormous and immediate difference in your day-to-day, and this is not just hype.
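To make that idea concrete, here is a minimal sketch of what a data quality filter can look like in practice. The field names (record_id, account_number, email) and the rules themselves are illustrative assumptions, not anything prescribed in this article; the point is simply that each record is checked against explicit rules and either passes clean or gets flagged for remediation.

```python
# Minimal data quality filter sketch (Python).
# NOTE: the schema and rules below are hypothetical examples -- substitute
# your own fields and validation logic.
import re

ACCOUNT_PATTERN = re.compile(r"^\d{10}$")            # assumed format: 10 digits
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(record: dict) -> list:
    """Return a list of rule violations for a single record."""
    problems = []
    if not record.get("record_id"):
        problems.append("missing record_id")
    if not ACCOUNT_PATTERN.match(str(record.get("account_number", ""))):
        problems.append("malformed account_number")
    if not EMAIL_PATTERN.match(record.get("email", "")):
        problems.append("invalid email")
    return problems

def filter_records(records: list) -> tuple:
    """Split a batch into clean records and flagged records with their violations."""
    clean, flagged = [], []
    seen_ids = set()
    for rec in records:
        problems = validate(rec)
        rid = rec.get("record_id")
        if rid in seen_ids:                          # catch duplicates within the batch
            problems.append("duplicate record_id")
        seen_ids.add(rid)
        if problems:
            flagged.append((rec, problems))
        else:
            clean.append(rec)
    return clean, flagged

if __name__ == "__main__":
    sample = [
        {"record_id": "A1", "account_number": "1234567890", "email": "a@example.com"},
        {"record_id": "A1", "account_number": "12345", "email": "not-an-email"},
    ]
    clean, flagged = filter_records(sample)
    print(f"{len(clean)} clean, {len(flagged)} flagged")
    for rec, problems in flagged:
        print(rec.get("record_id"), "->", ", ".join(problems))
```

Run continuously rather than as a one-off cleanup, a filter like this turns data quality from reactive firefighting into a measurable, routine check, and the counts of clean versus flagged records give you exactly the kind of quantifiable evidence mentioned below.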
We’ve seen multiple organizations take their game to a whole new level by addressing the root cause of data firefighting, and because it’s data-centric, it lends itself very neatly to quantifiable analysis.