Quality data is the key ingredient in better business outcomes

Quality inputs are a core requirement for a quality output, and that is true in nearly any profession, whether you’re a chef or an IT professional. Your service delivery organization needs high-quality, reliable data before it can assemble that data properly for decision makers and ensure positive business outcomes. You need not only to find reliable data sources, but also to put the necessary quality controls and checks in place. Those high-quality data elements are the vital ingredients that make up the products and services your customers consume.

If you ever hear chefs describe the dishes they serve, you will hear them talk about the same things in nearly the same terms, adjusted for domain. They believe the only way to deliver the best product for their customers is to ensure the ingredients in their dishes are consistently of the highest quality. It is simply the philosophy that the output will never be better than the input, or in the common vernacular, garbage in, garbage out. Improve the quality of the input and the end result improves as well.

This concept isn’t exclusive to food; it also applies to data used in any sort of business decision making. It is especially applicable when aggregating data from various sources or migrating data from one database to another. It makes no sense to migrate data to a new system when its quality is unknown or, worse, known to be poor and unreliable. Yet we see attempts to simply lift and move data to a new system all the time, and people are invariably surprised that the poor-quality data wasn’t miraculously corrected along the way. Data doesn’t magically get better on its own; in fact, a migration runs a real risk of making things worse without a directed effort to improve the data as part of the migration process.
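To make that directed effort concrete, here is a minimal sketch of a pre-migration quality gate in Python. The field names (customer_id, email, signup_date), the validation rules, and the defect-rate threshold are all hypothetical assumptions for illustration; the point is that records are checked, and the migration is halted, before anything is copied to the target.

```python
from datetime import datetime

# Hypothetical required fields for a customer record; adjust to your schema.
REQUIRED_FIELDS = ("customer_id", "email", "signup_date")

def validate_record(record: dict) -> list[str]:
    """Return a list of quality issues found in one source record."""
    issues = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    email = record.get("email", "")
    if email and "@" not in email:
        issues.append("malformed email")
    date = record.get("signup_date", "")
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")
        except ValueError:
            issues.append("unparseable signup_date")
    return issues

def quality_gate(records: list[dict], max_defect_rate: float = 0.02) -> bool:
    """Return False (block the migration) if too many records fail checks."""
    defects = sum(1 for r in records if validate_record(r))
    rate = defects / len(records) if records else 0.0
    print(f"{defects} of {len(records)} records have issues ({rate:.1%})")
    return rate <= max_defect_rate

# Run the gate before any data is copied to the target system.
source_rows = [
    {"customer_id": "C001", "email": "ada@example.com", "signup_date": "2023-04-01"},
    {"customer_id": "", "email": "not-an-email", "signup_date": "04/01/2023"},
]
if not quality_gate(source_rows):
    print("Migration halted: fix source data quality first.")
```

A gate like this turns "we hope the data is clean" into a measurable go/no-go decision, and the defect report tells you exactly what to fix at the source.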

Every employee makes decisions based on the data available to them, so it’s important that they have the best-quality data possible. This is similar to the chef who carefully picks only the best leaves from their personal herb garden, because those are the only ones worthy of their special gourmet dish. As IT professionals, we need to evaluate our data sources just as carefully as a chef evaluates ingredients. We need to decide which sources are best for our systems and which data elements within them we can count on to help improve business outcomes. For long-term success, we also need to determine which processes will be necessary to maintain the desired quality levels. Keep in mind that not every source has to be used: some may be appropriate only in a specific context, and some may simply be unable to provide the data quality we seek.
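As a sketch of what evaluating a source can look like in practice, the following Python snippet profiles per-field completeness across candidate sources. The metric and the field names are illustrative assumptions; a real evaluation would also examine validity, freshness, and consistency.

```python
from collections import Counter

def completeness(records: list[dict]) -> dict[str, float]:
    """Fraction of records in which each field is present and non-empty."""
    if not records:
        return {}
    filled = Counter()
    fields = set().union(*(r.keys() for r in records))
    for r in records:
        for f in fields:
            if r.get(f) not in (None, ""):
                filled[f] += 1
    return {f: filled[f] / len(records) for f in sorted(fields)}

# Profile two hypothetical sources holding the same customer entity.
crm = [{"name": "Ada", "phone": "555-0100"}, {"name": "Lin", "phone": ""}]
billing = [{"name": "Ada", "phone": "555-0100"}, {"name": "Lin", "phone": "555-0199"}]
print("CRM:", completeness(crm))          # phone only 50% complete
print("Billing:", completeness(billing))  # phone 100% complete
```

Even a simple profile like this makes the "which source do we trust for this element" conversation a comparison of numbers rather than opinions.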

Every product or service an organization creates can only be as good as the elements that go into it. IT organizations need to employ rigorous data quality controls and audit checks to ensure those elements are of the very best quality. When a source or a data element is not of optimal quality, an alternative has to be considered. Regular checks and balances must also be in place between systems to ensure they don’t begin to diverge and potentially corrupt one another.
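One way to implement such a cross-system audit check, sketched here under assumed record shapes, is to compare the two systems by record key and by a content fingerprint on a regular schedule. The IDs and the row-hash approach are illustrative, not a prescription.

```python
import hashlib

def row_fingerprint(record: dict) -> str:
    """Stable hash of a record's contents, independent of key order."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(system_a: dict[str, dict], system_b: dict[str, dict]) -> dict:
    """Compare two systems keyed by record ID; report any drift."""
    only_a = system_a.keys() - system_b.keys()
    only_b = system_b.keys() - system_a.keys()
    mismatched = [
        key for key in system_a.keys() & system_b.keys()
        if row_fingerprint(system_a[key]) != row_fingerprint(system_b[key])
    ]
    return {"only_in_a": sorted(only_a), "only_in_b": sorted(only_b),
            "mismatched": sorted(mismatched)}

# Run on a schedule so the systems cannot silently diverge.
a = {"C001": {"email": "ada@example.com"}, "C002": {"email": "lin@example.com"}}
b = {"C001": {"email": "ada@example.com"}, "C002": {"email": "lin@old.example"}}
print(reconcile(a, b))  # {'only_in_a': [], 'only_in_b': [], 'mismatched': ['C002']}
```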

If the effort is intended as a one-time migration, it is even more important to have a rigorous mechanism to normalize and cross-reference the data for quality and completeness, because there will be no second chance to identify discrepancies before the end product is negatively impacted. Bottom line? Quality will always matter, and the more critical the end product, the more important quality becomes. Getting one ingredient wrong in a gourmet dish has a small, localized effect; migrating bad data in a healthcare application can be disastrous. Bad data and its effects are entirely manageable: the tools are available, and the best practices are well established.
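As an illustration of that normalize-and-cross-reference step, the sketch below canonicalizes values and flags anything with no counterpart in a reference set, so it can get human review before the one-and-only cutover. The normalization rules and the reference list are hypothetical.

```python
import re

def normalize_name(raw: str) -> str:
    """Collapse whitespace and casing so equivalent values compare equal."""
    return re.sub(r"\s+", " ", raw).strip().casefold()

def cross_reference(incoming: list[str], reference: list[str]) -> dict:
    """Flag incoming values that have no counterpart in the reference set."""
    known = {normalize_name(r) for r in reference}
    matched, unmatched = [], []
    for value in incoming:
        (matched if normalize_name(value) in known else unmatched).append(value)
    return {"matched": matched, "unmatched": unmatched}

# Every unmatched value gets reviewed before cutover, because a one-time
# migration offers no second pass.
result = cross_reference(
    incoming=["Acme  Corp ", "ACME CORP", "Initech"],
    reference=["Acme Corp", "Globex"],
)
print(result)  # {'matched': ['Acme  Corp ', 'ACME CORP'], 'unmatched': ['Initech']}
```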