Photo credit: Akeneo
Having large volumes of data is useless if that data is of poor quality. Data Quality Management has become a major priority for companies. Because data informs decision-making, innovation management, and customer satisfaction alike, monitoring its quality demands considerable rigor and method.
Producing data for the sake of producing data, because it's trendy, because your competitors are doing it, because you read about it in the press or on the Internet: all of that is in the past. Today, no business sector denies the eminently strategic nature of data.
However, the real challenge surrounding data is its quality. According to the 2020 edition of the Gartner Magic Quadrant for Data Quality Solutions, more than 25% of critical data in large companies is incorrect. This puts enterprises in a situation that generates both direct and indirect costs: strategic errors, bad decisions, and the various expenses associated with data management. The average cost of bad data quality is 11 million euros per year.
Why is that?
Simply because all of your company's strategic decisions are now guided by what you know about your customers, your suppliers, and your partners. Since data is omnipresent in your business, Data Quality becomes a priority issue.
Gartner is not the only one to underline this reality. At the end of 2020, IDC revealed in a study that companies face many challenges with their data: nearly two out of three companies consider identifying relevant data a challenge, 76% believe their data collection could be improved, and 72% think their data transformation processes for analysis purposes could be improved.
Data Quality Management: A demanding discipline
Just as in cooking, the better the quality of your ingredients, the more your guests will enjoy the dish. Because data feeds your analyses and therefore your decisions, it is essential to ensure that it is of good quality.
But what is quality data? Several criteria can be taken into account: the accuracy of the data (a complete telephone number), its conformity (the number is composed of 10 digits preceded by a national prefix), its validity (it is still in use), its reliability (it actually lets you reach your correspondent), and so on.
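These criteria can be made operational as simple automated checks. The sketch below is purely illustrative: the function name, the record format, and the phone pattern (a national prefix followed by 10 digits, as in the example above) are assumptions, not a prescribed standard.

```python
import re

# Illustrative conformity rule from the article's example:
# a national prefix (+ and 1-3 digits) followed by 10 digits.
PHONE_PATTERN = re.compile(r"^\+\d{1,3}\d{10}$")

def assess_phone(value):
    """Return a pass/fail verdict per quality criterion for one phone value."""
    value = (value or "").replace(" ", "")
    return {
        "complete": bool(value),                         # accuracy: the field is filled in
        "conformant": bool(PHONE_PATTERN.match(value)),  # matches the agreed format
    }

print(assess_phone("+33 06 12 34 56 78"))
print(assess_phone("0612"))
```

Validity and reliability, by contrast, usually cannot be checked by a regular expression alone: they require feedback from real use, such as call or delivery outcomes.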
For efficient Data Quality Management, you must ensure that every criterion you have defined for good-quality data is met. But be careful: data must also be updated and maintained over time so that it does not become obsolete. Obsolete data, or data that is no longer updated, shared, or used, instantly loses its value because it no longer contributes effectively to your thinking, your strategies, and your decisions.
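One common way to keep obsolescence in check is to flag records whose last update exceeds an agreed maximum age, so they can be re-verified before they quietly go stale. This is a minimal sketch; the field names and the one-year threshold are assumptions to be adapted to your own data.

```python
from datetime import datetime, timedelta

def stale_records(records, max_age_days=365, now=None):
    """Return records not updated within the last max_age_days days."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)
    return [r for r in records if r["last_updated"] < cutoff]

contacts = [
    {"id": 1, "last_updated": datetime(2024, 1, 10)},
    {"id": 2, "last_updated": datetime(2020, 6, 1)},   # long untouched: flag it
]
print([r["id"] for r in stale_records(contacts, now=datetime(2024, 6, 1))])  # → [2]
```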
Data Quality Best Practices
To guarantee the integrity, coherence, accuracy, validity and, in a word, the quality of your data, you must follow a rigorous methodology. The first essential step of an efficient Data Quality Management project is to eliminate duplicates. Beyond acting as dead weight in your databases, duplicates distort analyses and can undermine the relevance of your decisions.
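In its simplest form, deduplication means normalizing a matching key and collapsing records that share it. The sketch below is an assumption-laden toy (field names, normalization rule, and the choice to match on email are all illustrative); real-world deduplication typically also uses fuzzy matching on names and addresses.

```python
def deduplicate(records, key="email"):
    """Keep the first record for each normalized key value."""
    seen = set()
    unique = []
    for rec in records:
        normalized = rec.get(key, "").strip().lower()
        if normalized in seen:
            continue  # duplicate: already kept an earlier record
        if normalized:
            seen.add(normalized)
        unique.append(rec)   # records with no key are kept, never dropped
    return unique

customers = [
    {"name": "Alice Martin", "email": "alice@example.com"},
    {"name": "A. Martin", "email": "ALICE@example.com "},  # same person, duplicate
    {"name": "Bob Durand", "email": "bob@example.com"},
]
print(deduplicate(customers))  # keeps Alice once, plus Bob
```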
If you choose a Data Quality Management tool, make sure it includes a module that automates the use of metadata. Centralizing everything you know about your data within a single interface makes that knowledge much easier to work with. This is the second pillar of your Data Quality Management project.
Precisely defining your data and its taxonomy allows you to engage the quality optimization process efficiently. Then, once your data has been clearly identified and classified, the next step is to put it into perspective with the expectations of the various business lines within the company in order to assess its quality.
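Such a metadata catalog can start out very simply: one entry per dataset, recording its definition, owner, and taxonomy tags, so quality work begins from a shared, searchable inventory. All the names below are illustrative assumptions, not a product's actual schema.

```python
# A toy metadata catalog: each entry centralizes what is known about a dataset.
catalog = {
    "customers": {
        "definition": "Active customer master records",
        "owner": "CRM team",
        "taxonomy": ["customer", "personal-data"],
    },
    "suppliers": {
        "definition": "Approved supplier directory",
        "owner": "Procurement",
        "taxonomy": ["supplier"],
    },
}

def find_by_tag(catalog, tag):
    """List dataset names carrying a given taxonomy tag."""
    return [name for name, meta in catalog.items() if tag in meta["taxonomy"]]

print(find_by_tag(catalog, "personal-data"))  # → ['customers']
```

Tagging datasets this way also prepares the sensitivity question discussed next: a "personal-data" tag immediately identifies which datasets fall under regulatory obligations.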
This reconciliation between the nature of the available data and its use by the business lines is a decisive element of Data Quality Management. But you also need to go further and question the sensitivity of the data. Whether or not data is treated as sensitive depends on the choices you make with respect to regulatory compliance.
Since the GDPR came into force in 2018, the consequences of risky choices in terms of data security have been severe, and not only from a financial point of view. Indeed, your customers are now highly sensitive to the nature, use, and protection of the data they share with you.
By effectively managing Data Quality, you also contribute to maintaining trust with your customers… And customer trust is priceless!