More than 70% of revenue leaders surveyed in InsideView's 2020 Alignment Report rank data management as their highest priority, yet a Harvard Business Review study estimates that only 3% of companies' data meets basic quality standards.
There is a major gap between what companies want in terms of data quality and what they are doing to fix it.
The first step to any data management plan is to test the quality of data and identify some of the core issues that lead to poor data quality.
Here’s a quick checklist to help IT managers, business managers, and decision-makers analyze the quality of their data and determine what they can do to make it accurate and reliable.
What is Data Quality and Why Does it Matter?
Before we delve into the checklist, here’s a quick briefing on what data quality is and why it matters.
There is no single definition of data quality, and to give one would limit the scope of data itself. There are, however, benchmarks that can be used to assess the state of your data. For instance, data of high quality would mean:
- It’s error-free. No typos, no format or structure issues.
- It’s consolidated. Data is not scattered over different systems.
- It’s unique. It is not duplicated.
- It’s timely. The data is not obsolete.
- It’s accurate. You can rely on this data to make business decisions.
Your data doesn’t have to meet every one of these criteria at once (though it helps). Maintaining data quality is a consistent, ongoing effort.
As for why it matters? We’ve covered the answer to this question extensively in a separate post.
To summarize, it matters because:
- Your business is losing money for every inaccurate data field
- Your direct mail & marketing campaigns incur unnecessary costs for every wrong address data field
- You’re making business decisions based on flawed data
- You’re not receiving accurate insights
- Your data is obsolete and does not fulfill its intended purpose
Put simply, poor data, left neglected, impacts every aspect of your business – from sales and marketing to customer support, customer service, and team efficiency. In recent years, data quality has ceased to be a backburner process. It affects businesses drastically, which makes it all the more important to treat data quality as a burning issue that needs a resolution before it endangers the growth plan of a business.
How Do You Check Your Data for Quality?
In traditional data warehouse environments, a data quality test is a manual verification process where users verify field data types and lengths to identify null or incorrect values. SQL scripts are the most common method used to perform data quality checks. However, code-based solutions are proving to be cumbersome, inefficient, and ineffective in managing and controlling poor data.
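To make this concrete, here is a minimal sketch of what such a code-based check looks like in practice. The records, field names, and length limits below are all hypothetical, and real tests would typically run as SQL against the warehouse itself:

```python
# A minimal sketch of a code-based data quality check: flag records with
# null/missing values or fields that exceed an expected length.
# All field names and limits here are made-up examples.
def check_records(records, required_fields, max_lengths):
    """Return (row, field, issue) tuples for basic quality problems."""
    issues = []
    for i, rec in enumerate(records):
        for field in required_fields:
            value = rec.get(field)
            if value is None or str(value).strip() == "":
                issues.append((i, field, "null or missing value"))
            elif field in max_lengths and len(str(value)) > max_lengths[field]:
                issues.append((i, field, "value exceeds expected length"))
    return issues

records = [
    {"name": "Jane Doe", "email": "jane@example.com", "zip": "02139"},
    {"name": "", "email": None, "zip": "021390000000"},
]
problems = check_records(records, ["name", "email", "zip"], {"zip": 10})
for row, field, issue in problems:
    print(f"record {row}: {field} -> {issue}")
```

Even this toy version hints at the scaling problem: every new field, format, or rule means more hand-written code to maintain.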
Codoid has a detailed post on data quality checks in a data warehouse if you’d like to know more.
Before extracting your lists and running scripts to check for errors, you’ll need to do a complete assessment of your current data status, including analyzing your goals.
Here’s a checklist to help you out:
1. Define What You Want From Your Data
While all businesses want accurate insights from their data, that goal alone is not specific enough.
For instance, firms in the finance industry have data quality goals dictated by regulatory compliance reporting.
Organizations in healthcare have data quality goals dictated by the demands of accurate medical record-keeping and research studies.
Organizations in retail and manufacturing would have data quality goals dictated by customer experience.
B2B businesses would be more focused on CRM data. B2C on customer behavioral data.
Data quality is business-, industry-, and function-specific. Once you know your main focus, you’ll be in a better position to create informed goals.
2. Assess Your Data Based on Ground Realities
It’s easy to create blanket goals, but often companies end up with catastrophic failures when they jump into investments without assessing ground realities.
Part of data quality testing lies in understanding the constraints and limitations that prevent your firm from achieving the desired level of accuracy.
Some factors to look out for include:
- Your current business model (startup, small business, mid-large business, enterprise)
- How your company handles data (the types of apps or systems used to collect/store data)
- Whether you have data policies/standards in place
- The people/teams involved in managing data and whether they are trained in handling modern data in terms of volume, veracity, variety and velocity
- The systems you have in place (such as a legacy system that causes bottlenecks in your day-to-day operations)
- Current constraints, such as C-suite executive buy-in, culture and attitudes towards data, dependencies on IT, and pain points of business users
- Methods used by your team to deal with issues arising from data errors
These and other business-specific factors must be taken into account when performing a data quality audit.
3. Choose Machine-Learning-Powered Solutions Over Manual Processes
In modern data environments, code-based data quality tests are no longer effective. At most firms, it would take a human at least 12 hours to perform basic data quality testing. In working-hour terms, it might take 4 to 5 days just to get a report on basic errors like null or missing values, incorrect information, etc. This does not take into account serious problems like identity/entity resolution or the merging of multiple lists and records.
Luckily, most ML-based data preparation solutions today allow businesses to easily check the quality of their data with a few easy steps. You’re choosing between 2 minutes vs 12 hours. And the choice doesn’t have to be daunting.
Best-in-class solutions like DataMatch Enterprise allow free trials that you can benefit from. All you have to do is plug in your data source and let the software guide you through the process. You’ll be surprised at the hours and manual effort you’d be saving your team with an automated solution that also delivers 100% more accurate results than manual methods.
4. Assess the Impact of Poor Data
Convincing your C-executives to take data quality issues seriously will require clarity in terms of the impact of poor data.
For instance, you can begin by checking:
- The number of returned mail per month
- The ratio of poor data to good data (e.g., 3 bad records for every good record)
- The number of incomplete or obsolete contact records in a month’s data
- The number of email bounces
- The number of customer complaints
- The amount of time each sales/marketing rep spends manually fixing data per day/month
- The cost of losing potential business because of inaccurate data
Knowing the cost will help you focus on remediations that lower costs and increase ROI.
5. Profile Your Data to Know How Bad the Quality of Data Is
You could manually profile your data to check for basic errors and inconsistencies, but manual efforts based on outdated business rules cannot give you an accurate view of your data problems.
Instead, you can easily use DataMatch Enterprise’s trial version to review problems within your data and get a list of problems.
When you profile data using DME’s built-in profiler, you’ll discover issues like:
- Missing information
- Standardization and formatting problems
- Negative spacing between letters
- Use of punctuation in text fields
- Use of numbers in text fields and vice versa
- Number of missing values
- Min/Max statistics
And many other issues based on pre-defined business rules. You also have the option of creating custom rules using an exclusive Pattern Builder tool to profile your data according to function-specific requirements.
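To illustrate the categories of checks listed above, here is a stdlib-only sketch of field-level profiling. It is not how DataMatch Enterprise works internally (those rules are proprietary); the field values are hypothetical:

```python
# A minimal field-profiling sketch: count missing values, values containing
# punctuation, values containing digits, and min/max for numeric values.
import string

def profile_field(values):
    present = [v for v in values if v not in (None, "")]
    missing = len(values) - len(present)
    has_punct = sum(1 for v in present
                    if any(ch in string.punctuation for ch in str(v)))
    has_digits = sum(1 for v in present
                     if any(ch.isdigit() for ch in str(v)))
    numeric = [float(v) for v in present
               if str(v).replace(".", "", 1).isdigit()]
    return {
        "missing": missing,
        "punctuation": has_punct,
        "contains_digits": has_digits,
        "min": min(numeric) if numeric else None,
        "max": max(numeric) if numeric else None,
    }

names = ["Alice", "B0b", None, "O'Brien", ""]
stats = profile_field(names)
print(stats)
```

Running this over a name field immediately surfaces the kinds of findings described above: two missing values, one value with punctuation, and one with a digit where a letter belongs.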
6. Assess the Results from Profiling & Decide on the Next Course of Action
After you’ve used a solution to profile and check for data quality issues, you’ll need to decide on the next steps to fix the problems. For instance, if you’ve discovered most of your address data is invalid or has format issues, you might want to fix that as a priority if your organization runs direct mail campaigns. Or you might have discovered most of the data is duplicated. Your next step would be to de-duplicate the data and consolidate your lists.
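As a simplified illustration of the de-duplication step, the sketch below normalizes a few key fields into a match key and keeps the first record per key. Real entity resolution relies on fuzzy matching across variations; exact-key matching on hypothetical contacts is shown here only to make the idea concrete:

```python
# A simplified de-duplication sketch: normalize key fields (trim, lowercase)
# into a match key and keep the first record seen for each key.
def dedupe(records, key_fields):
    seen, unique = set(), []
    for rec in records:
        key = tuple(str(rec.get(f, "")).strip().lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

contacts = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "JANE DOE ", "email": "Jane@Example.com"},
    {"name": "John Roe", "email": "john@example.com"},
]
deduped = dedupe(contacts, ["name", "email"])
print(len(deduped))  # 2 unique contacts remain
```

Note how even trivial normalization (trimming whitespace, lowercasing) collapses two surface-level variants of the same contact; production tools go much further with phonetic and fuzzy matching.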
You might also want to clean your most recent data and remove all the inconsistencies affecting the accuracy of your data.
You can even implement a data quality management plan that includes your front-end and back-end data acquisition and processing systems. Here’s a detailed guide on how to implement a data quality management system in your organization.
Data quality doesn’t have a definite goal, and you’re not required to turn your data inside out and scrub it clean to achieve quality. Instead, you’re required to understand what kind of quality issue impacts your business the most and use an available solution to fix that problem. Eventually, as you begin fixing problems one by one, you’ll also be implementing a DQM infrastructure along the way.
To Conclude – Test Your Data Quality Before It Gets Too Late
Most companies don’t engage in data quality tests until one becomes critical for a data migration or a merger, but by then it’s often too late to undo the damage caused by poor data. Test your data quality, define your criteria, and set benchmarks to drive improvement.