With the proliferation of business information systems, data has become the raw material of the information economy. Given how heavily organizations rely on the accuracy of the data in their information systems, it is surprising how many have no active data quality management plan in place. Even more disconcerting, many organizations do not consider data quality an issue they need to worry about. Consider these statistics:
- According to research by Experian, only 14% of organizations worldwide have completely accurate data, and 73% recognize that data inaccuracies affect their bottom line.
- Further research suggests that poor data quality costs the typical company at least 10% of revenue (6% alone resulting from poor customer data).
- The Data Warehousing Institute (TDWI) estimates that poor-quality customer data costs U.S. businesses a staggering $611 billion a year.
In most cases, organizations don’t even realize that they are losing revenue or increasing costs. The effect can be small or crisis-level, and organizations that do not have effective data governance often do not catch inaccurate data until it is too late.
In addition to the bottom line, other consequences of inaccurate data include less-effective decision-making and a reduced ability to make and execute strategy. A recent Gartner survey found that poor-quality data is the second biggest problem associated with business intelligence and management reporting. “Companies routinely make decisions based on remarkably inaccurate or incomplete data,” the survey found, and poor-quality data is “a leading cause of the failure of high-profile and high-cost IT projects.”
Data quality is quickly turning into a competitive advantage as businesses strive to gain ever more visibility into their operations. The ability to adapt to key events in the business lifecycle, such as mergers and acquisitions, expansion into new markets or industries, and new regulations, often centers on an organization's ability to quickly and accurately consolidate and report the information needed for decision-making. In addition, regulators increasingly expect organizations to comply with regulations in shorter time frames. The IBM Data Governance Council even predicts that data governance will become a regulatory requirement in a growing number of countries and organizations over the next four years.
Organizations without existing data quality programs are finding that data cleanup can be a significantly daunting (and expensive) task, especially for those organizations with multiple business segments and information systems. Time certainly isn’t on any organization’s side when it comes to data quality. One problem with data is that its quality quickly degenerates over time. Experts say that 2% of records in a customer file become obsolete in a month. In addition, there are data-entry errors, data conversion errors, interface errors, system migrations and changes to source systems that generate additional data inaccuracies. Proponents of data governance often cite the 1-10-100 rule: It takes $1 to verify a record as it is entered, $10 to cleanse it after the fact, and $100 if nothing is done, as the ramifications of the mistakes proliferate.
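The arithmetic behind the 1-10-100 rule and the 2% monthly decay figure can be illustrated with a minimal sketch. The record count and error rate below are hypothetical, chosen only to show how quickly the costs diverge:

```python
# Hypothetical illustration of the 1-10-100 rule cited above.
RECORDS = 10_000       # assumed records entered per year (illustrative)
ERROR_RATE = 0.05      # assumed share of records entered with errors (illustrative)

errors = RECORDS * ERROR_RATE

cost_verify = errors * 1      # $1 per record: verify as it is entered
cost_cleanse = errors * 10    # $10 per record: cleanse after the fact
cost_ignore = errors * 100    # $100 per record: do nothing and absorb the fallout

print(f"Verify at entry: ${cost_verify:,.0f}")
print(f"Cleanse later:   ${cost_cleanse:,.0f}")
print(f"Do nothing:      ${cost_ignore:,.0f}")

# The 2% monthly obsolescence figure compounds over time: after a year,
# only 0.98^12 of a customer file (roughly 78%) remains current.
still_current = 0.98 ** 12
print(f"Records still current after 12 months: {still_current:.1%}")
```

Even at these modest assumed volumes, the gap between verifying at entry and doing nothing is two orders of magnitude, which is the core argument for catching errors at the point of capture.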
The key barrier to maintaining data accuracy is often perceived to be lack of time and internal resources, and most organizations conclude that poor-quality data is just a fact of life. This is simply not the case. The costs associated with data quality management can be easily justified. Even small organizations have a great deal to gain. Data quality management is easiest and least expensive at small organizations with few information systems. However, as organizations expand, the ability to retroactively fix data quality issues becomes a much more costly proposition. A proactive approach and investment in data quality today will invariably pay dividends down the road, regardless of organization size.
The first step toward effective data quality management is to change the culture of the organization so that it recognizes that data quality requires the involvement and cooperation of the entire enterprise. Fixing data quality issues is not a one-time effort, but rather an ongoing program to manage the people, processes and policies required to create and maintain a consistent enterprise view of an organization's data. This usually requires identifying an individual within the organization who carries the delegated authority of the CEO to lead a data governance initiative. The next step is to survey the situation in your organization to determine how pervasive data quality issues are. Finally, develop and implement a data governance strategy, focusing first on baseline cleansing of data and then shifting toward maintaining consistent levels of data quality. Third-party assurance providers can also play an integral role in data governance initiatives, since they provide objective data quality assessments and guidance on data governance best practices. The task of meeting data quality objectives may seem daunting at first, but the benefits of reliable business information will be felt in your organization for many years to come.
For more information on how data quality management can benefit your organization's bottom line, please contact Eric Wright.
Schneider Downs provides accounting, tax and business advisory services through innovative thought leaders who deliver the expertise to meet the individual needs of each client. Our offices are located in Pittsburgh, PA, and Columbus, OH.