Avoid Poor Quality Data-induced Disasters


Organisations lose an average of $12.9 million every year to poor data quality. Experts outline the best processes to get one’s data house in order and avoid the risky impacts of poor-quality data.

Retail giant Target had to pay $18.5 million in settlements after experiencing a massive data breach that affected 40 million of its customers. Experts said poor data quality management and lax security protocols were the root cause of this enormous data blunder that shook retailers in 2013.

In 2016, Wells Fargo employees were found to have opened millions of fake customer accounts without customers’ knowledge or consent, partly due to poor data quality and oversight. This resulted in the American MNC paying $3 billion in fines and legal settlements.

At the root of both cases was poor data quality management: data that is inaccurate, incomplete, inconsistent or unreliable. With such data, organisations are left with incorrect insights and decisions, forcing them to waste resources.

Human error, outdated or incompatible systems, lack of standardisation and improper data management practices are the most common reasons behind poor-quality data.

Organisations suffer an immediate impact on revenue, and in the long term, poor-quality data adds complexity to data ecosystems, resulting in poor decision-making.

Poor Data Quality Hurts Both Ends

Besides the negative consequences for organisations, inefficient data management also creates a disconnected experience for consumers. There have been several examples of customers receiving multiple offers via emails or push notifications after refinancing their homes. 

Brett House, Global Vice President of Marketing Solutions at Neustar, a TransUnion company, says real-time data is a more significant challenge for brands than most people give it credit for. These challenges start with internal data silos. Brands often have separate teams handling their data lakes or environments across channels, each following different metrics or key performance indicators, resulting in disconnected internal teams.

Teams rarely share data, analytics and information consistently or coherently across the organisation. “Product-level teams within financial services handle products like mortgages or credit cards in a direct response manner. A few other teams oversee their marketing initiatives with a different data set. Due to this, consumers often get a disconnected experience, and brands struggle to get relevant information about where their consumers are in their customer journey,” House adds.

Actionable real-time data starts with getting your data house in order. Once organisations have a coherent, unified data and identity strategy, they must ensure that their customers’ first-party data is clean, accurate, and up-to-date.

“The real-time actionability around that data is minimal if the data isn’t good. Sixty to 70% of data will go out of date within six months, as data includes multiple email addresses, phone numbers, and IDs like mobile ad IDs and cookies. It starts with aligning the data across the organisation and organising the data house.”
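The decay House describes can be made concrete with a simple freshness check. The sketch below is purely illustrative, and the record structure, field names, and six-month threshold are assumptions, not anything prescribed by Neustar:

```python
from datetime import datetime, timedelta

# Assumed freshness window, based on the article's observation that
# 60-70% of data goes out of date within roughly six months.
STALE_AFTER = timedelta(days=183)

def flag_stale(records, now=None):
    """Split contact records into fresh and stale lists based on a
    hypothetical 'last_verified' timestamp on each record."""
    now = now or datetime.utcnow()
    fresh, stale = [], []
    for rec in records:
        if now - rec["last_verified"] <= STALE_AFTER:
            fresh.append(rec)
        else:
            stale.append(rec)
    return fresh, stale
```

A periodic job like this would route stale records to re-verification instead of letting them feed real-time targeting.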

Getting The Data House In Order

What is the optimal way to get one’s data house in order? House suggests developing a multichannel strategy for segmentation, audience creation and targeting. Organisations must align their offline and online targeting and reach consumers through a standard, unified view of their consumer data.

When data is of poor quality, it can lead to incorrect or misleading insights and decisions, wasted resources and increased risk. Organisations must have proper data governance practices to ensure good data quality, including data profiling, cleansing, validation, quality checks, and ongoing monitoring. Ensuring high-quality data helps organisations make more informed decisions, improve operational efficiency, and achieve better business outcomes.
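The governance practices listed above — profiling, validation, quality checks — can be sketched as a minimal routine. This is an assumed illustration, not any vendor’s tooling; the field names and rules are hypothetical:

```python
import re

# Very loose email shape check for illustration only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_report(records, required=("name", "email")):
    """Return a list of (record_index, issue) pairs covering three of
    the checks the article names: completeness, validity, duplicates."""
    issues = []
    seen_emails = set()
    for i, rec in enumerate(records):
        # Completeness: every required field must be present and non-empty.
        for field in required:
            if not rec.get(field):
                issues.append((i, f"missing {field}"))
        email = rec.get("email", "")
        if email:
            # Validity: flag malformed addresses.
            if not EMAIL_RE.match(email):
                issues.append((i, "invalid email"))
            # Duplicates: flag repeated addresses.
            if email in seen_emails:
                issues.append((i, "duplicate email"))
            seen_emails.add(email)
    return issues
```

Running such a report on ingest, rather than downstream, matches the advice later in the article to fix quality at the source.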

Kulani Likotsi, Head of Data Management and Data Governance at Standard Bank South Africa, shared pointers on how leaders can set up systems to identify data quality issues. She advises getting executive sponsorship or buy-in on data as an enterprise asset, driving home the importance of data quality and how accurate data enables better decision-making. “Organisations should develop processes to regularly assess data quality and address inconsistencies, ensuring data governance policies and procedures are followed. They should always focus on fixing data quality at its source to keep a single version of the truth and avoid the need for data clean-up downstream. Lastly, invest in tools that can help manage data quality,” says Likotsi.

“Methods like metadata management, cataloguing, creating a data dictionary and a business glossary help document the data to transform it into an asset,” says Kamran Ahmed, Vice President of Data Quality at The Saudi National Bank. Ahmed explains that data governance and stewardship play a broader role; when you add industry knowledge to data, it changes the value your products can offer customers. “Someone in the loans department needs to know where to find the correct contextual data to make the right decision for the business,” he adds.
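A data dictionary of the kind Ahmed describes can be as simple as a structured mapping from field names to business context. The entries, field names, and stewards below are invented for illustration; the point is that someone in, say, a loans department can look up what a field means and who owns it:

```python
# Hypothetical data dictionary entries; structure and content are
# assumptions for illustration, not any bank's actual schema.
DATA_DICTIONARY = {
    "customer_id": {
        "definition": "Unique identifier assigned at account opening",
        "type": "string",
        "steward": "Customer Data team",
    },
    "outstanding_balance": {
        "definition": "Unpaid loan principal plus accrued interest",
        "type": "decimal",
        "steward": "Loans department",
    },
}

def describe(field):
    """Return a human-readable description of a field, or a warning
    when the field is undocumented."""
    entry = DATA_DICTIONARY.get(field)
    if entry is None:
        return f"{field}: undocumented - treat with caution"
    return f"{field}: {entry['definition']} (steward: {entry['steward']})"
```

Flagging undocumented fields explicitly, as `describe` does, is one way to surface gaps in the business glossary before they cause bad decisions.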

The main challenge, says Ahmed, is that data is documented initially, but once it is operationalised and many cross-functional departments are using or updating it, there is no control over the changes the data goes through. In other cases, the data isn’t updated even though circumstances have changed.

In such cases, lineage documentation helps teams understand the data’s source and how and when it has been transformed. That documentation makes it easy to judge whether the data is trustworthy or dated.
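A minimal lineage log captures exactly the two things the paragraph above calls for: where the data came from and when each transformation happened. The class and entry schema below are assumptions for illustration, not a standard lineage format:

```python
from datetime import datetime

class LineageLog:
    """Append-only record of a dataset's origin and transformations,
    so anyone downstream can see whether the data is current."""

    def __init__(self, source, now=None):
        # First entry always records the original source.
        self.entries = [{"step": "ingested", "detail": source,
                         "at": now or datetime.utcnow()}]

    def record(self, step, detail, now=None):
        """Append one transformation event with a timestamp."""
        self.entries.append({"step": step, "detail": detail,
                             "at": now or datetime.utcnow()})

    def history(self):
        """Render the lineage as readable lines, oldest first."""
        return [f"{e['at']:%Y-%m-%d %H:%M} {e['step']}: {e['detail']}"
                for e in self.entries]
```

A reviewer deciding whether to trust a dataset can then read its history instead of guessing when it was last touched.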