Bad Data: It's Costing Your Credit Union
Learn five common data quality saboteurs and how to be proactive in getting to the root of the problem.
Most credit unions are eager to put their data to work – and when done right, it certainly can work magic. But first, a credit union must work for the data.
“Working for the data” means more than pulling it into an analytics platform. Before data can work its magic, a credit union must take action to improve its quality – and there’s a very real, bottom-line reason why. Research and consulting firm Gartner estimates that poor data quality costs the average organization $12.9 million every year. In addition to the immediate impacts on revenue, the compounding effects of poor data quality ultimately lead to poor data analytics – and then, poor decision-making based on those analytics.
It pays (literally) to be proactive. With money, time and insights on the line, here are common causes of most credit unions’ bad data woes. Are they a factor for your credit union?
5 Common Root Causes of Bad Data at Credit Unions
Bad data happens at every organization. Though it’s inevitable, there’s a lot to be done to enhance and maintain data quality. Here are common data saboteurs and how to get at the source of the problem.
1. Incomplete or wrong data from members or employees. When inputting information, most members – and even employees – don’t think about how the data will be used, or how long it might live. If a member misses a detail on a loan application, for example, forgetting to enter a zip code, a staff member might input “11111” to fill in the required field. Though it wasn’t done with bad intent, this creates a data error that will be propagated throughout the system. A simple mistake, especially when repeated, becomes an impediment to data accuracy, segmentation and targeting.
Credit unions can counter this with a data-led culture where employees understand how data is used and why its quality matters. That understanding translates into better data stewardship throughout the organization. When collecting data directly from members, credit unions can incorporate auto-population features that make form fills both more accurate and more convenient.
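The “11111” placeholder example above is exactly the kind of error an automated check can catch. Here is a minimal sketch of such a rule in Python; the placeholder values and the `zip` field name are illustrative assumptions, not from any particular core system.

```python
# Illustrative placeholder values staff sometimes key in to satisfy a
# required field; the exact set would come from your own data profiling.
PLACEHOLDER_ZIPS = {"00000", "11111", "99999"}

def flag_suspect_zip(record: dict) -> list[str]:
    """Return a list of data-quality issues found in a member record's ZIP."""
    issues = []
    zip_code = (record.get("zip") or "").strip()
    if not zip_code:
        issues.append("missing ZIP code")
    elif not (zip_code.isdigit() and len(zip_code) == 5):
        issues.append(f"malformed ZIP code: {zip_code!r}")
    elif zip_code in PLACEHOLDER_ZIPS:
        issues.append(f"likely placeholder ZIP code: {zip_code!r}")
    return issues
```

Running a check like this nightly over new records turns a silent data error into a work item someone can actually fix.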
2. Poor data in vendor and third-party files. Often credit unions will obtain data from additional sources to augment their own. This adds more life and detail to the picture of members and financial trends, but the additional insights will only be as good as the underlying data – and other parties may not hold themselves to the same data quality standards as your credit union.
Investigate opportunities to incorporate clear standards into contracts with vendors. It may be possible to address certain issues up front, before the data is integrated into your system.
When extracting data from public sources, such as the U.S. Census Bureau or the Federal Reserve, you get what you get. It’s the credit union’s responsibility to check and validate the information by implementing business data rules. If this seems overwhelming, there are technology tools available to help. You might also decide initially to focus on just a few key data fields, writing additional rules over time that continue improving the data.
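The “start with a few key fields, add rules over time” approach can be sketched as a small, named set of business data rules. The field names and thresholds below are hypothetical examples; the point is that each rule is simple, named, and easy to extend.

```python
# Each rule takes one record (a dict) and returns True if the record passes.
# These three are illustrative; a real rule set grows field by field.
def rule_zip_present(row: dict) -> bool:
    return bool(row.get("zip"))

def rule_birth_year_plausible(row: dict) -> bool:
    year = row.get("birth_year")
    return year is not None and 1900 <= year <= 2025  # assumed bounds

def rule_balance_nonnegative(row: dict) -> bool:
    return row.get("share_balance", 0) >= 0

RULES = {
    "zip_present": rule_zip_present,
    "birth_year_plausible": rule_birth_year_plausible,
    "balance_nonnegative": rule_balance_nonnegative,
}

def validate(row: dict) -> list[str]:
    """Return the names of rules the record fails; an empty list means clean."""
    return [name for name, rule in RULES.items() if not rule(row)]
```

Because the rules live in one dictionary, adding a new check later is a one-line change rather than a rewrite.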
3. Lack of data standardization. I had a situation recently where I was in a service provider’s system twice – once with my middle initial and once without. Though both of my records contained the same Social Security number (which should have been a big red flag!), the lack of data standardization allowed two separate accounts to be created. This resulted in a lot of confusion when I had an issue that needed to be resolved.
That’s not an experience any credit union wants its members to have. You can help avoid this by creating a cross-functional business team that develops uniform data standards and communicates them to IT. Leaving this task solely to the IT department – or even to a single individual – can limit perspective on how the data will be used.
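The duplicate-account scenario above can also be caught after the fact with a simple check that groups accounts by a shared identifier. This is a minimal sketch; the field names are assumptions, and a production system would compare hashed identifiers rather than raw Social Security numbers.

```python
from collections import defaultdict

def find_ssn_duplicates(accounts: list[dict]) -> dict[str, list[str]]:
    """Group account IDs by SSN and return only SSNs tied to multiple accounts."""
    by_ssn: dict[str, list[str]] = defaultdict(list)
    for acct in accounts:
        by_ssn[acct["ssn"]].append(acct["account_id"])
    # Any SSN with more than one account ID is a candidate duplicate to review.
    return {ssn: ids for ssn, ids in by_ssn.items() if len(ids) > 1}
```

A report like this gives the cross-functional standards team concrete cases to merge, rather than an abstract mandate.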
4. Formula errors. If your credit union relies on a formula to generate data – a mortgage principal-and-interest (P&I) calculation, for example – always check and validate the results. Even a small error can wreak havoc on the accuracy of a credit union’s financial data and the decisions based on it, and in this particular example, it can also damage the member relationship.
A quality assurance team can be tasked with testing others’ calculations to ensure the expected result is generated. If it isn’t, the team can investigate the problem, design a solution and implement leading practices that minimize future problems.
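As an illustration, here is the standard amortization formula for a monthly P&I payment, paired with the kind of known-good check a QA team might run. The loan figures are illustrative.

```python
def monthly_pi(principal: float, annual_rate: float, years: int) -> float:
    """Monthly P&I payment: P * r(1+r)^n / ((1+r)^n - 1), rounded to cents."""
    n = years * 12
    if annual_rate == 0:
        return round(principal / n, 2)
    r = annual_rate / 12  # monthly interest rate
    factor = (1 + r) ** n
    return round(principal * r * factor / (factor - 1), 2)

# QA-style check against a widely published reference value:
# a $200,000 loan at 6% for 30 years amortizes to $1,199.10/month.
assert abs(monthly_pi(200_000, 0.06, 30) - 1199.10) <= 0.01
```

A handful of assertions like this, run automatically whenever the formula changes, catches the “small error” before it reaches a member’s statement.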
5. Natural data decay. Data has a lifespan: the older a record is, the more likely its elements have become stale. Over time, members change addresses, names, membership status and the financial products they use. No credit union wants to make decisions based on information that is no longer true.
Ensure your members have easy ways to update their personal information when it changes, and periodically prompt them to confirm your records are up-to-date. Set up processes to regularly validate and reconcile data, creating rules that establish the most trusted source when there’s a discrepancy. For example, if a member has submitted a change of address, that should always trump a third-party data source that may still have an old address on file.
The common thread running through all of these problems and solutions is proactivity. A “people, processes and technology” approach will make achieving and maintaining higher data quality more manageable, while mitigating the very real costs of bad data. Data quality is a journey – not a destination. Ongoing data quality work, combined with other data management practices, will not only have a bottom-line impact today, but pay dividends for the life of your data.
Merrill Albert is Data Services Delivery Director for the Tampa, Fla.-based Trellance, a business analytics solutions provider for credit unions and banks.