Data Quality (DQ) has always been on the agenda of IT managers, but it took a back seat due to budget constraints and other critical mandates. It is now once again a central theme. Owing to regulatory compliance requirements, the push for enterprise risk management, and the need to control operating costs, financial institutions are gearing up to put data quality management back into practice. It is no longer a secondary activity but an enterprise-wide agenda: achieving data quality maturity through proper governance and the deployment of quality measures.
Data managers are struggling to answer key questions about their data environment, including but not limited to:
- Are we prepared for the next wave of regulatory requirements?
- Are our amendment request volumes above the industry benchmark?
- Are risk reporting and the alignment of risk with finance reference data accurate?
The responses are consistent across financial institutions: either they have partial answers to these questions but lack confidence in them, or they recognize that much more needs to be done to achieve the expected level of quality.
Historically, organizations benchmarked data quality against principles and metrics such as:
- Creation of a golden copy of reference data domains
- Limiting the number of processing breaks below a certain threshold
- Tracking the number of manual adjustments per month
These criteria are no longer adequate, as even a small percentage of bad data can have a significant negative impact on reported values or regulatory compliance. It is therefore crucial to have a clear view of data quality: the ability to drill down to the lowest level of granularity, backed by well-defined measurement criteria.
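As an illustration of what granular, well-defined measurement can look like, the sketch below (in Python, using pandas; all field names and sample values are assumptions, not drawn from any institution) scores completeness per field while preserving the ability to drill down to the exact records behind a failing metric.

```python
# Minimal sketch of granular data quality measurement: completeness is
# scored per field, and an aggregate figure can be drilled down to the
# individual offending rows. Fields and sample data are hypothetical.
import pandas as pd  # assumed available; any tabular library would do

# Hypothetical client reference data with gaps.
df = pd.DataFrame({
    "client_id": ["C1", "C2", "C3", "C4"],
    "lei":       ["5493001KJTIIGC8Y1R12", None, "5493001KJTIIGC8Y1R17", None],
    "domicile":  ["US", "GB", None, "DE"],
})

# Aggregate view: completeness per field.
field_completeness = df.notna().mean()
print(field_completeness)          # e.g. lei = 0.50, domicile = 0.75

# Drill-down view: the exact records behind a failing metric.
incomplete = df[df.isna().any(axis=1)]
print(incomplete)                  # rows C2, C3, C4 need remediation
```

The design choice is that the aggregate score and the drill-down view are computed from the same data, so a reported figure can always be traced back to specific records rather than standing as an unverifiable summary.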
This paper discusses the ways financial institutions are preparing to better manage data to meet their business needs.
The challenges in managing data quality are multi-dimensional and can be attributed to factors such as the geographical spread of the organization, the magnitude of its IT infrastructure, and the diversification of its business. Though some financial institutions already have specific solutions to address issues caused by bad data quality, most are still in fact-finding mode.
Some of the key challenges that the financial services industry is striving to resolve are:
Data Governance is in Flux
Data governance at most financial institutions exists on paper alone and is not practiced in the true sense, which has resulted in weak data controls and data stewardship. According to the CEB TowerGroup Data Management Systems Technology Analysis, 40% of data managers agree that the biggest challenge is the lack of maturity in data governance.
Most reference data domains are, at best, only partially adopted as a golden copy, which results in data inconsistency among consumers. Furthermore, reference data domains such as product, book, and client, which require a consistent hierarchy, classification, and definition across the front office, operations, and control functions, need to be merged into a single structure.
Wealth management firms and investment banks still maintain separate data sets, resulting in duplicated effort and higher costs.
Lack of Transparency
Organizations do not have full visibility into the current state of their data quality issues, which hinders the creation of a foolproof roadmap for DQ remediation programs.
Manual adjustment of data is a common practice across the industry; however, the magnitude of manual adjustments is itself an indicator of data quality. As per the CEB TowerGroup Data Management Systems Technology Analysis, 38% of managers feel that frequent manual intervention is required for error-free data, and such frequent intervention indicates that less than 75% of the workflow is error-free.
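As a minimal sketch of treating manual-adjustment volume as a quality indicator, the hypothetical Python snippet below derives an error-free workflow rate from monthly adjustment counts; the 75% threshold echoes the figure cited above, while the volumes are invented for illustration.

```python
# Hypothetical monthly processing volumes and manual adjustment counts.
monthly = {
    "jan": {"processed": 120_000, "manual_adjustments": 36_000},
    "feb": {"processed": 110_000, "manual_adjustments": 22_000},
}

THRESHOLD = 0.75  # below this, intervention counts as "frequent"

for month, stats in monthly.items():
    error_free = 1 - stats["manual_adjustments"] / stats["processed"]
    flag = "frequent intervention" if error_free < THRESHOLD else "acceptable"
    print(f"{month}: {error_free:.0%} error-free ({flag})")
```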
Most organizations keep a fungible operations budget for unplanned activities such as regulatory changes, data breaks, or business continuity planning (BCP). Since regulatory compliance is the key focus, it consumes most of the budget and resources; as a result, most data quality initiatives run in an ad-hoc mode for lack of a dedicated budget.
Major Industry Initiatives
A majority of financial institutions spend a substantial portion of their operations budget on reactive data quality management. Lately, managers have come to recognize automation and the realignment of operational processes as key elements of proactive data quality management.
The intent of proactive data quality management is to reduce human dependency, prevent errors, measure data quality, and drive continuous improvement. Organizations are re-architecting their infrastructure by introducing focused DQ technology themes such as workflow management, rules management, exception reporting, data entry controls, and reconciliation tools into the data processing cycle. This framework acts more as a watchdog that pinpoints and reports quality issues than as a remediation tool. As a result, financial institutions have been able to cut sizable operational costs and improve quality with the implementation of each technology theme.
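As a minimal sketch of the rules-management and exception-reporting themes, the hypothetical Python example below validates reference data records against declarative rules and reports exceptions rather than correcting them, reflecting the watchdog role described above; every rule, field, and sample record here is an illustrative assumption, not a reference to any specific vendor tool.

```python
# Minimal sketch of a rules-based data quality "watchdog": it flags and
# reports exceptions but does not remediate. All rule names, fields, and
# sample records are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    field: str
    check: Callable[[object], bool]   # returns True when the value passes

# Illustrative rules for a security reference data record.
RULES = [
    Rule("isin_present",   "isin",     lambda v: isinstance(v, str) and len(v) == 12),
    Rule("currency_iso",   "currency", lambda v: v in {"USD", "EUR", "GBP", "JPY"}),
    Rule("price_positive", "price",    lambda v: isinstance(v, (int, float)) and v > 0),
]

def run_checks(records: list) -> list:
    """Validate each record against every rule; collect exceptions."""
    exceptions = []
    for i, record in enumerate(records):
        for rule in RULES:
            value = record.get(rule.field)
            if not rule.check(value):
                exceptions.append({"record": i, "rule": rule.name,
                                   "field": rule.field, "value": value})
    return exceptions

if __name__ == "__main__":
    sample = [
        {"isin": "US0378331005", "currency": "USD", "price": 189.3},
        {"isin": "BAD",          "currency": "USD", "price": -1.0},
    ]
    for exc in run_checks(sample):
        print(exc)  # exception report feeds a workflow queue, not an auto-fix
```

Keeping the rules declarative means new checks can be added without touching the processing pipeline itself, which is what makes the exception report, rather than the automatic fix, the unit of work.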
While the industry looks forward to implementing sustainable and strategic enterprise data quality ecosystems, most financial institutions are performing due diligence by analyzing existing data sets and infrastructure before taking up larger initiatives. The objective is to develop an inventory of systems, processes, and issues in order to assess present-day data quality, identify break points, and develop a quality remediation roadmap.