February 2015
Data quality is of utmost importance. It determines the usability and reliability of data for making important decisions and for complying with regulatory requirements. However, data quality (DQ), which has always been on the agenda of IT managers, has of late taken a back seat due to budget constraints and other critical mandates.
From a financial services perspective, data quality has long been a blind spot for financial institutions. Despite investments in trading and back-office technologies, financial institutions have not achieved the expected returns because data quality has fallen short.
Given this situation, data quality management is taking center stage once again. Driven by regulatory and compliance requirements, as well as the push for enterprise risk management and operational cost control, financial institutions are gearing up to put data quality management back into practice. It is no longer a secondary activity, but an enterprise-wide agenda to achieve data quality maturity through proper governance and the deployment of quality measures.
However, this task is not as easy as it looks. Data managers are struggling to answer the following key questions related to their data environment:
The current responses to these questions are common across financial institutions: they either have some answers but lack confidence in them, or they recognize that much more needs to be done to achieve the expected quality.
Historically, organizations have been using the following principles and metrics to benchmark data quality:
These criteria are no longer adequate, as even a small percentage of bad data can have a significant negative impact on reported values or regulatory compliance. It is therefore crucial to have a clear view of data quality, i.e., the ability to drill down to the lowest level of granularity and to apply well-defined measurement criteria.
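To make this concrete, here is a minimal sketch, not taken from the source, of how such measurement criteria might be expressed in Python: completeness and validity scores are computed per field, and the failing records themselves are retained so quality issues can be traced to the lowest level of granularity. The field names, checks and sample records are illustrative assumptions.

from datetime import date

# Hypothetical trade records; field names are assumptions for illustration.
records = [
    {"isin": "US0378331005", "price": 172.5, "settlement_date": date(2015, 2, 10)},
    {"isin": None, "price": 15.0, "settlement_date": date(2015, 2, 11)},
    {"isin": "GB0002634946", "price": -3.0, "settlement_date": None},
]

def completeness(rows, field):
    # Share of records where the field is populated.
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def validity(rows, field, check):
    # Share of populated records that pass a business validity check.
    populated = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in populated if check(v)) / len(populated) if populated else 1.0

# Aggregate scores for reporting...
print("ISIN completeness:", completeness(records, "isin"))
print("Price validity:", validity(records, "price", lambda p: p > 0))

# ...and drill-down to the individual records that fail, rather than a single headline number.
failures = [r for r in records
            if r.get("isin") is None
            or (r.get("price") or 0) <= 0
            or r.get("settlement_date") is None]
for r in failures:
    print("Quality exception:", r)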
A majority of financial institutions spend a substantial portion of their operational budgets on reactive data quality management. Lately, managers are recognizing automation and the realignment of operational processes as key elements of proactive data quality management.
The intent of proactive data quality management is to reduce human dependency, prevent errors, measure data quality, and drive continuous improvement. Organizations are re-architecting their infrastructure by introducing focused data quality technology themes such as workflow management, rules management, exception reporting, data entry controls and reconciliation tools across the data processing cycle. This framework acts more as a watchdog that pinpoints and reports quality issues than as a remediation tool. As a result, financial institutions have been able to cut sizable operational costs and raise quality with the implementation of each technology theme.
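The following is a minimal sketch, under assumed rule names and record layouts, of the watchdog pattern described above: a small rules register is evaluated against each record during processing and produces an exception report that pinpoints issues without attempting remediation. None of the rules or fields come from the source.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    passes: Callable[[Dict], bool]  # returns True when the record satisfies the rule

# Hypothetical rules register; in practice this would be maintained by the rules management function.
RULES: List[Rule] = [
    Rule("isin_present", lambda r: bool(r.get("isin"))),
    Rule("price_positive", lambda r: (r.get("price") or 0) > 0),
    Rule("settlement_date_present", lambda r: r.get("settlement_date") is not None),
]

def run_checks(records: List[Dict]) -> List[Dict]:
    # Evaluate every rule against every record and collect the exceptions.
    report = []
    for index, record in enumerate(records):
        for rule in RULES:
            if not rule.passes(record):
                report.append({"record": index, "rule": rule.name})
    return report

# The report only flags issues; remediation remains a separate workflow step.
exceptions = run_checks([{"isin": "", "price": 10.0, "settlement_date": None}])
for e in exceptions:
    print("Exception:", e)

Keeping the checks declarative in one register is one way exception reporting, data entry control and reconciliation could share a single set of rules, though the source does not prescribe a specific design.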