Where there is data, there is regulation, and nowhere more so than in the financial industry. Banks, insurance companies, and other financial institutions must navigate a complex web of local and global regulations, most of which require them to submit regular reports about their business to the relevant authorities. Banks need to prepare this data according to specific requirements set by each authority. The biggest challenges so far are data quality and the manual preparation of data for reporting.
In this detailed post, you’ll get an overview of how banks currently prepare data and why that approach is no longer effective. You’ll also learn about:
- Expectations in regulatory reporting
- Data quality challenges that threaten compliance
- The cost of failing to meet regulatory compliance standards
- Why Excel is no longer the right tool for compliance data preparation
- Approaches to Remediation – Self-service data preparation tools
Let’s get started.
Expectations in Regulatory Reporting
The increasing complexity of financial crimes, coupled with multiple financial crises, has led to stricter guidelines for institutions. The banking world has experienced tougher regulations that demand comprehensive capital analysis reviews, comprehensive liquidity reviews, and supervisory review and evaluation processes, among others. All these regulations, whether BCBS 239, CCAR, Basel III, or MiFID II, are inherently data-centric.
These regulations were introduced for different purposes: Dodd-Frank aims to ensure transparency in record keeping, while CCAR and BCBS 239 focus on data quality, data lineage, and overall data management, with a special emphasis on proving and improving data governance.
These regulations have imposed a new mode of operation, adding new complexities to regulatory compliance. And there is more to come: as digital transactions and online commerce take precedence over traditional banking, compliance requirements are expected to grow to counter money laundering and financial crimes carried out over the internet.
Put simply, banks are required to keep up with technology and the fast-evolving world of digital finance. The problem? Financial institutions are not prepared: not for cultural transformation, not for technology transformation, not even for data transformation. The pace is slow and resources are limited, but the pressure to comply is mounting. Failure to demonstrate a compliant solution can result in massive regulatory and reputational risks, including hefty fines, imprisonment of executives, and lasting damage to a bank’s reputation.
Cost of Failing to Meet Regulatory Compliance Standards
Compliance with regulatory standards consumes approximately 20% of a financial services provider’s “run-the-bank” cost base and about 40% of the “change-the-bank” costs of projects currently in progress. Dodd-Frank and BCBS 239 support consumer protection, but they call for investment at scale, placing a burden on the sector’s profits. Failure to comply has resulted in more than $200 billion in fines over the past five years and has heightened concerns about personal accountability among banking executives.
Local banks are not exempt either. The American Bankers Association recently surveyed small American banks and found that approximately 50 percent had dropped product offerings or reduced staff because of the strain of regulatory compliance, and that customer support had suffered as resources were redirected toward compliance efforts.
Regulatory reporting demands the swift merging of varied data from across the financial organization, which can be an extremely costly and resource-intensive task. Even though financial organizations spend approximately $4.5 billion annually on compliance alone, they are still paying billions of dollars in fines. Managing diverse data for regulatory reporting is a multi-tiered challenge for the modern financial sector.
Key Data Preparation Challenges in Regulatory Compliance
Experts generally agree that teams in financial institutions should spend 80% of their time on analytics review (including reviewing data for sanctions compliance) and 20% on data preparation (which encompasses operations such as data cleansing, data standardization, data profiling, and data matching). In reality, though, the sheer volume and complexity of data, coupled with limited human and technological resources, means teams spend more time on data treatment and preparation than on analytics. This is mainly because regulatory reporting is still considered a subsidiary back-office function, handled manually by IT resources working in silos. But there are other hurdles too, hurdles that prevent organizations from establishing a foolproof regulatory system.
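To make the preparation side of that 80/20 split concrete, two of the operations mentioned above, profiling and standardization, can be sketched in a few lines of code. This is an illustrative Python sketch only; the records, field names, and alias mapping are hypothetical, not drawn from any real bank’s data:

```python
import re

# Hypothetical customer records pulled from two source systems;
# field names and values are illustrative only.
records = [
    {"name": "  Jane  Doe ", "account": "AC-1001", "country": "USA"},
    {"name": "jane doe",     "account": "AC-1001", "country": "United States"},
    {"name": "John Smith",   "account": "AC-2002", "country": ""},
]

def profile(rows):
    """Count missing values per field: a first-pass data-quality check."""
    missing = {}
    for row in rows:
        for field, value in row.items():
            if not value.strip():
                missing[field] = missing.get(field, 0) + 1
    return missing

def standardize(row):
    """Collapse whitespace, normalize case, and map country aliases."""
    aliases = {"usa": "United States"}  # illustrative mapping
    name = re.sub(r"\s+", " ", row["name"]).strip().title()
    country = row["country"].strip()
    country = aliases.get(country.lower(), country)
    return {**row, "name": name, "country": country}

cleaned = [standardize(r) for r in records]
print(profile(records))    # {'country': 1}
print(cleaned[0]["name"])  # Jane Doe
```

At production scale, each of these steps (and many more, such as matching and deduplication) has to run across millions of rows from dozens of systems, which is exactly where manual methods break down.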
Over the years, we’ve worked with several of the largest banks and financial institutions in the US and across the globe to help them tackle data quality challenges. Almost every client we’ve worked with cited one or more of the following hurdles:
- Disparate Data Sources: A wide network of vendors and partners and a spread of multiple branches mean banks are dealing with disparate data sources and struggling to consolidate data from them. For every report or analytics review, banks need to collect data from these multiple sources, which can take months.
- Reliance on Out-dated Systems: Too often, traditional financial institutions (FIs) are still getting by with the same systems they’ve had in place for the past 20 or 30 years, so it’s hardly surprising that these solutions aren’t well-equipped for today’s digitally-focused, omni-channel environment.
- Data Exists in Silos: Many banks still rely on legacy systems with a highly segmented data management structure, where each part of the business maintains its own silos. Some banks, for instance, still lack a centralized data management system, which means that at the time of an analytics review, the organization has a difficult time pulling and consolidating data from multiple systems. Working with siloed data is one of the most time-consuming activities, as firms struggle to extract data from a host of apps, platforms, and systems.
- Poor Data Quality: For most financial institutions, data quality remains an ongoing challenge, with its integrity degraded by inconsistent taxonomies, inaccuracy, incompleteness, and duplication. According to a study conducted by Oracle Financial Services and the Center for Financial Professionals, inconsistent data and poor data quality resulting from siloed systems are two of the main barriers to achieving BCBS 239 compliance.
- Data Preparation Is Still a Manual Process: There is still a heavy dependence on manual methods to prepare data; Excel sheets and SQL programming are still used to aggregate complex data. This manual approach prevents financial institutions from keeping up with new demands, both in terms of customer and regulatory expectations.
Regulatory reporting demands data that is clean, accurate, complete, and consistent. One of the biggest roadblocks to meeting these demands is inadequate technology, coupled with a stubborn insistence on sticking to outdated data preparation methods that worked well in the past but can no longer handle current data needs.
Why Excel and SQL Programming are No Longer Effective Tools for Data Preparation
The Federal Reserve and other regulators are now less tolerant of manual solutions and workarounds that are no longer a match for the scope, volume, and granularity of data that needs to be submitted to regulatory authorities.
Adding fuel to the fire is the counter-intuitive reporting architecture of many firms, which still delivers individual reports by business area, preventing the accurate calculation and reporting of risks across entities or by product mix. Plagued by disparate systems, inconsistent data sets, manual data entry errors, and mounting compliance pressures, professionals spend a significant amount of time and effort on data aggregation and reconciliation via Excel or SQL code.
Common technologies like Excel, first introduced roughly 40 years ago and highly limited in the face of vast volumes and varieties of data, can no longer meet the speed and demands of regulatory reporting. Some of the main challenges of using these technologies are:
- Limited Data Preparation Features: Excel is not intuitive and requires the user to create formulas and rules for every transformation. For instance, it takes multiple formulas and repetitive actions to remove white spaces or accidental punctuation marks in text fields. Moreover, unlike ML-based solutions that evolve over time to cover new problems, Excel is still more or less the same as it was 40 years ago. It lacks data preparation features such as integration with other data sources, profiling, and click-based data cleansing. Lastly, it cannot be used to dedupe data, which is one of the leading challenges professionals face when consolidating data from multiple sources.
- Data Lineage Limitations: One of the key requirements of regulatory reporting is visibility. Stakeholders want to know exactly how data has been transformed before being submitted to a regulator. Excel does not automatically keep records of transformations. Users often have to go back through their work and manually demonstrate the steps they took to reach the desired level of accuracy. Financial institutions must use a data preparation software that automatically records all transformations and preserves the structure of this data.
- Requiring Expert Users: Both SQL and Excel require expert or advanced users, making regulatory reporting an IT task instead of a business task. Not every financial or regulatory compliance analyst is proficient in SQL, or in data management for that matter. Data analysts and programmers, on the other hand, are not the owners of compliance data and therefore do not understand its nature as well as the people who own it do. This disconnect between IT and financial analysts is one of the leading causes of siloed data preparation, and it hampers a progressive approach to data management.
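One way to close the lineage gap described above is to script transformations so that every change is recorded automatically rather than reconstructed after the fact. The following is a minimal, illustrative Python sketch of that idea, not any real product’s API; the step names, field name, and sample value are assumptions:

```python
transform_log = []  # audit trail of (step, field, before, after) tuples

def logged(step):
    """Decorator that records each field-level transformation for lineage."""
    def wrap(fn):
        def inner(field, value):
            result = fn(field, value)
            if result != value:
                transform_log.append((step, field, value, result))
            return result
        return inner
    return wrap

@logged("trim_whitespace")
def trim(field, value):
    # Collapse runs of whitespace and strip the ends.
    return " ".join(value.split())

@logged("strip_punctuation")
def strip_punct(field, value):
    # Remove stray separators that crept in during data entry.
    return value.replace(",", "").replace(";", "")

value = trim("customer_name", "  Acme,  Corp ")
value = strip_punct("customer_name", value)
print(value)          # Acme Corp
print(transform_log)  # both steps recorded, reviewable by an auditor
```

Because each step is logged with its before and after values, a reviewer can see exactly how a submitted figure was derived, which is what Excel cannot demonstrate without manual reconstruction.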
Organizations must acknowledge that common technologies like spreadsheets and SQL are only effective when preparing data at a small scale; for regulatory reporting, which demands accuracy at volume, they are hardly ideal. Today, financial institutions need automated, ML-based solutions that are powerful enough to allow for agile data preparation while letting departments easily consolidate, merge, dedupe, and clean data for regulatory compliance. The platform must be intuitive, allow for easy integration, and offer an easy-to-use interface that does not depend on the expertise and availability of programmers or IT experts.
Approaches to Remediation – ML-Based Self-Service Data Preparation Tools
While most experts talk about culture change, data transformation journeys, and a complete overhaul of infrastructure, we believe the right approach to remediation lies in first acknowledging core problems with data quality and understanding challenges with regard to processes.
For instance, firms can start by improving the quality of their data before moving on to bigger transformation initiatives like migrations or new infrastructure implementations. As the saying goes, the devil is in the details, and in this case, it’s not the infrastructure or technology that’s hampering progress; it’s quite literally the details in a bank’s data sources.
The first step to remediation, therefore, is preparing data for compliance. This can be done by using a top-of-the-line self-service data preparation tool such as DataMatch Enterprise, which allows for:
- Integration of data sources into a single, one-stop platform
- In-depth profiling of data to discover errors and anomalies
- Cleansing of data according to pre-defined and customized rules, patterns or logic
- Merging and deduplicating data with 100% match accuracy
- Consolidating data into one single source of truth which can be presented for regulatory reporting
- Data lineage that shows every transformation as it happens
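To give a feel for what deduplication involves under the hood (a generic illustration, not a description of how DataMatch Enterprise works internally), fuzzy matching can be sketched with Python’s standard library; the sample names and the 0.8 threshold are arbitrary choices for the example:

```python
from difflib import SequenceMatcher

# Hypothetical customer names from two source systems; formatting varies.
names = ["Jane Doe", "Doe, Jane", "John Smith", "Jon Smith"]

def similarity(a, b):
    """Order-insensitive similarity: compare sorted, lowercased name tokens."""
    norm = lambda s: " ".join(sorted(s.replace(",", "").lower().split()))
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# Flag candidate duplicate pairs above a threshold for review.
pairs = [
    (a, b, round(similarity(a, b), 2))
    for i, a in enumerate(names)
    for b in names[i + 1:]
    if similarity(a, b) > 0.8
]
print(pairs)  # "Jane Doe"/"Doe, Jane" and "John Smith"/"Jon Smith" are flagged
```

Production-grade tools layer many more techniques on top of this idea (phonetic matching, survivorship rules, blocking for scale), but the core task, scoring record similarity and resolving near-duplicates, is the same.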
DataMatch Enterprise is a technological answer to the challenges of data preparation and data quality assurance for regulatory compliance. This top-of-the-line solution provides both business and IT users with a unified, consistent platform for managing the data preparation process without the need for programming knowledge or manual intervention.
The Bottom Line
Regulatory reporting demands data accuracy and integrity, neither of which can be achieved through manual data processing. Financial firms need data preparation tools that evolve with time and give them the flexibility to prepare massive volumes and varieties of data as effortlessly as possible. The goal is to minimize repetitive tasks and free up time for the core business.
Teams must not be burdened by new regulations; they must be prepared to tackle them head-on. And while a complete overhaul of infrastructure and culture cannot be achieved overnight, businesses can invest in self-service data preparation tools to kickstart their journey toward automation.
Need to meet anti-money laundering regulations? Download this whitepaper to see how Data Ladder helps banks and financial institutions perform core operations such as data preparation, data profiling, and data matching to meet AML regulatory requirements.