Regulations in the financial sector are especially numerous and complicated. They frequently overlap and sometimes contradict one another. Compliance schedules are complex and deadlines change frequently, yet institutions are expected to be ready regardless. Ten years after the financial crisis, institutions are still struggling to keep up with the complexity of these regulations.
The pace of new regulation has slowed in the last few years, giving institutions the breathing room needed to think more strategically about their compliance programs. However, regulators are already looking at new areas of interest today.
Some of the new regulations specifically target the management and integrity of data, such as BCBS 239 from the Basel Committee on Banking Supervision. This global regulation, which became effective in May 2018, is driving the need for greater investment in data infrastructure.
Other areas of interest for regulators include the global impact of FinTech and managing the influx of new market entrants, including the possibility of non-bank institutions entering financial services. One example is Amazon, which recently inquired about a banking license. These developments are sure to generate new regulations that all institutions will need to prepare for.
Tug of War: Stress Between Banks and Regulators
Both regulators and financial institutions have been severely tested in the post-recession era. This has resulted in stress and trust issues across the financial system.
On the institution side, FT Research reported that banks have paid between $150 billion and $200 billion in fines over the last 10 years. These fines have had major consequences for the banks, both on the balance sheet and in public trust. Suspicion of financial institutions has led directly to activist movements like Occupy Wall Street.
The credibility of regulators in government has also been seriously undermined. Both sides are highly sensitive about how to do better going forward.
The Next Era of Regulations
Regulators have learned a lot in the last 8 to 10 years. They are assessing the effectiveness of what has been done since the crisis and are responding accordingly. In general, regulators are moving away from releasing large, complex regulations all at once and are instead phasing them in over several years.
New threats, like the proliferation of cyber-attacks and the release of private customer data into the public domain, will usher in the next era of regulation. These activities call for highly sophisticated controls, and we can expect to see regulations tightening around these vulnerabilities.
From a firm’s perspective, the consequences of these new threats go beyond the monetary. A brand’s reputation and image, which have a significant impact on market capitalization, can be severely damaged in the wake of such an attack.
The Only Constant is Change… and Cost
The only constant in this space has been change. Dozens of new rules and future regulations make planning difficult. Requirements and implementation timelines change frequently, making compliance a costly activity for the banks in terms of both time and money.
For example, Basel III (a voluntary regulatory framework covering market liquidity risk and stress testing) was originally planned to launch in May 2018 but was moved back to 2019 at the last minute. Banks were forced to scramble and expend resources to meet a deadline that was then pushed out a year.
Another example is CCAR. In order to meet the deadline for the Comprehensive Capital Analysis and Review (the annual US stress test), the largest banks in the US will likely spend several billion dollars. That is on a single piece of compliance regulation.
The Need for Regulatory Compliance Architecture
Over the last 6-8 years, financial institutions have responded to new regulations in a variety of ways. Some institutions have done well from a technical and business perspective, while others have done very poorly. In general, architectural solutions to compliance have evolved in two phases.
Version 1 “Architecture”: ~2010 to ~2013
Between 2010 and 2013, banks were using anything that they could to meet the deadlines. Version 1 “architecture” was tactical invention on the fly, leveraging any and all existing infrastructure, technology processes, and legacy computing systems. It was an ad hoc process, with a lot of manual activities and human dependencies. This resulted in errors and significant key person risk.
Central to the challenge was data sourcing. In order to meet particular requirements, institutions were forced to create a “golden source” of data by finding and stitching together fragmented information. It was extremely challenging for banks to understand the lineage of that data and its original provenance, not to mention demonstrating that to the regulators.
Compliance teams relied on offline tools like Microsoft Excel, which introduced errors and risk during data manipulation. Reconciliation efforts were often a costly, but necessary, part of producing an accurate regulatory report.
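As a sketch of the lineage problem described above, one approach is to tag each figure with the systems it came from as fragments are stitched into a golden source, so provenance can be shown to regulators later. The record shape and system names here are hypothetical illustrations, not any institution's actual model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SourcedValue:
    """A reported figure together with the upstream systems it was derived from."""
    value: float
    lineage: List[str] = field(default_factory=list)  # e.g. ["ledger_eu"]

def stitch(fragments: List[SourcedValue]) -> SourcedValue:
    """Combine fragmented figures into one 'golden' total, preserving provenance."""
    total = sum(f.value for f in fragments)
    combined = [src for f in fragments for src in f.lineage]
    return SourcedValue(total, combined)

# Hypothetical fragments from two upstream booking systems
eu = SourcedValue(125.0, ["ledger_eu"])
us = SourcedValue(200.0, ["ledger_us"])
golden = stitch([eu, us])
print(golden.value)    # 325.0
print(golden.lineage)  # ['ledger_eu', 'ledger_us']
```

The point of the sketch is that lineage travels with the number: in the V1 era, the total would have been copied into a spreadsheet and its provenance lost.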
Institutions also had a very poor business understanding of data provenance in this V1 era. It took a long time for businesses to take responsibility for the ownership of data.
Version 2 Architecture: ~2013 to ~2017
Version 2 architecture, which emerged around 2013 and continues today, is characterized by institutions investing money to reduce the tactical complexity left over from Version 1.
Institutions started to make significant investments in technology, people, and operational processes. Their goal was to reduce delivery times, improve reports, cut down on error rates and significant reconciliations, and ultimately, improve the predictability of results.
The other major initiative of V2 architecture was the push for data governance and full lifecycle data management. This initiative was driven by the need for more qualitative analysis. Regulators not only cared that the numbers were correct (quantitative); they also wanted to see how institutions arrived at those numbers (qualitative). Many institutions in V1 succeeded quantitatively but failed qualitatively.
Despite the improvements, most V2 architectures still depended on legacy infrastructure. Instead of investing in new infrastructure, institutions chose to build new capabilities around their legacy systems to make up for deficiencies. This created even more complexity and new challenges.
Initial Regulatory Solutions Created Stress Points
While regulators have considered the early results of regulation moderately successful, the implementation of V1 and V2 architectures has created significant stress inside the institutions.
The first major challenge has been the reassignment or creation of teams to focus on regulatory compliance. Part of this challenge includes budget allocation and high costs (remember the CCAR example in the previous section).
Compliance has also put tremendous stress on internal technology infrastructure. Companies have had to provision and re-configure infrastructure to create environments where compliance reports and obligations could be developed, tested, and ultimately put into production.
On the business side of the organization, almost every group has been impacted. Lines of business have had to adapt to changing regulations, while entirely new departments have been created around risk and regulatory affairs. In general, the operational side of the business has taken on the lion’s share of the ownership for maintaining compliance.
A series of trends developed during the first 10 years of regulation and continues today. Hortonworks defines these trends as:
- Regulator pressure, enforcement scrutiny
- Crisis in data management
- Quest for efficiency and effectiveness
Regulatory pressure will continue for the foreseeable future, whether that means complying with existing regulations or adjusting to shifting timelines and targets.
This pressure has effectively created a crisis in data management across the whole industry. Institutions must manage 1) the rapidly expanding universe of data necessary to meet these rules; 2) the increasing volumes from normal market and customer activity; and 3) the increasing velocity of that data’s life cycle.
The solutions and approaches employed to date are not sustainable for long-term cost effectiveness and efficiency. This is due to the siloed approach of V1 and V2 architecture that results in duplication and rework.
Institutions are on a quest to make these processes as fully automated as possible, ultimately moving them into the core of the business. This is how companies can leverage their data to gain strategic advantage.
New Investment in Data Management
In 2017, a number of industry observers commented that banks still have a long way to go in improving their overall data management. This speaks to the magnitude and complexity of the challenge at hand. Financial institutions are still learning to manage the ever-changing rules and scope of compliance regulations.
Notably, in a report on key trends and regulatory challenges for 2018, KPMG predicted increased investment this year in “platforms, tools, systems, and algorithms to capture, aggregate, govern, and analyze data from customers, financial activity, employees, and third parties.”
There will continue to be a significant push to invest in strategic core data management and data analytics infrastructure to meet the regulations. That push brings with it other opportunities to gain strategic advantage.
Part 3: Implementing Compliance Architecture
Evolving the Architecture
As we think about evolving the architecture into a future state, a number of drivers are bearing on financial institutions, including pressure from internal and external activity and, frankly, the requirement simply to “do better”.
External pressures include the digitalization trend within financial services, driven by customer demand and competitive threats. Not coincidentally, digitalization always starts with the data.
In addition, the rise of FinTech has added pressure on traditional financial institutions. In 2013, $2.8 billion of VC money was invested in FinTech startups. In 2015, that number rose to $13.8 billion. Money is being poured into young firms looking to take revenue from traditional financial services firms, create new revenue streams, and poach customers from incumbents. This is a very significant trend to follow.
Internal pressures include stubbornly high cost/income ratios. Cost effectiveness and cost efficiencies continue to be one of the main challenges for financial institutions. In fact, it is one of the key decision-making factors for investment in technology operations across the industry.
The complexity introduced by Version 1 and 2 architectures has created cost, latency, and friction that need to be addressed. The siloed approach of V2 architecture is outdated and runs counter to the direction in which banks and institutions need to move.
Keys to Better Regulation Compliance Architecture
Future regulation compliance architecture will have several key attributes in common:
- Simplified. This is one of the most important aspects of future architecture, and includes eliminating the duplication created in the earlier phases.
- Sustainable. Architecture needs to be flexible to meet future needs in both capacity and performance.
- Fast. Institutions need the ability to deliver results in a much faster timeframe.
- Automated. Automation eliminates manual human intervention that is slow, high-risk, costly, and error-prone. It also removes key person risk.
- Qualitative. Institutions need a highly qualitative approach to data management, focusing as much on how results are produced as on the accuracy of the numbers.
- Modular. Services-based, modular architecture allows for future integration with FinTech capabilities and technologies as they evolve over time.
McKinsey has an interesting perspective on future architecture:
“The proactive institutions will leverage their investments to pursue and reach strategic and operational objectives, even if regulatory demand grows less aggressive. Some of them have already started while others have only experimented or not yet begun.”
In other words, regulatory demands will rise and fall, but they will never go away. Institutions should leverage their investment in compliance to create greater strategic value. Those that know how to do this will be the winners into the future.
Version 3.0 Architecture
The first two versions of regulatory compliance architecture got the job done, but they were not strategic. The net effect was higher cost and increased duplication, complexity, and risk. The next version of architecture, Version 3.0, addresses these major issues.
The principle underlying Version 3 architecture is to integrate regulatory capabilities into the core of the business so that they become part of daily operations. This starts and ends with data.
PRINCIPLE: Integrate regulatory capabilities into the core of the business (Pull out quote)
Integrating compliance into the business core requires operational convergence: harmonizing regulatory data with broader operational data by eliminating silos and putting all data in one place.
The goal of Version 2 architecture was accurate reporting and regulatory compliance. In Version 3 architecture, the goal is to run a lean, accurate, cost-efficient business, where regulatory reporting is simply a byproduct. Compliance is an output benefit and not an end in itself.
Benefits of Version 3.0 Architecture
There are clear business benefits to leveraging data repositories and analytics from compliance efforts.
New studies have found that the data collection and manipulation done for CCAR compliance can be used to improve significant business decisions: better budgeting, optimized balance sheet allocations, better risk thresholds, and sounder acquisition and divestiture decisions. This advantage would not exist without the infrastructure and data harmonization efforts undertaken for CCAR.
McKinsey offers this advice for the next three years of financial services regulations:
“Think of this as a pivot into becoming compliant every day, and not just compliant when you’re preparing a snapshot to report for the regulators.”
In order to achieve this everyday compliance, institutions need a pristine, fully automated common data environment. This includes consolidation of data, clear lineage of all data, excellent governance, clear ownership, and the ability to create business intelligence.
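To make the idea of everyday compliance concrete, a regulatory rule can be expressed as code and evaluated continuously against the common data environment, so the report becomes a byproduct of running the business. The example below uses Basel III's liquidity coverage ratio (high-quality liquid assets of at least 100% of projected 30-day net cash outflows); the figures and check names are hypothetical:

```python
def liquidity_coverage_ok(hqla: float, net_outflows_30d: float) -> bool:
    """Basel III LCR: high-quality liquid assets must be at least 100%
    of projected 30-day net cash outflows."""
    return hqla >= net_outflows_30d

# Hypothetical daily run against the consolidated data store
checks = {
    "lcr": liquidity_coverage_ok(hqla=1_200.0, net_outflows_30d=1_000.0),
}
print(all(checks.values()))  # True
```

Run daily (or hourly) over consolidated data, a dictionary of such checks is the difference between being compliant continuously and scrambling to assemble a snapshot at reporting deadlines.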
Embracing AI and FinTech in Version 3
In a common data environment, AI can be used to facilitate model-driven decision making to proactively manage risk: in effect, injecting artificial intelligence into the data fabric and using it as a decision-making framework. This would allow institutions to proactively investigate incoming or even anticipated regulations, a fundamental shift from the first two versions of compliance architecture.
From a risk perspective, AI introduces the ability to do real-time risk management. Many of the regulations around risk, credit, and liquidity that have been put on hold will become possible.
The data environment in V3 architecture can also serve as an on-ramp for integration with FinTech innovation. One of the biggest challenges for FinTech startups when working with the large financial institutions is the lack of good data access, which inhibits their ability to stand up pilots and Proof of Value projects (POVs). Banking institutions with clean, clear data access are going to have a competitive advantage when embracing new, innovative technologies.
The introduction of AI and other new technologies is another reason why KPMG is forecasting a significant increase in spending on data platforms and tools in 2018. Version 3 architecture will guide regulatory infrastructure into the core business and create advantages that would not have existed otherwise.
A broader technology trend is the move to the public cloud. Financial services has not been a leader in this space for a variety of reasons, particularly regulation.
However, the cloud-native paradigm, including containers and functions (or Lambdas, as Amazon calls them), is an architecture that allows for great modularity, agility, and future-proofing. As financial institutions look to move to the cloud, it makes increasing sense to design V3 architecture in a cloud-native format.
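As a minimal sketch of what a function-based compliance component might look like, the handler below follows the AWS Lambda Python handler signature (`handler(event, context)`); the event fields and the exposure-limit check are hypothetical illustrations of a single, stateless, independently deployable check:

```python
def handler(event, context=None):
    """Stateless function-as-a-service style check: is an exposure
    within its regulatory limit? Event fields are hypothetical."""
    exposure = event["exposure"]
    limit = event["limit"]
    return {"within_limit": exposure <= limit}

# Local invocation with a hypothetical payload
print(handler({"exposure": 80.0, "limit": 100.0}))  # {'within_limit': True}
```

Because each check is a small, stateless function, it can be deployed, versioned, and scaled independently, which is exactly the modularity the cloud-native paradigm promises for regulatory infrastructure.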
There are already conversations happening internally at institutions about cloud-native architecture. This expertise needs to be brought to the regulatory infrastructure.
Summary of Version 3 Architecture
The clear strategic direction for financial institutions is defined by Version 3 architecture: moving regulatory infrastructure into the core of the business. This shift won’t happen overnight (there are still institutions struggling with Versions 1 and 2), but the desired future state is well understood, and it all starts and ends with the data.
V3 architecture provides a number of strategic benefits to institutions:
- Agility. Agility means business flexibility: the ability to respond to competitive threats as well as new opportunities, new regulations, and new innovation. V3 architecture is designed for the future. It is a foundation for digital leverage as institutions continue to digitize products and services to meet customer demand, responding and adapting to external pressures like FinTech.
- Clear data ownership. Clear data ownership from a business perspective is absolutely essential. Ultimately, the Lines of Business are the ones who are going to innovate and create new products and services, which generates new data.
- Cost effectiveness. There is enormous pressure on institutions to continue improving cost efficiency. By converging data infrastructures and removing duplication, institutions will realize significant cost savings over multiple years.
- Reduces operational risk. V3 architecture will help institutions stay compliant every single day, as opposed to just around deadlines.
- Creates the path to intelligence. Artificial intelligence is going to continue to increase in significance. AI only works well with a clear, comprehensive set of data.
These benefits can lead to faster reporting, more effective change management, and better data governance. They also increase trust in the data, so the veracity of the output gains a far greater degree of acceptance.
In this continuous mode, institutions can be compliant every day or every hour, building that compliance into the way they run the business rather than making it a special event.
Next week, we’ll look at meeting critical requirements and highlight a specific case study.