Regulatory Future Proofing with In-Memory Computing

In-memory analytics tools allow even the most complex of regulatory calculations to be performed in minutes or seconds rather than hours.

Since the 2008 Global Financial Crisis (GFC), the increase in regulatory and risk management requirements at banks has gradually raised the need for additional resources to achieve compliance. In recent years, however, regulatory requirements have increasingly been tied to the ability to process large datasets, look for anomalies in data and create a unified approach to risk management as part of the compliance challenge.

With larger datasets comes greater complexity in calculating formulas, whether to assess portfolio risk, model P&L and balance sheet impacts, or perform real-time analysis on a bank’s liquidity risk. The trend is clear. Regulatory requirements are increasingly coming with heavier data demands, prompting a need for faster calculations, more advanced data analytics capabilities and greater processing performance.

Indeed, many banks have adapted by investing in newer technology and tools, moving to the cloud where higher computing power is available, and even by replacing entire codebases with higher performance programming languages. While all of this helps, a new approach is needed to future-proof banks against ever-increasing regulatory demands.

FRTB: A prime example

A prime example of the increasing demand for processing power to meet regulatory requirements is the ongoing implementation of the Fundamental Review of the Trading Book (FRTB), the market risk standard finalised by the Basel Committee on Banking Supervision (BCBS) in January 2019. Jurisdictions following the BCBS timeline – including Hong Kong and Singapore – will be implementing FRTB from 1 January 2023.

FRTB is designed to address issues related to under-capitalised trading books, capital arbitrage between banking and trading books, and internal risk transfers within banks. At a practical level, implementation requires systems that can handle heavy data inputs (captured from multiple systems) and perform complex non-linear calculations on these large data sets.

The market risk capital requirement under the standardised approach (SA) is the sum of three components – the capital requirement under the sensitivities-based method (SBM), the default risk charge (DRC) and the residual risk add-on (RRAO).

Computing these components and combining them to arrive at a bank’s capital requirement for market risk under the SA requires a significant amount of data and processing power, particularly if a bank has large capital market operations, in which case it may have hundreds of thousands of positions on its trading book.
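As an arithmetic illustration of that aggregation, the sketch below sums the three SA components across desks. The desk breakdown and all numbers are invented, and the real SBM, DRC and RRAO are each complex calculations in their own right; only the final summation step is shown here.

```python
from dataclasses import dataclass

@dataclass
class DeskCharges:
    """Per-desk SA charge components (values here are purely illustrative)."""
    sbm: float   # sensitivities-based method charge
    drc: float   # default risk charge
    rrao: float  # residual risk add-on

def sa_capital(desks: list[DeskCharges]) -> float:
    """Total SA market risk capital: the sum of SBM, DRC and RRAO across desks."""
    return sum(d.sbm + d.drc + d.rrao for d in desks)

desks = [
    DeskCharges(sbm=120.0, drc=35.0, rrao=5.0),
    DeskCharges(sbm=80.0, drc=10.0, rrao=2.5),
]
print(sa_capital(desks))  # 252.5
```

In practice each component is itself an aggregation over hundreds of thousands of positions, which is where the processing burden arises.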

The internal models approach (IMA) goes even further in terms of the volume of data needed as inputs and the complexity of the calculations required. The decision to adopt the IMA is one that each bank has to make, and requires regulatory approval, but it can often result in a lower overall capital requirement for market risk – a significant consideration for any bank.

Whether a bank adopts the SA or IMA, the calculations to determine its market risk capital requirement should be performed daily. Banks adopting the IMA must also report regulatory capital calculated under the SA, meaning both sets of calculations would need to be performed. FRTB requires banks to report on all of their entities as standalones, which triggers the need to report in multiple currencies as well as all entities in aggregate – adding to the computing challenge.

Seconds rather than hours

The very high data analysis requirements associated with FRTB have prompted many banks to seek out more effective technology solutions that satisfy both the need for computational speed and the desire to keep costs in check. One approach is to use in-memory analytics tools – such as Atoti+, which is designed to perform calculations on complex transactional data and analyse it in real time, without sacrificing performance.

Developed by ActiveViam, the Atoti+ software is not itself a risk engine and does not perform trade valuation calculations. Rather, it sits as an unobtrusive layer between systems and integrates with a bank’s existing architecture.

Antoine Chambille, Chief Technology Officer, ActiveViam

The solution works by importing data from any bank system, aggregating and storing it in-memory, and performing calculations without the need to re-call data. From a technology standpoint, this approach is faster and less resource intensive, allowing for calculations to be performed in minutes or seconds rather than hours.
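The principle can be shown with a toy sketch: records are ingested once into in-memory aggregates, and every subsequent query reads from memory rather than re-calling the source systems. This is a hand-rolled illustration of the general idea only, not ActiveViam's implementation or API, and the position records are invented.

```python
from collections import defaultdict

# Hypothetical position records, as they might arrive from several bank systems.
positions = [
    {"desk": "rates", "ccy": "USD", "exposure": 1_000_000.0},
    {"desk": "rates", "ccy": "EUR", "exposure": 250_000.0},
    {"desk": "fx", "ccy": "USD", "exposure": 400_000.0},
]

class InMemoryCube:
    """Toy in-memory aggregate: data is loaded in a single pass, then
    queries are answered from memory without re-reading the sources."""

    def __init__(self, records):
        self._by_desk = defaultdict(float)
        self._by_ccy = defaultdict(float)
        for r in records:  # single ingestion pass
            self._by_desk[r["desk"]] += r["exposure"]
            self._by_ccy[r["ccy"]] += r["exposure"]

    def exposure_by_desk(self, desk: str) -> float:
        return self._by_desk[desk]

    def exposure_by_ccy(self, ccy: str) -> float:
        return self._by_ccy[ccy]

cube = InMemoryCube(positions)
print(cube.exposure_by_desk("rates"))  # 1250000.0
print(cube.exposure_by_ccy("USD"))     # 1400000.0
```

The speed-up in a real system comes from keeping the full, granular dataset resident in memory, so that drill-downs and re-aggregations avoid disk and network round trips.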

“We’ve been working on in-memory technology for a decade and a half and we’ve achieved a high level of optimisation not just to make calculations very fast, but also to minimise the hardware footprint,” said Antoine Chambille, Chief Technology Officer for ActiveViam. “When you look at the total cost of ownership, in-memory is actually very competitive today in a large number of use cases.”

Add-ons to Atoti+, called Accelerators, can be configured to solve for a particular regulatory requirement, with FRTB being just one example. The FRTB Accelerator contains all of the business logic, source code and BCBS-prescribed formulas needed to compute capital requirements for market risk, while also allowing for variations in parameters, correlations and calculations as may be required in different jurisdictions.

It provides a series of 25 different calculation chains to compute the SA components, as well as all the formulas and flexibilities required for the IMA. Atoti+ also provides capabilities to perform ‘What-If’ analyses, which can be used to simulate in real-time the impact of a trade or set of trades on the final capital requirement, the effect of transitioning from the SA to the IMA, or the benefits of moving trading books between desks.
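The shape of such a What-If query can be sketched as follows: compute a capital figure on the base portfolio held in memory, then again with a candidate trade added, and report the difference without mutating the base. The capital function below is a deliberately crude stand-in (a sum of absolute sensitivities) used only to make the pattern concrete; it is not the SBM or any ActiveViam formula.

```python
def total_capital(sensitivities: list[float]) -> float:
    """Stand-in for a capital calculation: a crude sum of absolute
    sensitivities. The real SBM aggregation is far more involved."""
    return sum(abs(s) for s in sensitivities)

def what_if(base: list[float], candidate_trade: list[float]) -> float:
    """Marginal capital impact of adding a candidate trade, computed
    against the in-memory base portfolio without altering it."""
    before = total_capital(base)
    after = total_capital(base + candidate_trade)
    return after - before

base_portfolio = [10.0, -4.0, 2.5]
print(what_if(base_portfolio, [3.0]))  # 3.0
```

Because the base data never changes, many such scenarios can be evaluated side by side against the same in-memory state.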

“Clients have always relied on us to help them solve their most challenging regulatory and data analytics issues, with FRTB being just one of the areas our technology excels at,” said Colleen Cosgrove, Global Director of R&D Apps for ActiveViam. “Last year we ran ISDA’s benchmarking unit test through our software and passed, a testament to our technology’s capabilities in meeting the significant demands of FRTB.”

Colleen Cosgrove, Global Director of R&D Apps, ActiveViam

Significant utility

Dozens of banks have tapped ActiveViam’s approach for other use cases, such as liquidity risk management, which can be particularly challenging due to the large number of cash flows that need to be continuously monitored. An in-memory analytics platform such as Atoti+ would allow a bank to instantaneously measure the impact that a change in the amount or duration of a loan or portfolio of loans would have on liquidity, for instance.
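A minimal sketch of that kind of liquidity What-If, under invented assumptions (loans amortise linearly and all figures are toy numbers): build a cash-flow ladder by month, then rebuild it with a loan's term extended and compare the buckets.

```python
from collections import defaultdict

def liquidity_ladder(loans: list[tuple[float, int]]) -> dict[int, float]:
    """Expected cash inflows bucketed by month. Each loan is a
    hypothetical (principal, term_months) pair amortised linearly,
    a simplification for illustration only."""
    ladder = defaultdict(float)
    for principal, term in loans:
        monthly = principal / term
        for month in range(1, term + 1):
            ladder[month] += monthly
    return dict(ladder)

loans = [(1200.0, 12), (600.0, 6)]
before = liquidity_ladder(loans)
# What-if: extend the 6-month loan to 12 months and compare a bucket.
after = liquidity_ladder([(1200.0, 12), (600.0, 12)])
print(before[3], after[3])  # 200.0 150.0
```

The comparison shows the month-3 inflow dropping when the loan's duration is stretched, which is exactly the kind of instantaneous impact measurement described above, here at toy scale.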

In the area of credit risk management, banks could add or delete individual loans, or adjust loan-loss reserves, to model the impact on the balance sheet in real time. Users have the ability to recalculate P&L across a bank’s businesses throughout the trading day and across time zones, which also opens up new opportunities in areas such as risk management and capital optimisation.

Stress testing is another key area where in-memory computing has significant utility, particularly given the increasingly complex scenarios that regulators are requiring banks to model for. Many banks are still performing stress tests using manual approaches and legacy systems, meaning that in many cases fewer than five stress tests can be performed each year.

As a result, stress tests have been underutilised in areas such as risk management and business decision-making, since results often take months to become available. To run more frequent stress tests, some banks have moved to cloud architecture for the extra computing power, while others are recreating entire codebases in more accessible programming languages such as Python.

ActiveViam’s solution scales across the enterprise and can generate stress test results in near-real time. This allows, for instance, visualisation of the impacts to a bank’s balance sheet with access to granular details within seconds, so banks can pinpoint how any change would affect liquidity, capital or profitability over any time period.

Climate risk on the horizon

In the EU and UK, regulators are requiring banks to stress test the impact of climate change on their balance sheets 30 years into the future. This will require them to project temperatures and weather events in areas where bank and client assets are held and estimate any changes in value based on these climate risk scenarios, among a myriad of other variables.

Traditional stress testing approaches are no longer appropriate for such an exercise. Recognising this, ActiveViam has integrated dedicated tools to help firms model the losses that climate change could inflict on banks’ loan portfolios, using methods from popular credit risk models.

Atoti+ makes it fairly easy to model those risks by leveraging the power of in-memory computing and high-volume data analytics. Its Python API puts it in the hands of quants, data scientists and business users, offering a seamless, interactive experience.

To see how to integrate climate in a credit risk model, read: A New Paradigm: Identifying and Managing Climate Risk in Loan Portfolios.

This article was jointly developed by Regulation Asia and ActiveViam. ActiveViam won the Big Data & Analytics award in the RegTech category of the Regulation Asia Awards for Excellence 2020.

For more detailed information on ActiveViam’s FRTB Accelerator, read this white paper.
