FED announces 'Basel III endgame'

The long-awaited 'Basel III endgame' proposal was published on 27 July 2023. With it, the Federal Reserve has initiated the beginning of the end of the Basel III capital reform process, which traces its origins back to the GFC of 2008.

In summary, the proposal sets out a new standardised approach for operational risk capital in the US, closely aligned with the 2017 BCBS paper, Basel III: Finalising post-crisis reforms. As expected, losses are included in the calculation, which confirms the US as somewhat of an outlier among major jurisdictions.

Background

Much of the content is not surprising, as a public steer had been given in a speech several weeks earlier. However, the proposal provides a great deal of detail, which offers an insight into the FED's thinking.

To recap, from an operational risk (OR) perspective, the Basel III endgame is all about how the FED plans to implement a replacement for the current standardised and advanced measurement approaches (AMA) to capital. Developed by the BCBS, the new approach was previously referred to as the Standardised Measurement Approach (SMA), but this acronym is now absent from most regulatory papers.

Under the new method, capital is calculated using:

  • A Business Indicator (BI) metric – a financial statement calculation designed to capture the volume of activities that carry operational risks, a proxy for an institution's risk profile
  • An Internal Loss Multiplier (ILM) – a measure of the aggregate historical operational risk losses in relation to the size of an institution

One way to think about these is as two independent assessments of an institution's risk profile. The BI is perhaps more focused on inherent risk, capturing volumes of risk-generating activity, while the ILM is more about observed residual impacts (albeit historical). Capital is set by scaling the BI component according to the ILM.
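For reference, the headline formula from the 2017 BCBS standard, which the US proposal tracks closely (the US text converts the thresholds to dollar terms), can be written as:

$$\text{ORC} = \text{BIC} \times \text{ILM}, \qquad \text{ILM} = \ln\!\left(e - 1 + \left(\frac{\text{LC}}{\text{BIC}}\right)^{0.8}\right)$$

where BIC is the Business Indicator Component, obtained by applying marginal coefficients (12%, 15% and 18% under the BCBS calibration) to slices of the BI, and LC is the Loss Component: 15 times average annual operational risk losses over the preceding ten years.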

This is a proposal… 

It is worth noting that this is a proposal. The document runs to more than 1,000 pages (although the scope spans all risk disciplines) and contains many questions for consultation. History suggests some changes in the detail will be made, but do not expect radical rethinks. The timeline set out is a phased transition from 2025 to 2028, although it is unclear whether the phasing applies to the methodology, to the inevitable changes in capital levels, or to both.

Significant items for operational risk

Removal of internal models  

This is no surprise at all, as it was the central theme of the reforms, but perhaps a little more justification is given than before.

The new approach is claimed to be more risk-sensitive than the current US standardised approach – which is most likely true, but that comparison is against a simplistic baseline.

Many of the previous criticisms of the AMA are restated, for example:

  • Models are subjective and produce unwarranted variability – true to a degree, but this is really a consequence of the problem statement (loss at the 99.9th percentile) and the wide range of practice specifically encouraged
  • The outcome is varying capital for similar exposure – I would argue the same is true of a standardised approach
  • Empirical verification requires many years of data – this is inherent in the problem, and as true for the new approach as it was for the AMA

And some of the perceived positives are set out:

  • Benefits market participants – standardisation increases the consistency (but not accuracy) of measuring risk profiles, and combined with greater public disclosure provides market participants with a better view
  • Efficiency – an argument which wasn't really given much airtime in the past: simplification will lead to lower operating costs for both banks and supervisors

Models and data remain important

Despite the negativity around capital models, the paper does specifically advocate the benefits of measurement in wider operational risk management:

“Although the proposal would remove use of internal models for calculating capital requirements for credit and operational risk, internal models can provide valuable information to a banking organization’s internal stress testing, capital planning, and risk management functions… Large banking organizations should employ internal modeling capabilities as appropriate for the complexity of their activities”

Although many industry participants now operate more contemporary taxonomies, the original Basel Event Types are retained.

Loss data appears, but can only increase capital 

As indicated in the speech, it is proposed that loss data be included. This makes the US rules somewhat of an outlier in comparison to other jurisdictions, with most other regulators opting for ILM = 1. This does not necessarily mean the US will be an outlier in capital terms, as there are big jurisdictional differences in the way Pillar 2 is operated.

One nuance is that the loss multiplier is floored at 1 (ILM ≥ 1), meaning the inclusion of losses will never decrease capital levels, deviating from the original intention. For context, the formula was calibrated on global data – with the aim that an institution with “typical” losses, given its exposure as defined by the Business Indicator, would get a multiplier of 1. This necessarily meant there would have been broadly equal numbers of institutions with a multiplier below 1 and above 1. In this proposal, every bank has an ILM ≥ 1.
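Expressed against the BCBS formula above, the proposed floor means:

$$\text{ILM}_{\text{US}} = \max\!\left(1,\; \ln\!\left(e - 1 + \left(\frac{\text{LC}}{\text{BIC}}\right)^{0.8}\right)\right)$$

so a bank whose loss history is lighter than its BI would imply receives no capital reduction, while a heavier history still scales capital up.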

How losses are used is as expected: a 10-year data set, losses above $20,000 (similar to the €20,000 threshold set by the BCBS), grouping of losses under a “common trigger”, and netting of losses per quarter – although measuring loss net of recoveries and insurance is perhaps a surprise.
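A minimal sketch of how that loss-component calculation might look, assuming the BCBS 15x scalar from the formula above and an illustrative data structure (neither is lifted from the proposal's text):

```python
from collections import defaultdict
from datetime import date

# Each event: (quarter recognised, common-trigger group id, net loss after
# recoveries and insurance, in USD). Illustrative structure only.
events = [
    (date(2020, 3, 31), "trigger-A", 45_000.0),
    (date(2020, 3, 31), "trigger-A", 10_000.0),   # grouped with the event above
    (date(2021, 6, 30), "trigger-B", 250_000.0),
]

THRESHOLD = 20_000.0   # proposed $20k de minimis (assumed applied after grouping)
YEARS = 10             # 10-year observation window

def loss_component(events, years=YEARS):
    """Average annual eligible losses x 15, per the BCBS calibration."""
    # 1. Aggregate events sharing a common trigger, netting per quarter.
    grouped = defaultdict(float)
    for quarter, trigger, net_loss in events:
        grouped[(quarter, trigger)] += net_loss
    # 2. Keep only grouped losses at or above the threshold.
    eligible = [v for v in grouped.values() if v >= THRESHOLD]
    # 3. LC = 15 x average annual eligible losses over the window.
    return 15 * sum(eligible) / years

print(f"LC = {loss_component(events):,.0f}")
```

With the toy data above, the two trigger-A events net to a single $55,000 loss, both grouped losses clear the threshold, and LC comes out at 457,500.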

Some more detail is provided on the thorny topic of how to exclude divested or ceased activities from the BI and ILM, with exclusions based on materiality and on confidence that the exposure has been removed.

A nod to data-driven operational risk? 

One explicit requirement included is that:

“The proposal would introduce a requirement that banking organizations collect descriptive information about the drivers or causes of operational loss events that result in a gross operational loss of $20,000 or more. This requirement would facilitate the efforts of banking organizations and the agencies to understand the sources of operational risk and the drivers of operational loss events…The agencies would expect that the level of detail of any descriptive information be commensurate with the size of the gross loss amount of the operational loss event.”

It has been a long-held ambition of many to leverage qualitative descriptions of events so that operational risks can be better understood, and mitigated. As we enter an era of machine learning, with easy access to powerful new tools (e.g. large language models) that can process and summarise qualitative information, this feels like a positive step towards a more data-driven way of managing operational risk.
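As a toy illustration of the direction of travel (nothing like this appears in the proposal), even simple text-mining can start to group events by common drivers; the descriptions below are invented:

```python
# Toy sketch: clustering loss event descriptions to surface common drivers.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

descriptions = [
    "Payment sent twice due to manual keying error in settlement team",
    "Duplicate wire transfer caused by manual input mistake",
    "Phishing email led to compromised credentials and fraudulent transfer",
    "Account takeover after successful phishing campaign against staff",
]

# Vectorise the free-text descriptions and group similar events together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, text in sorted(zip(labels, descriptions)):
    print(label, "-", text)
```

An LLM could take this much further, summarising each cluster into a narrative about causes and controls, but the underlying idea is the same: the descriptive data the proposal mandates becomes an analysable asset.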

What is next?

We now enter a 120-day consultation period. It will be interesting to see the industry response to the proposal, particularly around the inclusion of loss data – and especially the fact that it can only increase capital above the BI component.

ORX will continue to support its US members through our wide range of activities – including our capital methodology studies (banking and insurance), benchmarking and CCAR initiatives.