
Striking a Proper Balance: ESG for Structured Finance

The securitization market continues to wrestle with myriad approaches and a lack of standards for identifying and reporting ESG factors in transactions and asset classes. But much-needed guidance is on the way as industry leaders work toward a consensus on the best way to report ESG for structured finance.

RiskSpan gathered with other key industry players tackling these challenges at this month’s third annual Structured Finance Association ESG symposium in New York City. The event highlighted a number of significant strides toward shaping an industry-standard ESG framework and guidelines.

Robust and engaging discussions across a variety of topics illustrated the critical need for a thoughtful approach to framework development. We observed a broad consensus around the notion that market acceptance would require any solution to be data supported and fully transparent. 

Much of the discussion revolved around three recurring themes:

  • Finding a workable balance between the institutional desire for portfolio-specific measures based on raw data and the market need for a standardized scoring mechanism that everybody understands
  • Maintaining data privacy
  • Assessing tradeoffs between the societal benefits of ESG investing and the added risk it can pose to a portfolio

Striking the Right Balance: Institution-Specific Measures vs. Industry-Standard Asset Scoring 

When it comes to disclosure and reporting, one point on a spectrum does not fit all. Investors and asset managers vary in their ultimate reporting needs and approach to assessing ESG and impact investing. On the one hand, having raw data to apply their own analysis or specific standards can be more worthwhile to individual institutions. On the other, having well defined standards or third-party ESG scoring systems for assets provides greater certainty and understanding to the market as a whole.  

Both approaches have value.

Everyone wants access to data and control over how they view the assets in their portfolio. But the need for guidance on what ESG impacts are material and relevant to structured finance remains prominent. Scores, labels, methodologies, and standards can give investors assurance a security contributes to meeting their ESG goals. Investors want to know where their money is going and if it is meaningful.

Methodologies also have to be explainable. Though there was agreement that labeled transactions are not always necessary (or achievable), integration of ESG factors in the decision process is. Reporting systems will need to link underlying collateral to external data sources to calculate key metrics required by a framework while giving users the ability to drill down to meet specific and granular analytical needs.    

Data Privacy

Detailed analysis of underlying asset data, however, highlights a second key issue: the tradeoff between transparency and privacy, particularly for consumer-related assets. Fiduciary and regulatory responsibility to protect disclosure of non-public personally identifiable information limits investor ability to access loan-level data.

While property addresses provide the greatest insight to climate risk and other environmental factors, concerns persist over methods that allow data providers to triangulate and match data from various sources to identify addresses. This in turn makes it possible to link sensitive credit information to specific borrowers.

The responsibility to summarize and disclose metrics required by the framework falls to issuers. The largest residential issuers already appreciate this burden. These issuers have expressed a desire to solve these issues and are actively looking at what they can do to help the market without sacrificing privacy. Data providers, reporting systems, and users will all need to consider the guardrails needed to adhere to source data terms of use.   

Assessing Impact versus Risk

Another theme arising in nearly all discussions centered on assessing ESG investment decisions from the two sometimes competing dimensions of impact and risk and considering whether tradeoffs are needed to meet a wide variety of investment goals. Knowing the impact the investment is making—such as funding affordable housing or the reduction of greenhouse gas emissions—is fundamental to asset selection or understanding the overall ESG position.

But what risks/costs does the investment create for the portfolio? What is the likely influence on performance?

The credit aspect of a deal is distinct from its ESG impact. For example, a CMBS may be socially positive while rent regulation creates thin margins. Ideally, all would like to maximize positive impact, but not at the cost of performance, a strategy that may now be contributing to an erosion in greeniums. Disclosures and reporting capabilities should be able to support investment analyses along both dimensions.

A disclosure framework vetted and aligned by industry stakeholders, combined with robust reporting and analytics and access to as much underlying data as possible, will give investors and asset managers certainty as well as flexibility to meet their ESG goals.   


RiskSpan Wins Risk as a Service Category for Second Consecutive Year, Leaps 12 Spots in RiskTech100® 2022 Ranking

RiskSpan’s Edge Platform, which delivers risk analytics, data, and behavioral modeling to the structured finance industry, is the “Risk as a Service” category winner for the second consecutive year in Chartis Research’s prestigious RiskTech100® ranking of the world’s 100 top risk technology firms.

The win accompanies a 12-spot improvement in RiskSpan’s overall ranking, placing the firm among the year’s most significant movers.

“RiskSpan’s continued growth and ongoing partnership strategy have made it one of the big risers in the rankings this year,” said Phil Mackenzie, Research Principal at Chartis Research. “Its strength in securitization and analytics as a service is reflected in its 12-point jump.” 

Licensed by some of the largest asset managers, broker/dealers, hedge funds, mortgage REITs and insurance companies in the U.S., Edge is a one-stop shop for research, analytics, pricing, risk metrics, and reporting. Edge’s cloud-native infrastructure scales as individual client needs change and is supported by RiskSpan’s unparalleled team of mortgage and structured finance experts.  

“This year’s award reflects a year marked by an unprecedented wave of enhancements to our risk platform,” noted Bernadette Kogler, RiskSpan’s co-founder and CEO. “Our loan-level analytics has been a hit, while our fully managed risk option continues to tailor scalable offerings to individual client needs. Our best-in-class portfolio analytics for structured products are fast becoming the talk of the industry.”


About RiskSpan 

RiskSpan offers end-to-end solutions for data management, risk analytics, and visualization on a highly secure, fast, and fully scalable, cloud-native platform that has earned the trust of the mortgage and structured finance industry’s largest firms. Combining the strength of subject matter experts, quantitative analysts, and technologists, RiskSpan’s Edge Platform integrates a range of datasets – structured and unstructured – and off-the-shelf analytical tools providing users with powerful insights and a competitive advantage. Learn more at www.riskspan.com.


About Chartis Research: 

Chartis Research is the leading provider of research and analysis on the global market for risk technology. It is part of Infopro Digital, which owns market-leading brands such as Risk and WatersTechnology. Chartis’ goal is to support enterprises as they drive business performance through improved risk management, corporate governance and compliance, and to help clients make informed technology and business decisions by providing in-depth analysis and actionable advice on virtually all aspects of risk technology. 


Managing Market Risk for Crypto Currencies

 

Contents

Overview

Asset Volatility vs Asset Sensitivity to Benchmark (Beta)

Portfolio Asset Covariance

Value at Risk (VaR)

Bitcoin Futures: Basis and Proxies

Intraday Value at Risk (VaR)

Risk-Based Limits

VaR Validation (Bayesian Approach)

Scenario Analysis

Conclusion


Overview

Crypto currencies have now become part of institutional investment strategies. According to CoinShares, assets held under management by crypto managers reached $57B at the end of Q1 2021.  

Like any other financial asset, crypto investments are subject to market risk monitoring, and several approaches are evolving. Crypto currencies exhibit no obvious correlation to other asset classes, risk factors, or economic variables. However, they have exhibited high price volatility and have enough historical data to implement a robust market risk process.

In this paper we discuss approaches to implementing market risk analytics for a portfolio of crypto assets. We will look at betas to benchmarks, correlations, Value at Risk (VaR) and historical event scenarios. 

Value at Risk allows risk managers to implement risk-based limits structures, instead of relying on traditional notional measures. The methodology we propose enables consolidation of risk for crypto assets with the rest of the portfolio. We will also discuss the use of granular time horizons for intraday limit monitoring. 

Asset Volatility vs Asset Sensitivity to Benchmark (Beta)

For exchange-traded instruments, beta measures the sensitivity of asset price returns relative to a benchmark. For US-listed large cap stocks, beta is generally computed relative to the S&P 500 index. For crypto currencies, several eligible benchmark indices have emerged that represent the performance of the overall crypto currency market.

We analyzed several currencies against S&P’s Bitcoin Index (SPBTC). SPBTC is designed to track the performance of the original crypto asset, Bitcoin. As market capitalization for other currencies grows, it would be more appropriate to switch to a dynamic multi-currency index such as Nasdaq’s NCI. At the time of this paper, Bitcoin constituted 62.4% of NCI.

Traditionally, beta is calculated over a variable time frame using least squares fit on a linear regression of benchmark return and asset return. One of the issues with calculating betas is the variability of the beta itself. In order to overcome that, especially given the volatility of crypto currencies, we recommend using a rolling beta.
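As a minimal illustration of the rolling-beta calculation (using synthetic data, not one of the currencies analyzed here), the window-by-window regression slope reduces to rolling covariance over rolling variance:

```python
import numpy as np
import pandas as pd

def rolling_beta(asset_ret: pd.Series, bench_ret: pd.Series, window: int = 90) -> pd.Series:
    """Rolling least-squares beta of asset returns to a benchmark.

    Over each window, beta = cov(asset, bench) / var(bench), which equals
    the slope of the one-factor linear regression.
    """
    cov = asset_ret.rolling(window).cov(bench_ret)
    var = bench_ret.rolling(window).var()
    return cov / var

# Synthetic example: the asset is constructed with a true beta of 1.5.
rng = np.random.default_rng(42)
bench = pd.Series(rng.normal(0.0, 0.04, 500))       # benchmark daily returns
asset = 1.5 * bench + rng.normal(0.0, 0.01, 500)    # asset returns = beta * benchmark + noise
beta = rolling_beta(asset, bench, window=90)        # hovers near 1.5, varying window to window
```

The variability of the rolling estimate, visible even in this controlled example, is exactly why tracking confidence intervals alongside the betas matters.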

Due to the varying levels of volatility and liquidity of various crypto currencies, a regression model may not always be a good fit. In addition to tracking fit through R-squared, it is important to track confidence level for the computed betas.

Figure 1 History of Beta to S&P Bitcoin Index with Confidence Intervals

The chart above shows rolling betas and confidence intervals for four crypto currencies between January 2019 and July 2021. Beta and confidence interval both vary over time, and periods of high volatility (stress) cause larger dislocations in the value of beta.

Rolling betas can be used to generate a hierarchical distribution of expected asset values.

Portfolio Asset Covariance

Beta is a useful measure to track an asset’s volatility relative to a single benchmark. In order to numerically analyze the risk exposure (variance) of a portfolio with multiple crypto assets, we need to compute a covariance matrix. Portfolio risk is a function not only of each asset’s volatility but also of the cross-correlation among them.

Figure 2 Correlations for 11 currencies (calculated using observations from 2021)

The table above shows a correlation matrix across 11 crypto assets, including Bitcoin.

Like betas, correlations among assets change over time. But correlation matrices are more unwieldy to track over time than betas are. For this reason, hierarchical models provide a good, practical framework for time-varying covariance matrices.
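The portfolio-variance calculation this implies can be sketched with a hypothetical three-asset portfolio (the weights and covariances below are illustrative assumptions, not market data):

```python
import numpy as np

# Hypothetical daily returns for three crypto assets (rows = days).
rng = np.random.default_rng(7)
true_cov = np.array([[0.0016, 0.0008, 0.0004],
                     [0.0008, 0.0025, 0.0006],
                     [0.0004, 0.0006, 0.0009]])
returns = rng.multivariate_normal(mean=np.zeros(3), cov=true_cov, size=1000)

weights = np.array([0.5, 0.3, 0.2])                 # portfolio weights, summing to 1
cov_matrix = np.cov(returns, rowvar=False)          # sample covariance across the assets
port_variance = weights @ cov_matrix @ weights      # w' Sigma w
port_vol = float(np.sqrt(port_variance))            # daily portfolio volatility
```

The off-diagonal terms of the covariance matrix are what make portfolio risk different from a weighted sum of individual volatilities.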

Value at Risk (VaR)

The VaR for a position or portfolio can be defined as some threshold Τ (in dollars) such that, when the existing position faces market conditions resembling a given historical period, its P/L will be greater than Τ with probability k. Typically, k is chosen to be 99% or 95%.

To compute this threshold Τ, we need to:

  1. Set a significance percentile k, a market observation period, and holding period n.
  2. Generate a set of future market conditions (scenarios) from today to period n.
  3. Compute a P/L on the position for each scenario.

After computing each position’s P/L, we sum the P/L for each scenario and then rank the scenarios’ P/Ls to find the kth-percentile (worst) loss. This loss defines our VaR Τ at the kth percentile for observation-period length n.
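For a single position, these steps can be sketched as follows, using synthetic returns as a stand-in scenario set (a simplified illustration, not RiskSpan's production methodology):

```python
import numpy as np

def historical_var(position_value: float, hist_returns: np.ndarray, k: float = 0.99) -> float:
    """Historical-simulation VaR, reported as a positive dollar loss.

    Each observed return is one forward scenario; scenario P/L is the
    position revalued under that return, and VaR is the loss at the
    (1 - k) percentile of the ranked P/L distribution.
    """
    scenario_pnl = position_value * hist_returns             # P/L per scenario
    threshold = np.percentile(scenario_pnl, 100 * (1 - k))   # rank and take the kth-percentile loss
    return float(-threshold)

# Hypothetical scenario set: 1,000 daily returns at roughly crypto-like volatility.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.05, 1000)
var_99 = historical_var(1_000_000, returns, k=0.99)   # 99% 1-day VaR on a $1MM position
var_95 = historical_var(1_000_000, returns, k=0.95)   # always smaller than the 99% figure
```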

Determining what significance percentile k and observation length n to use is straightforward and often dictated by regulatory rules. For example, 99th-percentile 10-day VaR is used for risk-based capital under the Market Risk Rule. Generating the scenarios and computing P/L under them is open to interpretation. We cover each, along with its advantages and drawbacks, in the next two sections.

To compute VaR, we first need to generate projective scenarios of market conditions. Broadly speaking, there are two ways to derive this set of scenarios:

  1. Project future market conditions using historical (actual) changes in market conditions
  2. Project future market conditions using a Monte Carlo simulation framework

In this paper, we consider a historical simulation approach.

RiskSpan projects future market conditions using actual (observed) n-period changes in market conditions over the lookback period. For example, if we are computing 1-day VaR for regulatory capital usage under the Market Risk Rule, RiskSpan takes actual daily changes in risk factors. This approach allows our VaR scenarios to account for natural changes in correlation under extreme market moves, capturing changing correlations without the arbitrary overlay of deciding how correlations should change in extreme markets. This, in turn, more accurately captures VaR. Note that newer crypto currencies may not have enough data to generate a meaningful set of historical scenarios. In these cases, a benchmark adjusted by a short-term beta may be used as an alternative.

One key consideration for the historical simulation approach is the selection of the observation window or lookback period. Most regulatory guidelines require at least a one-year window. However, practitioners also recommend a shorter lookback period for highly volatile assets. In the chart below we illustrate how VaR for our portfolio of crypto currencies changes for a range of lookback periods and confidence intervals. Please note that VaR is expressed as a percentage of portfolio market value.

An exponentially weighted moving average (EWMA) methodology can help overcome the challenges associated with using a shorter lookback period. This approach emphasizes recent observations by using exponentially weighted moving averages of squared deviations. In contrast to equally weighted approaches, it attaches different weights to the past observations in the observation period. Because the weights decline exponentially, the most recent observations receive much more weight than earlier observations.
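A sketch of the weighting scheme (lam = 0.94, the classic RiskMetrics daily decay factor, is an assumed parameter here):

```python
import numpy as np

def ewma_volatility(returns: np.ndarray, lam: float = 0.94) -> float:
    """Exponentially weighted volatility of a return series.

    Weights decline by a factor of lam per observation, so recent
    squared deviations dominate regardless of the lookback length.
    """
    n = len(returns)
    weights = lam ** np.arange(n - 1, -1, -1)   # oldest observation gets the smallest weight
    weights /= weights.sum()
    return float(np.sqrt(np.sum(weights * returns ** 2)))

# A stress burst in the last 10 days lifts the EWMA estimate well above
# an equally weighted estimate over the same 250-day window.
calm_then_stress = np.concatenate([np.full(240, 0.01), np.full(10, 0.05)])
ewma_vol = ewma_volatility(calm_then_stress)
equal_vol = float(np.sqrt(np.mean(calm_then_stress ** 2)))
```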

Figure 3 Daily VaR as % of Market Value calculated using various historical observation periods

VaR as a single number does not represent the distribution of P/L outcomes. In addition to computing VaR under various confidence intervals, we also compute expected shortfall, worst loss, and standard deviation of simulated P/L vectors. Worst loss and standard deviation are self-explanatory while the calculation of expected shortfall is described below.

Expected shortfall is the average of all the P/L figures to the left of the VaR figure. If we have 1,000 simulated P/L vectors, and the VaR is the 950th worst case observation, the expected shortfall is the average of P/Ls from 951 to 1000. 
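In code, using a small deterministic P/L vector so the arithmetic is easy to check by hand:

```python
import numpy as np

def expected_shortfall(scenario_pnl: np.ndarray, k: float = 0.95) -> float:
    """Average loss beyond the VaR cutoff, reported as a positive number."""
    cutoff = np.percentile(scenario_pnl, 100 * (1 - k))   # the VaR threshold
    tail = scenario_pnl[scenario_pnl <= cutoff]           # scenarios at or beyond VaR
    return float(-tail.mean())

pnl = np.arange(-100, 0).astype(float)    # 100 ranked scenario P/Ls, worst = -100
es_95 = expected_shortfall(pnl, k=0.95)   # mean of the 5 worst losses: 98.0
```

Expected shortfall always sits at or beyond VaR, which is why it is the preferred tail measure when the loss distribution is fat-tailed.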

The table below presents VaR-related metrics as a percentage of portfolio market value under various lookback periods.

Figure 4 VaR for a portfolio of crypto assets computed for various lookback periods and confidence intervals

Bitcoin Futures: Basis and Proxies

One of the most popular trades for commodity futures is the basis trade, in which traders build a strategy around the difference between the spot price and the futures contract price of a commodity. Basis trades exist in corn, soybeans, oil, and, of course, Bitcoin.

For the purpose of calculating VaR, specific contracts may not provide enough history and risk systems use continuous contracts. Continuous contracts introduce additional basis as seen in the chart below. Risk managers need to work with the front office to align risk factor selection with trading strategies, without compromising independence of the risk process.

Figure 5 BTC/Futures basis difference between generic and active contracts

Intraday Value at Risk (VaR)

The highly volatile nature of crypto currencies requires another consideration for VaR calculations. A typical risk process is run at the end of the day, and VaR is calculated for a one-day or longer forecasting period. But crypto currencies, especially Bitcoin, can also show significant intraday price movements.

We obtained intraday prices for Bitcoin (BTC) from Gemini, which is ranked third by volume. This data was normalized to create time series to generate historical simulations. The chart below shows VaR as a percentage of market value for Bitcoin (BTC) for one-minute, one-hour and one-day forecasting periods. Our analysis shows that a Bitcoin position can lose as much as 3.5% of its value in one hour (99th percentile VaR).
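The mechanics of the intraday calculation can be sketched as follows; the one-minute price series here is synthetic (the actual analysis used Gemini exchange data):

```python
import numpy as np
import pandas as pd

# Hypothetical one-minute BTC prices: 30 days of a lognormal random walk.
rng = np.random.default_rng(3)
idx = pd.date_range("2021-07-01", periods=60 * 24 * 30, freq="min")
prices = pd.Series(30_000 * np.exp(np.cumsum(rng.normal(0, 0.0008, len(idx)))), index=idx)

def intraday_var(prices: pd.Series, horizon: str, k: float = 0.99) -> float:
    """Historical VaR, as a fraction of market value, for an intraday horizon.

    Prices are resampled to the forecasting horizon, converted to simple
    returns, and the kth-percentile worst return is reported as a loss.
    """
    horizon_returns = prices.resample(horizon).last().pct_change().dropna()
    return float(-np.percentile(horizon_returns, 100 * (1 - k)))

var_1min = intraday_var(prices, "1min")    # one-minute forecasting period
var_1hr = intraday_var(prices, "60min")    # one-hour forecasting period
```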

 

Risk-Based Limits 

Since its inception as a concept, Value at Risk has been used by companies to manage limits for trading units. VaR serves as a single risk-based limit metric with several advantages and a few challenges:

Pros of using VaR for risk-based limits:

  • VaR can be applied across all levels of portfolio aggregation.
  • Aggregations can be applied across varying exposures and strategies.
  • Today’s cloud scale makes it easy to calculate VaR using granular risk factor data.

The main caveat: VaR can be subject to model risk and manipulation. Transparency and the use of observable market risk factors help avoid this pitfall.

The ability to calculate intraday VaR is key to a risk-based limit implementation for crypto assets. Risk managers should consider at least an hourly VaR limit in addition to the traditional daily limits.

VaR Validation (Bayesian Approach)

Standard approaches for back-testing VaR are applicable to portfolios of crypto assets as well.

Given the volatile nature of this asset class, we also explored an approach to validating the confidence interval and percentiles implied from historical simulations. Although this is a topic that deserves its own document, we present a high-level explanation and results of our analysis.

Building on an approach first proposed in the pyfolio library, we generated a posterior distribution from our historically observed VaR simulations using the PyMC3 package.

Sampling routines from PyMC3 were used to generate 10,000 simulations of the 3-year lookback case. A distribution of percentiles (VaR) was then computed across these simulations.

The distribution shows that the mean 95th percentile VaR would be 7.3% vs 8.9% calculated using the historical simulation approach. However, the tail of the distribution indicates a VaR closer to the historical simulation approach. One could conclude that the test indicates that the original calculation still represents the extreme case, which is the motivation behind VaR.
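The same idea can be approximated without PyMC3 by bootstrap resampling. The sketch below is a simplified stand-in for the paper's posterior sampler, with an assumed heavy-tailed return history rather than the actual portfolio data:

```python
import numpy as np

# Bootstrap resampling of a historical P/L vector yields a distribution
# of the 95th-percentile VaR rather than a single point estimate.
rng = np.random.default_rng(11)
hist_pnl = rng.standard_t(df=4, size=750) * 0.03   # heavy-tailed daily returns, ~3-year lookback

n_sims = 10_000
var_draws = np.empty(n_sims)
for i in range(n_sims):
    resample = rng.choice(hist_pnl, size=hist_pnl.size, replace=True)
    var_draws[i] = -np.percentile(resample, 5)     # 95% VaR of each resampled history

mean_var = float(var_draws.mean())                 # central estimate of VaR
tail_var = float(np.percentile(var_draws, 99))     # conservative tail of the VaR distribution
```

Comparing the mean of the draws with their tail reproduces the qualitative finding above: the point estimate understates what the extreme of the VaR distribution implies.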

Figure 6 Distribution of percentiles generated from posterior simulations

Scenario Analysis

In addition to standard shock scenarios, we recommend using the volatility of Bitcoin to construct a simulation of outcomes. The chart below shows the change in Bitcoin (BTC) volatility for select events in the last two years. Outside of standard macro events, crypto assets respond to cyber security events and media effects, including social media.

Figure 7 Weekly observed volatility for Bitcoin  

Conclusion

Given the volatility of crypto assets, we recommend, to the extent possible, a probability-distribution approach. At the very least, risk managers should monitor changes in the relationship (beta) of assets to their benchmarks.

For most financial institutions, crypto assets are part of portfolios that include other traditional asset classes. A standard approach must be used across all asset classes, which may make it challenging to apply shorter lookback windows for computing VaR. The exponentially weighted moving average approach described above may be considered.

Intraday VaR for this asset class can be significant, and risk managers should set appropriate limits to manage downside risk.

Idiosyncratic risks associated with this asset class have created a need for monitoring scenarios not necessarily applicable to other asset classes. For this reason, more scenarios pertaining to cyber risk are beginning to be applied across other asset classes.  



Automating Compliance Risk Analytics

 Recorded: August 4th | 1:00 p.m. EDT

Completing the risk sections of Form PF, AIFMD, Open Protocol and other regulatory filings requires submitters to first compute an extensive battery of risk analytics, often across a wide spectrum of trading strategies and instrument types. This “pre-work” is both painstaking and prone to human error. Automating these upstream analytics greatly simplifies life downstream for those tasked with completing these filings.

RiskSpan’s Marty Kindler walks through a process for streamlining delta-equivalent exposure, 10-year bond-equivalent exposure, DV01/CS01, option Greeks, stress scenario impacts, and VaR in support not only of downstream regulatory filings but of an enhanced overall risk management regime.


Featured Speaker

Martin Kindler

Managing Director, RiskSpan


May 19 Workshop: Quality Control Using Anomaly Detection (Part 2)

Recorded: May 19 | 1:00 p.m. ET

Last month, RiskSpan’s Suhrud Dagli and Martin Kindler outlined the principles underlying anomaly detection and its QC applications related to market data and market risk. You can view a recording of that workshop here.

On Wednesday, May 19th, Suhrud presented Part 2 of this workshop, which dove into mortgage loan QC and introduced coding examples and approaches for avoiding false negatives using open-source Python algorithms in the Anomaly Detection Toolkit (ADTK).

RiskSpan presented various types of detectors, including extreme studentized deviate (ESD), level shift, local outliers, seasonal detectors, and volatility shift, in the context of identifying spike anomalies and other inconsistencies in mortgage data. Specifically:

  • Coding examples for effective principal component analysis (PCA) loan data QC
  • Use cases around loan performance and entity correction, and
  • Novelty detection
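As a flavor of the approach, a minimal robust z-score spike detector written with plain NumPy (a stand-in for ADTK's ESD-style detectors, not the library's actual API):

```python
import numpy as np

def detect_spikes(series: np.ndarray, threshold: float = 4.0) -> np.ndarray:
    """Flag spike anomalies with robust z-scores.

    Median and MAD are used so the spikes being hunted do not inflate
    the dispersion estimate used to find them.
    """
    median = np.median(series)
    mad = np.median(np.abs(series - median))
    robust_z = 0.6745 * (series - median) / mad   # 0.6745 rescales MAD to sigma under normality
    return np.abs(robust_z) > threshold

# A clean note-rate series with two injected bad ticks.
rng = np.random.default_rng(5)
rates = rng.normal(3.0, 0.05, 300)
rates[40], rates[200] = 9.9, -1.0
flags = detect_spikes(rates)   # flags the two injected outliers
```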

Suhrud Dagli

Co-founder and CIO, RiskSpan

Martin Kindler

Managing Director, RiskSpan



April 28 Workshop: Anomaly Detection

Recorded: April 28 | 1:00 p.m. ET

Outliers and anomalies refer to various types of occurrences in a time series. Spikes in value, shifts in level or volatility, and changes in seasonal pattern are common examples. Anomaly detection depends on the specific context.

In this month’s installment in our Data and Machine Learning Workshop Series, RiskSpan Co-Founder & CIO Suhrud Dagli is joined by Martin Kindler, a market risk practitioner who has spent decades dealing with outliers.

Suhrud and Martin explore unsupervised approaches for detecting anomalies.

Suhrud Dagli

Co-founder and CIO, RiskSpan

Martin Kindler

Managing Director, RiskSpan



RiskSpan’s Edge Platform Wins 2021 Buy-Side Market Risk Management Product of the Year

RiskSpan, a leading SaaS provider of risk management, data, and analytics, has been awarded Buy-Side Market Risk Management Product of the Year for its Edge Platform at Risk.net’s 2021 Risk Markets Technology Awards. The honor marks Edge’s second major industry award in 2021; the platform was also named the winner of Chartis Research’s Risk-as-a-Service category.

Licensed by some of the largest asset managers and insurance companies in the U.S., the Edge Platform derives a significant component of its value from its ability to serve as a one-stop shop for research, pre-trade analytics, pricing and risk quantification, and reporting. Edge’s cloud-native infrastructure allows RiskSpan clients to scale as needs change and is supported by RiskSpan’s unparalleled team of domain experts — seasoned practitioners who know the needs and pain points of the industry firsthand.

Adjudicators cited the platform’s “strong data management and overall technology” and “best-practice quant design for MBS, structured products and loans” as key factors in the designation.


Edge’s flexible configurability enables users to create custom views of their portfolio or potential trades at any level of granularity and down to the loan level. The platform enables researchers and analysts to integrate conventional and alternative data from an impressive array of sources to identify impacts that might otherwise go overlooked.

For clients requiring a fully supported risk-analytics-as-a-service offering, the Edge Platform provides a comprehensive data analysis, predictive modeling, portfolio benchmarking and reporting solution tailored to individual client needs.

An optional studio-level tier incorporates machine learning and data scientist support in order to leverage unstructured and alternative datasets in the analysis.


Contact us to learn how Edge’s capabilities can transform your mortgage and structured product analytics. 

Learn more about Edge at https://riskspan.com/edge-platform/ 

