Articles Tagged with: Innovation and Alternative Data

MSR & Loan Trading Insights

RiskSpan’s Edge Platform is the leading comprehensive data and mortgage analytics platform tailored for residential whole loan and MSR trading, empowering investors with advanced cloud technology and AI. By streamlining loan and MSR data management, providing customizable historical performance insights, and powering robust valuation and risk analysis, RiskSpan’s Edge Platform automates complex data tasks and identifies critical, loan-level insights.

Looking for an edge? Read our latest whole loan trading and MSR-related insights below.



What do 2023 Originations Mean for MSRs?

Are you investing in MSRs or considering doing so in the near future? If so, understanding current origination trends and loan characteristics is a critical component of predicting future MSR performance and prepayment risk. Read our latest research post, which looks into key characteristics of 2023 originations.


Why Accurate Loan Pool and MSR Cost Forecasting Requires Loan-by-Loan Analytics

Loan cohorting has been a useful strategy to limit the computational power necessary to run simulations. But advances in cloud computing and the increasing heterogeneity of loan and MSR portfolios mean better methods are now available.


5 foundational steps for investors to move towards loan-level analyses

It’s critical to leverage your full spectrum of data and run analyses at the loan level rather than cohorting. But what does it take to make the switch to loan-level analytics? Our team has put together a short set of recommendations and considerations for how to tackle an otherwise daunting project.


It’s time to move to DaaS — Why it matters for loan and MSR investors

The ability to analyze loan-level granular data is fast becoming the difference between profitable trades and near misses… but operating at the loan level means wading through an ocean of data. Learn about how you can get the most out of your data.


Case Study: How one investor moved to loan-level analysis while reducing costs

Are you looking to optimize investment decisions while reducing costs? Discover how one loan and MSR investor transformed their analytics using RiskSpan, in our latest case study.


Whitepaper: Improving the precision of MSR pricing using loan-level analytics

Incorporating both credit and prepayment modeling into an MSR valuation regime requires a loan-by-loan approach to capture the necessary level of granularity, but performing such an analysis has been historically viewed as impractical. Read RiskSpan’s deep-dive whitepaper to explore how today’s cloud-based, loan-level technology can make this not only practical, but cost effective.

Interested in learning about RiskSpan’s Edge Platform?



RS Edge Platform Implementation Streamlined Processes, Reducing Client Resource Support Needs by 46%

Asset Manager | New York, NY

RiskSpan Applications Provided

  • Market Risk Analytics
  • Models & Forecasting
  • Model Validation
  • Governance

ABOUT THE CLIENT

A leading provider of capital and services to the mortgage and financial services industries that leverages its proven investment expertise to identify and invest in assets offering attractive risk-adjusted returns, while protecting its existing portfolio and generating long-term value for its investors.


PROBLEM

An asset manager sought to replace an inflexible risk system provided by a Wall Street dealer. ​The portfolio was diverse, with a sizable concentration in structured securities and mortgage assets. ​

The legacy analytics system was rigid, with no flexibility to vary scenarios or to customize critical investor and regulatory reporting.


CHALLENGE

Lacked a single solution

Data integrity issues

Inflexible locally installed risk management system

No direct connectivity to downstream systems

Models + Data management = End-to-end Managed Process


HIGHLIGHTS


  • 5 Vendors → Single Platform
  • 32% Annual Cost Savings
  • Increased Flexibility
  • Additional Ad-hoc Analytics



SOLUTION

RiskSpan’s Edge Platform delivered a cost-efficient and flexible solution by bundling required data feeds, predictive models for mortgage and structured products, and infrastructure management. ​

The Platform manages and validates the asset manager’s third-party and portfolio data and produces scenario analytics in a secure hosted environment.


TESTIMONIAL

”Our existing daily process for calculating, validating, and reporting on key market and credit risk metrics required significant manual work… [Edge] gets us to the answers faster, putting us in a better position to identify exposures and address potential problems.” 

          — Managing Director, Securitized Products


EDGE PROVIDED

END-TO-END DATA AND RISK MANAGEMENT PLATFORM 

  • Scalable, cloud-native technology
  • Increased flexibility to run analytics at loan level; additional interactive / ad-hoc analytics
  • Reliable, accurate data with frequent updates

COST AND OPERATIONAL EFFICIENCIES GAINED

  • Streamlined workflows | Automated processes
  • 32% annual cost savings
  • 46% fewer resources needed for maintenance





Mortgage Data and the Cloud – Now is the Time

As the trend toward cloud computing continues its march across an ever-expanding set of industries, it is worth pausing briefly to contemplate how it can benefit those of us who work with mortgage data for a living.  

The inherent flexibility, efficiency, and scalability afforded by the cloud-native systems driving this trend are clearly of value to users of financial services data. Mortgages in particular, each accompanied by a dizzying array of static and dynamic data about borrower incomes, employment, assets, property valuations, payment histories, and detailed loan terms, stand to reap the benefits of this new form of computing.

And yet, many of my colleagues still catch themselves referring to mortgage data files as “tapes.” 

Migrating to the cloud evokes some of the shiniest words in the world of computing – cost reduction, security, reliability, agility – and that undoubtedly creates a stir. The cloud’s ability to provide on-demand access to servers, storage, databases, software, and applications via the internet, along with the promise to ‘only pay for what you use,’ further contributes to its popularity.

These benefits are especially well suited to mortgage data. They include:  

  • On-demand self-service and the ability to provision resources without human intervention – of particular use for mortgage portfolios that are constantly changing in both size and composition. 
  • Broad network access, with diverse platforms able to access shared resources over the network – valuable when origination, secondary marketing, structuring, servicing, and modeling tools all need simultaneous access to the same evolving datasets for different purposes. 
  • Multi-tenancy and resource pooling, allowing resource sharing while maintaining privacy and security. 
  • Rapid elasticity and scalability, quickly acquiring and releasing resources to allow fast but measured scaling based on demand. 

Cloud-native systems reduce ownership and operational expenses, increase speed and agility, facilitate innovation, improve client experience, and even enhance security controls. 

There is nothing quite like mortgage portfolios when it comes to massive quantities of financial data, often PII-laden, with high security requirements. The responsibility for protecting borrower privacy is the most frequently cited reason for financial institutions’ reluctance to adopt the cloud. But perhaps counterintuitively, migrating on-premises applications to the cloud actually results in a more controlled environment, as it provides for backup and access protocols that are not as easily implemented with on-premises solutions.

The cloud affords a sophisticated and more efficient way of securing mortgage data. In addition to eliminating costs associated with running and maintaining data centers, the cloud enables easy and fast access to data and applications anywhere and at any time. As remote work takes hold as a more long-term norm, cloud-native platforms help ensure employees can work effectively regardless of their location. Furthermore, the scalability of cloud-native data centers allows holders of mortgage assets to expand storage capabilities as the portfolio grows and reduce them when it contracts. The cloud protects mortgage data from security breaches or disaster events, because the loan files are (by definition) backed up in a secure, remote location and easily restored without having to invest in expensive data retrieval methods.

This is not to say that migrating to the cloud is without its challenges. Entrusting sensitive data to a new third-party partner and relying on its tech to remain online will always carry some measure of risk. Cloud computing, like any other innovation, comes with its own advantages and disadvantages, but redundancies mitigate virtually all of these uncertainties. Ultimately, the upside of being able to work with mortgage data on cloud-native solutions far outweighs the drawbacks. The cloud makes it possible for processes to become more efficient in real time, without having to undergo expensive hardware enhancements. This in turn creates a more productive environment for data analysts and modelers seeking to give portfolio managers, servicers, securitizers, and others who routinely deal with mortgage assets the edge they are looking for.

Kriti Asrani is an associate data analyst at RiskSpan.


Want to read more on this topic? Check out COVID-19 and the Cloud.


Managing Market Risk for Crypto Currencies

 

Contents

Overview

Asset Volatility vs Asset Sensitivity to Benchmark (Beta)

Portfolio Asset Covariance

Value at Risk (VaR)

Bitcoin Futures: Basis and Proxies

Intraday Value at Risk (VaR)

Risk-Based Limits

VaR Validation (Bayesian Approach)

Scenario Analysis

Conclusion


Overview

Crypto currencies have now become part of institutional investment strategies. According to CoinShares, assets held under management by crypto managers reached $57B at the end of Q1 2021.  

Like any other financial asset, crypto investments are subject to market risk monitoring, with several approaches evolving. Crypto currencies exhibit no obvious correlation to other asset classes, risk factors, or economic variables. However, crypto currencies have exhibited high price volatility and have enough historical data to implement a robust market risk process.

In this paper we discuss approaches to implementing market risk analytics for a portfolio of crypto assets. We will look at betas to benchmarks, correlations, Value at Risk (VaR) and historical event scenarios. 

Value at Risk allows risk managers to implement risk-based limits structures, instead of relying on traditional notional measures. The methodology we propose enables consolidation of risk for crypto assets with the rest of the portfolio. We will also discuss the use of granular time horizons for intraday limit monitoring. 

Asset Volatility vs Asset Sensitivity to Benchmark (Beta)

For exchange-traded instruments, beta measures the sensitivity of asset price returns relative to a benchmark. For US-listed large cap stocks, beta is generally computed relative to the S&P 500 index. For crypto currencies, several eligible benchmark indices have emerged that represent the performance of the overall crypto currency market.

We analyzed several currencies against S&P’s Bitcoin Index (SPBTC). SPBTC is designed to track the performance of the original crypto asset, Bitcoin. As market capitalization for other currencies grows, it would be more appropriate to switch to a dynamic multi-currency index such as Nasdaq’s NCI. At the time of this paper, Bitcoin constituted 62.4% of NCI.

Traditionally, beta is calculated over a variable time frame using least squares fit on a linear regression of benchmark return and asset return. One of the issues with calculating betas is the variability of the beta itself. In order to overcome that, especially given the volatility of crypto currencies, we recommend using a rolling beta.

Due to the varying levels of volatility and liquidity of various crypto currencies, a regression model may not always be a good fit. In addition to tracking fit through R-squared, it is important to track confidence level for the computed betas.

Figure 1: History of Beta to S&P Bitcoin Index with Confidence Intervals

The chart above shows rolling betas and confidence intervals for four crypto currencies between January 2019 and July 2021. Beta and confidence interval both vary over time, and periods of high volatility (stress) cause larger dislocations in the value of beta.

Rolling betas can be used to generate a hierarchical distribution of expected asset values.
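For illustration, rolling betas and their confidence bands can be computed directly from rolling moments. The sketch below is a minimal Python example, not RiskSpan’s production code; it assumes `asset` and `bench` are pandas Series of daily prices, and the 90-day window and 95% bands are illustrative choices.

```python
import numpy as np
import pandas as pd

def rolling_beta(asset: pd.Series, bench: pd.Series, window: int = 90) -> pd.DataFrame:
    """Rolling OLS beta of asset returns to benchmark returns, with 95% bands."""
    ra = asset.pct_change().dropna()
    rb = bench.pct_change().dropna()
    ra, rb = ra.align(rb, join="inner")
    beta = ra.rolling(window).cov(rb) / rb.rolling(window).var()
    corr = ra.rolling(window).corr(rb)
    # Standard error of the OLS slope implied by the rolling correlation
    se = (beta / corr).abs() * np.sqrt((1 - corr**2) / (window - 2))
    return pd.DataFrame({"beta": beta,
                         "lower_95": beta - 1.96 * se,
                         "upper_95": beta + 1.96 * se})
```

Tracking the width of these bands over time surfaces the periods where the regression fit deteriorates, as noted above.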

Portfolio Asset Covariance

Beta is a useful measure to track an asset’s volatility relative to a single benchmark. In order to numerically analyze the risk exposure (variance) of a portfolio with multiple crypto assets, we need to compute a covariance matrix. Portfolio risk is a function not only of each asset’s volatility but also of the cross-correlation among them.

Figure 2: Correlations for 11 currencies (calculated using observations from 2021)

The table above shows a correlation matrix across 11 crypto assets, including Bitcoin.

Like betas, correlations among assets change over time. But correlation matrices are more unwieldy to track over time than betas are. For this reason, hierarchical models provide a good, practical framework for time-varying covariance matrices.
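As a simple illustration of how the covariance matrix feeds portfolio-level risk, the sketch below computes an annualized covariance matrix from daily returns and the implied portfolio volatility (the quadratic form w′Σw). The `portfolio_vol` helper and the equal weighting in the usage comment are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

def portfolio_vol(prices: pd.DataFrame, weights: np.ndarray) -> float:
    """Annualized portfolio volatility from daily closes (one column per asset)."""
    returns = prices.pct_change().dropna()
    cov = returns.cov() * 365          # crypto trades every calendar day
    return float(np.sqrt(weights @ cov.values @ weights))

# Equal-weighted example across all columns of a `prices` DataFrame:
# w = np.full(prices.shape[1], 1 / prices.shape[1])
# print(portfolio_vol(prices, w))
```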

Value at Risk (VaR)

The VaR for a position or portfolio can be defined as some threshold Τ (in dollars) where the existing position, when faced with market conditions resembling some given historical period, will have P/L greater than Τ with probability k. Typically, k  is chosen to be 99% or 95%.

To compute this threshold Τ, we need to:

  1. Set a significance percentile k, a market observation period, and a holding period n.
  2. Generate a set of future market conditions (scenarios) from today to period n.
  3. Compute a P/L on the position for each scenario.

After computing each position’s P/L, we sum the P/L for each scenario and then rank the scenarios’ P/Ls to find the kth percentile (worst) loss. This loss defines our VaR Τ at the kth percentile for observation-period length n.

Determining what significance percentile k and observation length n to use is straightforward and often dictated by regulatory rules. For example, 99th percentile 10-day VaR is used for risk-based capital under the Market Risk Rule. Generating the scenarios and computing P/L under these scenarios is open to interpretation. We cover each of these, along with their advantages and drawbacks, in the next two sections.

To compute VaR, we first need to generate projective scenarios of market conditions. Broadly speaking, there are two ways to derive this set of scenarios:

  1. Project future market conditions using historical (actual) changes in market conditions
  2. Project future market conditions using a Monte Carlo simulation framework

In this paper, we consider a historical simulation approach.

RiskSpan projects future market conditions using actual (observed) n-period changes in market conditions over the lookback period. For example, if we are computing 1-day VaR for regulatory capital usage under the Market Risk Rule, RiskSpan takes actual daily changes in risk factors. This approach allows our VaR scenarios to account for natural changes in correlation under extreme market moves. RiskSpan finds this to be a more natural way of capturing changing correlations without the arbitrary overlay of how to change correlations in extreme market moves. This, in turn, will more accurately capture VaR. Please note that newer crypto currencies may not have enough data to generate a meaningful set of historical scenarios. In these cases, using a benchmark adjusted by a short-term beta may be used as an alternative.
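A bare-bones historical-simulation VaR along these lines might look like the sketch below, which applies observed n-period return scenarios to current position values. The `historical_var` helper and its inputs are illustrative assumptions, not RiskSpan’s implementation.

```python
import numpy as np
import pandas as pd

def historical_var(scenario_returns: pd.DataFrame, position_mv: np.ndarray,
                   k: float = 0.99) -> float:
    """VaR from historical scenarios.

    scenario_returns: one row per historical n-period change, one column per asset
    position_mv: current market value of each position
    """
    pnl = scenario_returns.values @ position_mv   # portfolio P/L per scenario
    return -np.percentile(pnl, 100 * (1 - k))     # loss at the kth percentile
```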

One key consideration for the historical simulation approach is the selection of the observation window or lookback period. Most regulatory guidelines require at least a one-year window. However, practitioners also recommend a shorter lookback period for highly volatile assets. In the chart below we illustrate how VaR for our portfolio of crypto currencies changes for a range of lookback periods and confidence intervals. Please note that VaR is expressed as a percentage of portfolio market value.

An exponentially weighted moving average methodology can be used to overcome the challenges associated with a shorter lookback period. This approach emphasizes recent observations by using exponentially weighted moving averages of squared deviations. In contrast to equally weighted approaches, it attaches different weights to the past observations contained in the observation period. Because the weights decline exponentially, the most recent observations receive much more weight than earlier observations.
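A minimal sketch of this weighting scheme is below; the decay factor λ = 0.94 is the classic RiskMetrics choice and is an illustrative assumption, not a recommendation made in this paper.

```python
import numpy as np

def ewma_vol(returns, lam: float = 0.94) -> float:
    """Exponentially weighted volatility: weight (1-λ)·λ^i on the i-th most
    recent squared deviation, renormalized over the finite window."""
    r = np.asarray(returns)[::-1]                 # most recent observation first
    w = (1 - lam) * lam ** np.arange(len(r))
    w /= w.sum()
    return float(np.sqrt(np.sum(w * r**2)))
```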

Figure 3: Daily VaR as % of market value calculated using various historical observation periods

VaR as a single number does not represent the distribution of P/L outcomes. In addition to computing VaR under various confidence intervals, we also compute expected shortfall, worst loss, and standard deviation of simulated P/L vectors. Worst loss and standard deviation are self-explanatory while the calculation of expected shortfall is described below.

Expected shortfall is the average of all the P/L figures to the left of the VaR figure. If we have 1,000 simulated P/L vectors, and the VaR is the 950th worst case observation, the expected shortfall is the average of P/Ls from 951 to 1000.
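That ranking logic translates directly to code. The sketch below returns VaR and expected shortfall from a vector of simulated P/Ls; the function name and 95% default are illustrative.

```python
import numpy as np

def var_and_es(pnl: np.ndarray, k: float = 0.95):
    """VaR and expected shortfall from simulated P/L vectors."""
    ranked = np.sort(pnl)                          # worst (most negative) first
    cutoff = int(np.ceil((1 - k) * len(ranked)))   # e.g., 50 of 1,000 at k=95%
    var = -ranked[cutoff - 1]                      # loss at the VaR observation
    es = -ranked[:cutoff].mean()                   # average of losses beyond VaR
    return var, es
```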


The table below presents VaR-related metrics as a percentage of portfolio market value under various lookback periods.

Figure 4: VaR for a portfolio of crypto assets computed for various lookback periods and confidence intervals

Bitcoin Futures: Basis and Proxies

One of the most popular trades in commodity futures is the basis trade, in which traders build a strategy around the difference between a commodity’s spot price and its futures contract price. Basis trades exist in corn, soybeans, oil and, of course, Bitcoin. For the purpose of calculating VaR, specific contracts may not provide enough history, so risk systems use continuous contracts. Continuous contracts introduce additional basis, as seen in the chart below. Risk managers need to work with the front office to align risk factor selection with trading strategies, without compromising the independence of the risk process.

Figure 5: BTC/Futures basis difference between generic and active contracts

Intraday Value at Risk (VaR)

The highly volatile nature of crypto currencies requires another consideration for VaR calculations. A typical risk process is run at the end of the day, and VaR is calculated for a one-day or longer forecasting period. But crypto currencies, especially Bitcoin, can also show significant intraday price movements.

We obtained intraday prices for Bitcoin (BTC) from Gemini, which is ranked third by volume. This data was normalized to create time series to generate historical simulations. The chart below shows VaR as a percentage of market value for Bitcoin (BTC) for one-minute, one-hour and one-day forecasting periods. Our analysis shows that a Bitcoin position can lose as much as 3.5% of its value in one hour (99th percentile VaR).

Figure: Bitcoin VaR as a percentage of market value for one-minute, one-hour, and one-day forecasting periods
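Assuming a minute-frequency price series with a DatetimeIndex, these intraday horizons can be approximated by resampling to the desired bar size and applying the same historical-simulation percentile. The `intraday_var` helper below is an illustrative sketch, not the exact methodology behind the chart above.

```python
import numpy as np
import pandas as pd

def intraday_var(px: pd.Series, horizon: str = "1H", k: float = 0.99) -> float:
    """VaR over an intraday horizon from minute-level prices."""
    bars = px.resample(horizon).last().dropna()   # e.g., "1T", "1H", "1D"
    rets = bars.pct_change().dropna()
    return -np.percentile(rets, 100 * (1 - k))    # as % of position value
```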

 

Risk-Based Limits 

Since its inception as a concept, Value at Risk has been used by companies to manage limits for trading units. VaR serves as a single risk-based limit metric with several advantages and a few challenges:

Pros of using VaR for risk-based limits:

  • VaR can be applied across all levels of portfolio aggregation.
  • Aggregations can be applied across varying exposures and strategies.
  • Today’s cloud scale makes it easy to calculate VaR using granular risk factor data.

VaR can be subject to model risk and manipulation. Transparency and use of market risk factors can avoid this pitfall.

The ability to calculate intraday VaR is key to a risk-based limit implementation for crypto assets. Risk managers should consider at least an hourly VaR limit in addition to the traditional daily limits.

VaR Validation (Bayesian Approach)

Standard approaches for back-testing VaR are applicable to portfolios of crypto assets as well.

Given the volatile nature of this asset class, we also explored an approach to validating the confidence interval and percentiles implied from historical simulations. Although this is a topic that deserves its own document, we present a high-level explanation and results of our analysis.

Building on an approach first proposed in the Pyfolio library, we generated a posterior distribution using the Pymc3 package from our historically observed VaR simulations.

Sampling routines from Pymc3 were used to generate 10,000 simulations of the 3-year lookback case. A distribution of percentiles (VaR) was then computed across these simulations.
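The sketch below illustrates the general shape of such an exercise in Pymc3: fit a fat-tailed return distribution, then read the 95th-percentile loss off each posterior-predictive draw. The model, priors, and sample counts are illustrative assumptions, not the exact specification behind Figure 6.

```python
import numpy as np
import pymc3 as pm

# daily_returns: 1-D array of observed portfolio returns (assumed loaded elsewhere)
with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=0.05)
    sigma = pm.HalfNormal("sigma", sigma=0.1)
    nu = pm.Exponential("nu", 1 / 10.0)            # degrees of freedom: fat tails
    pm.StudentT("obs", nu=nu, mu=mu, sigma=sigma, observed=daily_returns)
    trace = pm.sample(2000, tune=1000, cores=2)
    ppc = pm.sample_posterior_predictive(trace, samples=10000)

# Distribution of the 95% VaR (as a return) across posterior draws
var_dist = -np.percentile(ppc["obs"], 5, axis=1)
```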

The distribution shows that the mean 95th percentile VaR would be 7.3% vs 8.9% calculated using the historical simulation approach. However, the tail of the distribution indicates a VaR closer to the historical simulation approach. One could conclude that the test indicates that the original calculation still represents the extreme case, which is the motivation behind VaR.

Figure 6: Distribution of percentiles generated from posterior simulations

Scenario Analysis

In addition to standard shock scenarios, we recommend using the volatility of Bitcoin to construct a simulation of outcomes. The chart below shows the change in Bitcoin (BTC) volatility for select events in the last two years. Outside of standard macro events, crypto assets respond to cyber security events and media effects, including social media.

Figure 7: Weekly observed volatility for Bitcoin

Conclusion

Given the volatility of crypto assets, we recommend, to the extent possible, a probability distribution approach. At the very least, risk managers should monitor changes in the relationship (beta) of assets to their benchmark.

For most financial institutions, crypto assets are part of portfolios that include other traditional asset classes. A standard approach must be used across all asset classes, which may make it challenging to apply shorter lookback windows for computing VaR. Use of the exponentially weighted moving average approach (described above) may be considered.

Intraday VaR for this asset class can be significant, and risk managers should set appropriate limits to manage downside risk.

Idiosyncratic risks associated with this asset class have created a need for monitoring scenarios not necessarily applicable to other asset classes. For this reason, more scenarios pertaining to cyber risk are beginning to be applied across other asset classes.  


Related Article

Calculating VaR: A Review of Methods


RiskSpan Named to Inaugural STORM50 Ranking by Chartis Research – Winner of “A.I. Innovation in Capital Markets”

Chartis Research has named RiskSpan to its Inaugural “STORM50” Ranking of leading risk and analytics providers. The STORM report “focuses on the computational infrastructure and algorithmic efficiency of the vast array of technology tools used across the financial services industry” and identifies industry-leading vendors that excel in the delivery of Statistical Techniques, Optimization frameworks, and Risk Models of all types.


RiskSpan’s flagship Edge Platform was a natural fit for the designation because of its positioning squarely at the nexus of statistical behavioral modeling (specifically around mortgage credit and prepayment risk) and functionality enabling users to optimize trading and asset management strategies.  Being named the winner of the “A.I. Innovation in Capital Markets” solutions category reflects the work of RiskSpan’s vibrant innovation lab, which includes researching and developing machine learning solutions to structured finance challenges. These solutions include mining a growing trove of alternative/unstructured data sources, anomaly detection in loan-level and other datasets, and natural language processing for constructing deal cash flow models from legal documents.

Learn more about the Edge Platform or contact us to discuss ways we might help you modernize and improve your mortgage and structured finance data and analytics.


Automating Compliance Risk Analytics

 Recorded: August 4th | 1:00 p.m. EDT

Completing the risk sections of Form PF, AIFMD, Open Protocol and other regulatory filings requires submitters to first compute an extensive battery of risk analytics, often across a wide spectrum of trading strategies and instrument types. This “pre-work” is both painstaking and prone to human error. Automating these upstream analytics greatly simplifies life downstream for those tasked with completing these filings.

RiskSpan’s Marty Kindler walks through a process for streamlining delta equivalent exposure, 10-year bond equivalent exposure, DV01/CS01, option Greeks, stress scenario impacts, and VaR in support not only of downstream regulatory filings but of an enhanced overall risk management regime.


Featured Speaker

Martin Kindler

Managing Director, RiskSpan


Is Your Enterprise Risk Management Keeping Up with Recent Regulatory Changes?

Recorded: June 30th | 1:00 p.m. EDT

Nick Young, Head of RiskSpan’s Model Risk Management Practice, and his team of model validation analysts walk through the most important regulatory updates of the past 18 months from the Federal Reserve, OCC, and FDIC pertaining to enterprise risk management in general (and model risk management in particular).

Nick’s team presents tips for ensuring that your policies and practices are keeping up with recent changes to AML and other regulatory requirements.


Featured Speakers

Nick Young

Head of Model Risk Management, RiskSpan


Data & Machine Learning Workshop Series

RiskSpan’s Edge Platform is supported by a dynamic team of professionals who live and breathe mortgage and structured finance data. They know firsthand the challenges this type of data presents and are always experimenting with new approaches for extracting maximum value from it.

In this series of complimentary workshops our team applies machine learning and other innovative techniques to data that asset managers, broker-dealers and mortgage bankers care about.


Check out our recorded workshops


Measuring and Visualizing Feature Impact & Machine Learning Model Materiality

RiskSpan CIO Suhrud Dagli demonstrates in greater detail how machine learning can be used in input data validations, to measure feature impact, and to visualize how multiple features interact with each other.

Structured Data Extraction from Images Using Google Document AI

RiskSpan Director Steven Sun shares a procedural approach to tackling the difficulties of efficiently extracting structured data from images, scanned documents, and handwritten documents using Google’s latest Document AI Solution.

Pattern Recognition in Time Series Data

Traders and investors rely on time series patterns generated by asset performance to inform and guide their trading and asset allocation decisions. Economists take advantage of analogous patterns in macroeconomic and market data to forecast recessions and other market events. But you need to be able to spot these patterns in order to use them.

Advanced Forecasting Using Hierarchical Models

Traditional statistical models apply a single set of coefficients, either by pooling a large dataset or by fitting specific cohorts. Hierarchical models learn from feature behavior across dimensions or timeframes. This informative workshop applies hierarchical models to a variety of mortgage and structured finance use cases.

Quality Control with Anomaly Detection (Part I)

Outliers and anomalies refer to various types of occurrences in a time series: a spike in value, a shift in level or volatility, or a change in seasonal pattern are common examples. RiskSpan Co-Founder & CIO Suhrud Dagli is joined by Martin Kindler, a market risk practitioner who has spent decades dealing with outliers.

Quality Control with Anomaly Detection (Part 2)

Suhrud Dagli presents Part 2 of this workshop, which dives into mortgage loan QC and introduces coding examples and approaches for avoiding false negatives using open-source Python algorithms from the Anomaly Detection Toolkit (ADTK).
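For a flavor of the approach, here is a minimal ADTK sketch; the input file, series, and detector settings are illustrative assumptions, not the workshop’s actual examples.

```python
import pandas as pd
from adtk.data import validate_series
from adtk.detector import LevelShiftAD

# Hypothetical file of monthly delinquency rates indexed by date
raw = pd.read_csv("dq_rates.csv", index_col=0, parse_dates=True).squeeze("columns")
series = validate_series(raw)

detector = LevelShiftAD(c=6.0, side="both", window=5)
anomalies = detector.fit_detect(series)   # boolean Series flagging level shifts
```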

Applying Few-Shot Learning Techniques to Mortgage Data

Few-shot and one-shot learning models continue to gain traction in a growing number of industries – particularly those in which large training and testing samples are hard to come by. But what about mortgages? Is there a place for few-shot learning where datasets are seemingly so robust and plentiful? 



Mortgage DQs by MSA: Non-Agency Performance Chart of the Month

This month we take a closer look at geographical differences in loan performance in the non-agency space. The chart below looks at the 60+ DPD rate for the five best and five worst performing MSAs (and the overall average). A couple of things to note:

  • The pandemic seems to have simply amplified performance differences that were already apparent pre-COVID. The worst-performing MSAs were showing mostly above-average delinquency rates before last year’s disruption. 
  • Florida was especially hard-hit. Three of the five worst-performing MSAs are in Florida. Not surprisingly, these MSAs rely heavily on the tourism industry. 
  • New York jumped from about average to one of the worst-performing MSAs in the wake of the pandemic. This is not surprising considering how severely the city bore the pandemic’s brunt. 
  • Tech hubs show strong performance. All our best performers are strong in the tech industry (Austin’s the new Bay Area, right?).


