Blog Archives

RiskSpan VQI: Current Underwriting Standards Q1 2021

[Chart: RiskSpan Vintage Quality Index historical trend and risk-layer calculation, March 2021]

RiskSpan’s Vintage Quality Index estimates the relative “tightness” of credit standards by computing and aggregating the percentage of Agency originations each month with one or more “risk factors” (low-FICO, high DTI, high LTV, cash-out refi, investment properties, etc.). Months with relatively few originations characterized by these risk factors are associated with lower VQI ratings. As the historical chart above shows, the index maxed out (i.e., had an unusually high number of loans with risk factors) leading up to the 2008 crisis.
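For readers who want the mechanics, here is a minimal sketch of how an index like this could be computed from loan-level issuance data. The column names and the three illustrative risk layers are hypothetical, not RiskSpan’s production schema, and the January 2003 baseline of 100 follows the footnote in the Q2 2020 post below:

```python
import pandas as pd

# Hypothetical loan-level issuance data; columns are illustrative.
loans = pd.DataFrame({
    "issue_month": ["2003-01", "2003-01", "2021-03", "2021-03"],
    "fico":        [640, 720, 700, 755],
    "ltv":         [85, 75, 95, 60],
    "dti":         [48, 30, 44, 20],
})

# Flag each risk layer per loan (three of the nine layers, for brevity).
loans["risk_layers"] = (
    (loans["fico"] < 660).astype(int)
    + (loans["ltv"] > 80).astype(int)
    + (loans["dti"] > 45).astype(int)
)

# Average number of risk layers per loan in each issuance month,
# indexed so that the January 2003 baseline equals 100.
monthly = loans.groupby("issue_month")["risk_layers"].mean()
vqi = 100 * monthly / monthly.loc["2003-01"]
print(vqi)
```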

Vintage Quality Index Stability Masks Purchase Credit Contraction

The first quarter of 2021 provides a stark example of why it is important to consider the individual components of RiskSpan’s Vintage Quality Index and not just the overall value. 

The Index overall dropped by just 0.37 points to 76.68 in the first quarter of 2021. On the surface, this seems to suggest a minimal change to credit availability and credit quality over the period. But the Index’s net stability masks a significant change in one key metric, offset by more modest counterbalancing changes in the remaining eight: the percentage of high-LTV mortgages fell to 16.7% during the first quarter, down from 21% at the end of 2020.

While this continues a trend of falling high-LTV shares (down 8.7% since Q1 2020 and almost 12% since Q1 2019), it coincides with a steady increase in house prices. From December 2020 to February 2021, the monthly FHFA House Price Index® (US, Purchase Only, Seasonally Adjusted) rose 1.9%. More striking is the year-over-year change: from February 2020 to February 2021, the index rose 11.1%. Taken together, the roughly 10% increase in home prices and the roughly 10% reduction in the share of high-LTV loans paint a sobering picture for marginal borrowers seeking to purchase a home.

Some of the reduction in high-LTV share is obviously attributable to the growing percentage of refinance activity (including cash-out refinancing, which counterbalances the effect the falling high-LTV rate has on the index). But these refis do not affect the purchase-only HPI. As a result, even though the overall Index did not change materially, higher required down payments (owing to higher home prices) combined with fewer high-LTV loans reflect a credit box that effectively shrank in Q1.

 

[Charts: VQI risk-layer historical trend and March 2021 risk-layer shares: FICO below 660, LTV above 80, adjustable-rate, loans with subordinate financing, cash-out refinance, and one-borrower loans]

Analytical and Data Assumptions

Population assumptions:

  • Monthly data for Fannie Mae and Freddie Mac.

  • Loans originated more than three months prior to issuance are excluded because the index is meant to reflect current market conditions.

  • Loans likely to have been originated through the HARP program, as identified by LTV, MI coverage percentage, and loan purpose are also excluded. These loans do not represent credit availability in the market as they likely would not have been originated today but for the existence of HARP.                                                                                               

Data assumptions:

  • Freddie Mac data goes back to 12/2005. Fannie Mae only back to 12/2014.

  • Certain fields for Freddie Mac data were missing prior to 6/2008.   

The GSE historical loan performance data released in support of GSE risk transfer activities was used to help back-fill missing data.

An outline of our approach to data imputation can be found in our VQI Blog Post from October 28, 2015.                                                

 


Cash-out Refis, Investment Properties Contribute to Uptick in Agency Mortgage Risk Profile

RiskSpan’s Vintage Quality Index is a monthly measure of the relative risk profile of Agency mortgages. Higher VQI levels are associated with mortgage vintages containing higher-than-average percentages of loans with one or more “risk layers.”

These risk layers, summarized below, reflect the percentage of loans with low FICO scores (below 660), high loan-to-value ratios (above 80%), high debt-to-income ratios (above 45%), adjustable rate features, subordinate financing, cash-out refis, investment properties, multi-unit properties, and loans with only one borrower.
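To make the definition concrete, the sketch below counts these nine layers for a single loan record. The field names and occupancy/purpose codes are invented for illustration; only the thresholds (FICO below 660, LTV above 80%, DTI above 45%) come from the text:

```python
# Hypothetical loan record; field names and codes are illustrative.
def count_risk_layers(loan: dict) -> int:
    """Count how many of the nine VQI risk layers a loan carries."""
    flags = [
        loan["fico"] < 660,                    # low FICO
        loan["ltv"] > 80,                      # high loan-to-value
        loan["dti"] > 45,                      # high debt-to-income
        loan["adjustable_rate"],               # ARM feature
        loan["subordinate_financing"],         # second lien present
        loan["purpose"] == "cash_out_refi",    # cash-out refinance
        loan["occupancy"] == "investor",       # investment property
        loan["units"] > 1,                     # multi-unit property
        loan["num_borrowers"] == 1,            # single borrower
    ]
    return sum(flags)

example = {
    "fico": 650, "ltv": 95, "dti": 30, "adjustable_rate": False,
    "subordinate_financing": False, "purpose": "purchase",
    "occupancy": "owner", "units": 1, "num_borrowers": 1,
}
print(count_risk_layers(example))  # 3: low FICO, high LTV, one borrower
```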

The RiskSpan VQI rose 4.2 points at the end of 2020, reflecting a modest increase in the risk profile of loans originated during the fourth quarter relative to the early stages of the pandemic.

The first rise in the index since February was driven by modest increases across several risk layers. These included cash-out refinances (up 2.5% to a 20.2% share in December), single borrower loans (up 1.8% to 52.0%) and investor loans (up 1.4% to 6.0%). Still, the December VQI sits more than 13 points below its local high in February 2020, and more than 28 points below a peak seen in January 2019.

While the share of cash-out refinances has risen some from these lows, the risk layers that have driven most of the downward trend in the overall VQI – the percentages of loans with low FICO scores and with high LTV and DTI ratios – remain relatively low. These layers have been trending downward for several years, reflecting a tighter credit box, and the pandemic has only exacerbated that tightening.

Population assumptions:

  • Monthly data for Fannie Mae and Freddie Mac.
  • Loans originated more than three months prior to issuance are excluded because the index is meant to reflect current market conditions.
  • Loans likely to have been originated through the HARP program, as identified by LTV, MI coverage percentage, and loan purpose, are also excluded. These loans do not represent credit availability in the market as they likely would not have been originated today but for the existence of HARP.

Data assumptions:

  • Freddie Mac data goes back to 12/2005. Fannie Mae only back to 12/2014.
  • Certain fields for Freddie Mac data were missing prior to 6/2008.
  • The GSE historical loan performance data released in support of GSE risk transfer activities was used to help back-fill missing data.

This analysis is developed using RiskSpan’s Edge Platform. To learn more or see a free, no-obligation demo of Edge’s unique data and modeling capabilities, please contact us.


RiskSpan VQI: Current Underwriting Standards Q3 2020

[Charts: September 2020 Vintage Quality Index and RiskSpan VQI historical trend]

RiskSpan’s Vintage Quality Index, which had declined sharply in the first half of the year, leveled off somewhat in the third quarter, falling just 2.8 points between June and September, in contrast to its 12-point drop in Q2.

This change, which reflects a relative slowdown in the tightening of underwriting standards, suggests something of a return to stability in the Agency origination market.

Driven by a drop in cash-out refinances (down 2.3% in the quarter), the VQI’s gradual decline left the standard credit-related risk attributes (FICO, LTV, and DTI) largely unchanged.

The share of high-LTV loans (loans with loan-to-value ratios over 80%), which fell 1.3% in Q3, has fallen dramatically over the last year: 11.7% in total. More than half of this drop (6.1%) occurred before the start of the COVID-19 crisis. This suggests that, even though the Q3 VQI reflects tightening underwriting standards, the stability of the credit-related components, coupled with huge volumes from the GSEs, reflects a measure of stability in credit availability.

[Charts: risk-layer historical trend and September 2020 shares of issued loans by count: FICO < 660, LTV > 80, DTI > 45, adjustable-rate, loans with subordinate financing, cash-out refinance, loan occupancy, multi-unit, and one-borrower loans]

Analytical and Data Assumptions

Population assumptions:

  • Monthly data for Fannie Mae and Freddie Mac.

  • Loans originated more than three months prior to issuance are excluded because the index is meant to reflect current market conditions.

  • Loans likely to have been originated through the HARP program, as identified by LTV, MI coverage percentage, and loan purpose are also excluded. These loans do not represent credit availability in the market as they likely would not have been originated today but for the existence of HARP.                                                                                                                          

Data assumptions:

  • Freddie Mac data goes back to 12/2005. Fannie Mae only back to 12/2014.

  • Certain fields for Freddie Mac data were missing prior to 6/2008.   

The GSE historical loan performance data released in support of GSE risk transfer activities was used to help back-fill missing data.

An outline of our approach to data imputation can be found in our VQI Blog Post from October 28, 2015.                                                


RiskSpan Vintage Quality Index (VQI): Q2 2020

The RiskSpan Vintage Quality Index (“VQI”) is a monthly index designed to quantify the underwriting environment of a monthly vintage of mortgage originations and help credit modelers control for prevailing underwriting conditions at various times. Published quarterly by RiskSpan, the VQI generally trends slowly, with interesting monthly changes found primarily in the individual risk layers. (Assumptions used to construct the VQI can be found at the end of this post.) The VQI has reacted dramatically to the economic tumult caused by COVID-19, however, and in this post we explore how the VQI’s reaction to the current crisis compares to the start of the Great Recession. We examine the periods leading up to the start of each crisis and dive deep into the differences between individual risk layers.

Reacting to a Crisis

In contrast with its typically more gradual movements, the VQI’s reaction to a crisis is often swift. Because the VQI captures the average riskiness of loans issued in a given month, crises that lower lender (and MBS investor) confidence can quickly drive the VQI down as lending standards are tightened. For this comparison, we will define the start of the COVID-19 crisis as February 2020 (the end of the most recent economic expansion, according to the National Bureau of Economic Research), and the start of the Great Recession as December 2007 (the first official month of that recession). As you might expect, the VQI reacted by moving sharply down immediately after the start of each crisis.[1]

[Charts: VQI in the months around the start of the Great Recession and the COVID-19 crisis]

Though the reaction appears similar, with each four-month period shedding roughly 15% of the index, the charts show two key differences. The first is the absolute level of the VQI at the start of each crisis. The vertical axes on the graphs above span the same spread (so the slopes of the changes are directly comparable), but the range is shifted by a full 40 points. The VQI maxed out at 139.0 in December 2007, while at the start of the COVID-19 crisis it stood at just 90.4.

A second difference is the general trend of the VQI in the months leading up to each crisis. The VQI was trending up in the 18 months leading up to the Great Recession, signaling increasing riskiness in the loans being originated and issued. (As we discuss later, this “last push” in the second half of 2007 was driven by an increase in loans with high loan-to-value ratios.) Conversely, 2019 saw the VQI trend downward, signaling a tightening of lending standards.

Different Layers of Risk

Because the VQI simply indexes the average number of risk layers associated with the loans issued by the Agencies in a given month, a closer look at the individual risk layers provides insights that can be masked when analyzing the VQI as a whole.
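Stated as a formula, one consistent reading of that construction (using the January 2003 baseline of 100 noted in the footnote at the end of this post) is our own restatement, not RiskSpan’s published specification:

$$ \mathrm{VQI}_t = 100 \times \frac{\bar{L}_t}{\bar{L}_{\text{Jan 2003}}}, \qquad \bar{L}_t = \frac{1}{N_t} \sum_{i=1}^{N_t} \ell_i $$

where $\ell_i$ is the number of risk layers on loan $i$ and $N_t$ is the number of Agency loans issued in month $t$.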

The risk layer that most clearly depicts the difference between the two crises is the share of loans with low FICO scores (below 660).

[Chart: share of issued loans with FICO below 660, around each crisis]

The absolute difference is striking: 27.9% of loans issued in December 2007 had a low FICO score, compared with just 7.1% of loans in February 2020. That 20.8-percentage-point difference captures the contrasting underwriting philosophies of the two periods and sums up the differing quality of the two loan cohorts.

FICO trends before each crisis are also clearly different. In the 12 months leading up to the Great Recession, the share of low-FICO loans rose from 24.4% to 27.9% (+3.5%). In contrast, the 12 months before the COVID-19 crisis saw the share of low-FICO loans fall from 11.5% to 7.2% (-4.3%).

The low-FICO risk layer’s reaction to each crisis also differs dramatically. Falling from 27.9% to 15.4% in four months (on its way to 3.3% in May 2009), the share of low-FICO loans cratered following the start of the recession. In contrast, the risk layer has been largely unaffected by the current crisis, simply continuing its downward trend mostly uninterrupted.

Three other large drivers of the difference between the VQI in December 2007 and in February 2020 are the share of cash-out refinances, the share of loans for second homes, and the share of loans with debt-to-income (DTI) ratios above 45%. What makes these risk layers different from FICO is their reaction to the crisis itself. While their absolute levels in the months leading up to the Great Recession were well above those seen at the beginning of 2020 (as with low FICO), none of these three risk layers appears to react to either crisis; each continues along the same general trajectory it was on in the months leading up to the crisis. Cash-out refinances, which follow a seasonal cycle, were mostly unimpacted by the start of either crisis, holding a steady spread between the two time periods:

[Chart: cash-out refinance share of issued loans, around each crisis]

Loans for second homes were already becoming rarer in the runup to December 2007 (the only risk layer to show a reaction to the tumult of the fall of 2007) and mostly held in the low teens immediately following the start of the recession:

[Chart: second-home share of issued loans, around each crisis]

Finally, loans with high DTIs (over 45%) have simply followed their slow trend down since the start of the COVID-19 crisis, while they actually became slightly more common following the start of the Great Recession:

[Chart: share of issued loans with DTI above 45%, around each crisis]

The outlier, both pre- and post-crisis, is the high loan-to-value risk layer. For most of the 24 months leading up to the start of the Great Recession, the share of loans with LTVs above 80% was well below the level during the same period leading up to the COVID-19 crisis. The pre-Great Recession maximum of 33.2% is below the 24-month average of 33.3% at the start of the COVID-19 crisis. The share of high-LTV loans also reacted to the crisis in 2008, falling sharply after the start of the recession. In contrast, the current downward trend in high-LTV loans started well before the COVID-19 crisis and was seemingly unimpacted by its start.

[Chart: share of issued loans with LTV above 80%, around each crisis]

Though the current downward trend is likely due to increased refinance activity as mortgage rates continue to crater, the chart seems upside down relative to what you might have predicted.

The COVID-19 Crisis is Different

What can the VQI tell us about the similarities and differences between December 2007 and February 2020? When you look closely, quite a bit.

  1. The loans experiencing the crisis in 2020 are less risky.

By almost all measures, the loans that entered the downturn beginning in December 2007 were riskier than the loans outstanding in February 2020. The 2020 cohort has fewer low-FICO loans, fewer loans with high debt-to-income ratios, fewer loans for second homes, and fewer cash-out refinances. Trends aside, the absolute level of these risky characteristics—characteristics that are classically considered in mortgage credit and loss models—is significantly lower. While that is no guarantee the loans will fare better through this current crisis and recovery, we can reasonably expect better outcomes this time around.

  2. The 2020 crisis did not immediately change underwriting and lending.

One of the more surprising VQI trends is the non-reaction of many of the risk layers to the start of the COVID-19 crisis. FICO, LTV, and DTI all seem to be continuing a downward trend that began well before the first coronavirus diagnosis. The VQI is merely continuing a trend started back in January 2019. (The current “drop” has brought the VQI back to the trendline.) Because the crisis was not born of the mortgage sector and has not yet stifled demand for mortgage-backed assets, we have yet to see any dramatic shifts in lending practices (a stark contrast with 2007-2008). Dramatic tightening of lending standards can lead to reduced home buying demand, which can put downward pressure on home prices. The already-tight lending standards in place before the COVID-19 crisis, coupled with the apparent non-reaction by lenders, may help to stabilize the housing market.

The VQI was not designed to gauge the unknowns of a public health crisis. It does not directly address the lessons learned from the Great Recession, including the value of modification and forbearance in maintaining stability in the market. It does not account for the role of government and the willingness of policy makers to intervene in the economy (and in the housing markets specifically). Despite not being a crystal ball, the VQI nevertheless remains a valuable tool for credit modelers seeking to view mortgage originations from different times in their proper perspective.

—————

Analytical and Data Assumptions

Population assumptions:

  • Issuance Data for Fannie Mae and Freddie Mac.
  • Loans originated more than three months prior to issuance are excluded because the index is meant to reflect current market conditions.
  • Loans likely to have been originated through the HARP program, as identified by LTV, MI coverage percentage, and loan purpose are also excluded. These loans do not represent credit availability in the market, as they likely would not have been originated today if not for the existence of HARP.

Data Assumptions:

  • Freddie Mac data goes back to December 2005. Fannie Mae data only goes back to December 2014.
  • Certain Freddie Mac data fields were missing prior to June 2008.

The GSE historical loan performance data released in support of GSE risk transfer activities was used to help back-fill missing data.

 

 

[1] Note that the VQI’s baseline of 100 reflects underwriting standards as of January 2003.

 


Is Free Public Data Worth the Cost?

No such thing as a free lunch.

The world is full of free (and semi-free) datasets ripe for the picking. If it’s not going to cost you anything, why not supercharge your data and achieve clarity where once there was only darkness?

But is it really not going to cost you anything? What is the total cost of ownership for a public dataset, and what does it take to distill truly valuable insights from publicly available data? Setting aside the reliability of the public source (a topic for another blog post), free data is anything but free. Let us discuss both the power and the cost of working with public data.

To illustrate the point, we borrow from a classic RiskSpan example: anticipating losses to a portfolio of mortgage loans due to a hurricane—a salient example as we are in the early days of the 2020 hurricane season (and the National Oceanic and Atmospheric Administration (NOAA) predicts a busy one). In this example, you own a portfolio of loans and would like to understand the possible impacts to that portfolio (in terms of delinquencies, defaults, and losses) of a recent hurricane. You know this will likely require an external data source because you do not work for NOAA, your firm is new to owning loans in coastal areas, and you currently have no internal data for loans impacted by hurricanes.

Know the Data.

The first step in using external data is understanding your own data. This may seem like a simple task, but data, its source, its lineage, and its nuanced meaning can be difficult to communicate inside an organization. Unless you work with a dataset regularly, you should approach your own data as if it were provided by an external source. The goal is a full understanding of the data, the data’s meaning, and the data’s limitations, all of which should have a direct impact on the types of analysis you attempt.

Understanding the structure of your data and the limitations it puts on your analysis involves questions like:

  • What objects does your data track?
  • Do you have time series records for these objects?
  • Do you only have the most recent record? The most recent 12 records?
  • Do you have one record that tries to capture life-to-date information?

Understanding the meaning of each attribute captured in your data involves questions like:

  • What attributes are we tracking?
  • Which attributes are updated (monthly or quarterly) and which remain static?
  • What are the nuances in our categorical variables? How exactly did we assign the zero-balance code?
  • Is original balance the loan’s balance at mortgage origination, or the balance when we purchased the loan/pool?
  • Do our loss numbers include forgone interest?

These same types of questions also apply to understanding external data sources, but the answers are not always as readily available. Depending on the quality and availability of the documentation for a public dataset, this exercise may be as simple as just reading the data dictionary, or as labor intensive as generating analytics for individual attributes, such as mean, standard deviation, mode, or even histograms, to attempt to derive an attribute’s meaning directly from the delivered data. This is the not-free part of “free” data, and skipping this step can have negative consequences for the quality of analysis you can perform later.
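For instance, a quick profiling pass like the following can surface value ranges, sentinel codes, and distributions that the documentation leaves out. The file and column names here are hypothetical placeholders, not an actual GSE layout:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("public_dataset.csv")

# Numeric attribute: summary statistics plus a histogram expose range,
# skew, and sentinel values (e.g., 999 standing in for "missing").
print(df["original_ltv"].describe())
df["original_ltv"].hist(bins=50)
plt.show()

# Categorical attribute: the value distribution hints at the coding
# scheme (and at how consistently it was applied over time).
print(df["zero_balance_code"].value_counts(dropna=False))
```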

Returning to our example, we require at least two external data sets:  

  1. where and when hurricanes have struck, and
  2. loan performance data for mortgages active in those areas at those times.

The obvious choice for loan performance data is the historical performance datasets from the GSEs (Fannie Mae and Freddie Mac). Providing monthly performance information and loss information for defaulted loans for a huge sample of mortgage loans over a 20-year period, these two datasets are perfect for our analysis. For hurricanes, some manual effort is required to extract date, severity, and location from NOAA maps like these (you could get really fancy and gather zip codes covered in the landfall area—which, by leaving out homes hundreds of miles away from expected landfall, would likely give you a much better view of what happens to loans actually impacted by a hurricane—but we will stick to state-level in this simple example).

Make new data your own.

So you’ve downloaded the historical datasets, you’ve read the data dictionaries cover-to-cover, you’ve studied historical NOAA maps, and you’ve interrogated your own data teams for the meaning of internal loan data. Now what? This is yet another cost of “free” data: after all your effort to understand and ingest the new data, all you have is another dataset. A clean, well-understood, well-documented (you’ve thoroughly documented it, haven’t you?) dataset, but a dataset nonetheless. Getting the insights you seek requires a separate effort to merge the old with the new. Let us look at a simplified flow for our hurricane example (a code sketch follows the list):

  • Subset the GSE data for active loans in hurricane-related states in the month prior to landfall. Extract information for these loans for 12 months after landfall.
  • Bucket the historical loans by the characteristics you use to bucket your own loans (LTV, FICO, delinquency status before landfall, etc.).
  • Derive delinquency and loss information for the buckets for the 12 months after the hurricane.
  • Apply the observed delinquency and loss information to your loan portfolio (bucketed using the same scheme you used for the historical loans).
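Under stated assumptions (invented file layouts and column names, a single FICO-based bucketing, and 90+ days past due as the outcome of interest), a minimal sketch of these four steps might look like this:

```python
import pandas as pd

# Minimal sketch of the four steps above. All file names, column names,
# and bucket boundaries are hypothetical, not the actual GSE layouts.
perf = pd.read_csv("gse_performance.csv", parse_dates=["activity_month"])
portfolio = pd.read_csv("my_portfolio.csv")

landfall = pd.Timestamp("2017-09-01")  # example storm
hit_states = ["FL", "TX"]

# 1. Performance records in affected states, from the month before
#    landfall through 12 months after (a fuller version would first
#    pin down the set of loans active in the month before landfall).
window = perf[
    perf["state"].isin(hit_states)
    & (perf["activity_month"] >= landfall - pd.DateOffset(months=1))
    & (perf["activity_month"] <= landfall + pd.DateOffset(months=12))
]

# 2. Bucket historical and portfolio loans with the same scheme
#    (a single FICO dimension here; add LTV, delinquency status, etc.).
def fico_bucket(df: pd.DataFrame) -> pd.Series:
    return pd.cut(df["fico"], [0, 660, 720, 850],
                  labels=["<660", "660-720", ">720"])

window = window.assign(bucket=fico_bucket(window))
portfolio = portfolio.assign(bucket=fico_bucket(portfolio))

# 3. Observed serious-delinquency (90+ days) rate per bucket.
dq_rates = window.groupby("bucket", observed=True)["days_delinquent"].apply(
    lambda s: (s >= 90).mean()
)

# 4. Apply the historical rates to current portfolio balances.
portfolio["expected_dq_upb"] = (
    portfolio["current_upb"] * portfolio["bucket"].map(dq_rates)
)
print(portfolio[["loan_id", "bucket", "expected_dq_upb"]])
```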

And there you have it—not a model, but a grounded expectation of loan performance following a hurricane. You have stepped out of the darkness and into the data-driven light. And all using free (or “free”) data!

Hyperbole aside, nothing about our example analysis is easy, but it plainly illustrates the power and cost of publicly available data. The power is obvious in our example: without the external data, we have no basis for generating an expectation of losses after a hurricane. While we should be wary of the impacts of factors not captured by our datasets (like the amount and effectiveness of government intervention after each storm, which varies widely), the historical precedent we find by averaging many storms can form the basis for a robust and defensible expectation. Even if your firm has had experience with loans in hurricane-impacted areas, expanding the sample size through this exercise bolsters confidence in the outcomes. Generally speaking, the use of public data can provide grounded expectations where there had been only anecdotes.

But this power does come at a price—a price that should be appreciated and factored into the decision whether to use external data in the first place. What is worse than not knowing what to expect after a hurricane? Having an expectation based on bad or misunderstood data. Failing to account for the effort required to ingest and use free data can lead to bad analysis and the temptation to cut corners. The effort required in our example is significant: the GSE data is huge, complicated, and will melt your laptop’s RAM if you are not careful. Turning NOAA PDF maps into usable data is not a trivial task, especially if you want to go deeper than the state level. Understanding your own data can be a challenge. Applying an appropriate bucketing to the loans can make or break the analysis. Not all public datasets present these same challenges, but all public datasets present costs. There simply is no such thing as a free lunch. The returns on free data frequently justify these costs. But they should be understood before unwittingly incurring them.

