
How Do You Rate on Fannie Mae’s New Social Index?

Quick take-aways

  • HMDA data contains nearly every factor needed to replicate Fannie Mae’s Single Family Social Index. We use this data to explore how the methodology would look if the Fannie Mae Social Index were applied to other market participants.
  • The Agencies and Ginnie Mae are not the only game in town when it comes to socially responsible lending. Non-agency loans would also perform reasonably well under Fannie Mae’s proposed Social Index.
  • Not surprisingly, Ginnie Mae outperforms all other “purchaser types” under the framework, buoyed by its focus on low-income borrowers and underserved communities. The gap between Ginnie and the rest of the market can be expected to expand in low-refi environments.
  • With a few refinements to account for socially responsible lending beyond low-income borrowers, Fannie Mae’s framework can work as a universally applicable social measure across the industry.

Fannie Mae’s new “Single Family Social Index”

Last week, Fannie Mae released a proposed methodology for its Single Family Social Index. The index is designed to provide “socially conscious investors” a means of “allocat[ing] capital in support of affordable housing and to provide access to credit for underserved individuals.”

The underlying methodology is simple enough. Each pool of mortgages receives a score based on how many of its loans meet one or more specified “social criteria” across three dimensions: borrower income, borrower characteristics and property location/type. Fannie Mae succinctly illustrates the defined criteria and framework in the following overview deck slide.


Figure 1: Source: Designing for Impact — A Proposed Methodology for Single-Family Social Disclosure


Each of the criteria is binary (yes/no), which facilitates the scoring. Individual loans are simply rated based on the number of boxes they check. Pools are measured in two ways: 1) a “Social Criteria Share,” which identifies the percentage of loans that meet any of the criteria, and 2) a “Social Density Score,” which assigns a “Social Score” of 0 through 3 to each individual loan based on how many of the three dimensions (borrower income, borrower characteristics, and property characteristics) it covers and then averages that score across all the loans in the pool.
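To make the scoring concrete, the following is a minimal sketch of both pool-level measures in Python. It assumes each loan carries a binary flag for each of the three dimensions; the field names are illustrative, not Fannie Mae’s actual disclosure fields.

```python
def social_scores(loans):
    """loans: list of dicts with boolean keys 'income', 'borrower', 'property'."""
    n = len(loans)
    # Social Score per loan: how many of the three dimensions it covers (0-3)
    loan_scores = [sum((l["income"], l["borrower"], l["property"])) for l in loans]
    # Social Criteria Share: share of loans meeting at least one criterion
    criteria_share = sum(s > 0 for s in loan_scores) / n
    # Social Density Score: average Social Score across the pool
    density_score = sum(loan_scores) / n
    return criteria_share, density_score

pool = [
    {"income": True,  "borrower": False, "property": True},   # score 2
    {"income": False, "borrower": False, "property": False},  # score 0
    {"income": True,  "borrower": True,  "property": True},   # score 3
]
share, density = social_scores(pool)
print(f"Social Criteria Share: {share:.0%}, Social Density Score: {density:.2f}")
# -> Social Criteria Share: 67%, Social Density Score: 1.67
```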

If other issuers adopt this methodology, what would it look like?

The figure below is one of many charts and tables provided by Fannie Mae that illustrate how the Index works. This figure shows the share of acquisitions meeting one or more of the Social Index criteria (i.e., the overall “Social Criteria Share”). We have drawn a box approximately around the 2020 vintage,[1] which appears to have a Social Criteria Share of about 52% by loan count. We will refer back to this value later as we seek to triangulate a Social Criteria Share for other market participants.


Figure 2: Source: Designing for Impact — A Proposed Methodology for Single-Family Social Disclosure


We can get a sense of other issuers’ Social Criteria Share by looking at HMDA data. This dataset provides everything we need to re-create the Index at a high level, with the exception of a flag for first-time homebuyers. The process involves some data manipulation, as several Index criteria require us to connect to two census-tract-level data sources published by FHFA.

HMDA allows us to break down the loan population by purchaser type, which gives us an idea of each loan’s ultimate destination—Fannie, Freddie, Ginnie, etc. The purchaser type does not capture this for every loan, however, because originators are only obligated to report loans that are closed and sold during the same calendar year.

The two tables below reflect two different approaches to approximating the population of Fannie, Freddie, and Ginnie loans. The left-hand table compares the 2020 origination loan count based on HMDA’s Purchaser Type field with loan counts based on MBS disclosure data pulled from RiskSpan’s Edge Platform.

The right-hand table enhances this definition by first re-categorizing as Ginnie Mae all FHA/VA/USDA loans with non-agency purchaser types. It also looks at the Automated Underwriting System field and re-maps all owner-occupied loans previously classified as “Other or NA” to Fannie (DU AUS) or Freddie (LP/LPA AUS).


[Table: 2020 loan counts by purchaser type, HMDA vs. MBS disclosure data, unadjusted (left) and adjusted (right)]



The adjusted purchaser type approach used in the right-hand table reallocates a considerable number of “Other or NA” loans from the left-hand table. The approach clearly overshoots the Fannie Mae population, as some loans underwritten using Fannie’s automated underwriting system likely wind up at Freddie or in other segments of the market. This limitation notwithstanding, we believe this approximation affords a more accurate view of the market landscape than does the unadjusted purchaser type approach. We consequently rely primarily on the adjusted approach in this analysis.
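For illustration, the re-categorization described above can be sketched in pandas roughly as follows. The column names and code values are stand-ins for the actual HMDA fields, not a faithful rendering of our production logic:

```python
import pandas as pd

def adjust_purchaser(df: pd.DataFrame) -> pd.DataFrame:
    """Approximate each loan's ultimate destination from HMDA-style fields."""
    df = df.copy()
    df["adj_purchaser"] = df["purchaser_type"]

    # Re-categorize FHA/VA/USDA loans with non-agency purchaser types as Ginnie
    govt = df["loan_type"].isin(["FHA", "VA", "USDA"])
    non_agency = ~df["purchaser_type"].isin(["Fannie", "Freddie", "Ginnie"])
    df.loc[govt & non_agency, "adj_purchaser"] = "Ginnie"

    # Re-map owner-occupied conventional "Other or NA" loans by AUS:
    # DU -> Fannie, LP/LPA -> Freddie
    unsold = df["purchaser_type"].eq("Other or NA") & df["owner_occupied"] & ~govt
    df.loc[unsold & df["aus"].eq("DU"), "adj_purchaser"] = "Fannie"
    df.loc[unsold & df["aus"].isin(["LP", "LPA"]), "adj_purchaser"] = "Freddie"
    return df
```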

Given the shortcomings in aligning the exact population, the idea here is not to get an exact calculation of the Social Index metrics via HMDA, but to use HMDA to give us a rough indication of how the landscape would look if other issuers adopted Fannie’s methodology. We expect this to provide a rough rank-order understanding of where the richest pools of ‘Social’ loans (according to Fannie’s methodology) ultimately wind up. Because the ultimate success of a social scoring methodology can truly be measured only to the extent it is adopted by other issuers, having a universally useful framework is crucial.

The table below estimates the Social Criteria Share by adjusted purchaser using seven of Fannie Mae’s eight social index criteria.[2] Not surprisingly, Ginnie, Fannie, and Freddie boast the highest overall shares. It is encouraging to note, however, that other purchaser types also originate significant percentages of socially responsible loans. This suggests that Fannie’s methodology could indeed be applied more universally. The table breaks out each factor separately; dissecting the dynamics could fill an entire blog post of its own, and they are worth a closer look.[3]


[Table: Estimated Social Criteria Share by adjusted purchaser type and social criterion]


Ginnie Mae’s strong performance on the Index comes as no surprise. Ginnie pools, after all, consist primarily of FHA loans, which skew toward the lower end of the income spectrum, first-time borrowers, and traditionally underserved communities. Indeed, more than 56 percent of Ginnie Mae loans tick at least one box on the Index. And this does not include first-time homebuyers, which would likely push that percentage even higher.

Income’s Outsized Impact

Household income contributes directly or indirectly to most components of Fannie’s Index. Beyond the “Low-income” criterion (borrowers below 80 percent of adjusted median income), nearly every other factor requires income below 120 percent of AMI. Measuring income is tricky, especially outside of the Agency/Ginnie space. The non-Agency segment serves many self-employed borrowers, borrowers who qualify based on asset (rather than income) levels, and foreign national borrowers. Nailing down precise income has historically proven challenging with these groups.

Given these dynamics, one could reasonably posit that the 18 percent of PLS classified as “low-income” is actually inflated by self-employed or wealthier borrowers whose mortgage applications do not necessarily reflect all of their income. Further refinements may be needed to fairly apply the Index framework to this and other market segments that pursue social goals beyond expanding credit opportunities for low-income borrowers. These refinements could take the form of additional guidance on how to calculate income (or alternatives to the income metric when it is not available) or of certain exclusions from the framework altogether (foreign national borrowers, for example, although these may be excluded already based on the screen for second homes).

Positive effects of a purchase market

The Social Criteria Share is positively correlated with purchase loans as a percentage of total origination volume (even before accounting for the FTHB factor). This relationship is apparent in Fannie Mae’s time series chart near the top of this post. Shares clearly drop during refi waves.

Our analysis focuses on 2020 only. We made this choice because of HMDA reporting lags and the relative simplicity of dealing with a single year of data. The table below breaks down the HMDA analysis (referenced earlier) by loan purpose to give us a sense for what our current low-refi environment could look like. (Rate/term refis are grouped together with cash-out refis.) As the table below indicates, Ginnie Mae’s SCS for refi loans is about the same as it is for GSE refi loans — it’s really on purchase loans where Ginnie shines. This implies that Ginnie’s SCS will improve even further in a purchase-dominated environment.


[Table: Estimated Social Criteria Share by adjusted purchaser type and loan purpose]


Accounting for First-time Homebuyers

As described above, our methodology for estimating the Social Criteria Share omits loans to first-time homebuyers (because the HMDA data does not capture this flag). This likely accounts for the roughly 6 percentage point difference between our estimate of Fannie’s overall Social Criteria Share for 2020 (approximately 46 percent) and Fannie Mae’s own calculation (approximately 52 percent).

To back into the impact of the FTHB factor, we can pull in data about the share of FTHBs from RiskSpan’s Edge platform. The purchase vs. refi table above tells us the SCS without the FTHB factor for purchase loans. Using MBS data sources, we can obtain the share of 2020 originations that were FTHBs. If we assume that FTHB loans look the same as purchase loans overall in terms of how many other Social Index boxes they check, then we can back into the overall SCS incorporating all factors in Fannie’s methodology.

Applying this approach to Ginnie Mae, we conclude that, because 29 percent of Ginnie’s purchase loans (one minus 71 percent) do not tick any of the Index’s boxes, 29 percent of FTHB loans (which account for 33 percent of Ginnie’s overall population) also do not tick any Index boxes. Taking 29 percent of this 33 percent results in an additional 9.6 percent that should be tacked on to Ginnie Mae’s pre-FTHB share, bringing it up to 66 percent.
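Stated as a short calculation (using the figures from the analysis above):

```python
# Backing the FTHB factor into Ginnie Mae's 2020 Social Criteria Share,
# using the figures cited above.
pre_fthb_share = 0.56   # Ginnie loans ticking at least one box, before FTHB
purchase_scs   = 0.71   # SCS among Ginnie purchase loans, before FTHB
fthb_share     = 0.33   # FTHBs as a share of Ginnie's overall population

# Assumption: FTHB loans tick the other boxes at the same rate as purchase
# loans overall, so 1 - 0.71 = 29% of FTHBs are newly captured by the flag.
newly_captured = (1 - purchase_scs) * fthb_share      # 0.29 * 0.33 = ~0.096
post_fthb_share = pre_fthb_share + newly_captured     # ~0.656
print(f"Ginnie SCS including FTHB: {post_fthb_share:.0%}")  # -> 66%
```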


[Table: Estimated Social Criteria Share including the first-time homebuyer factor]


Validating this estimation approach is the fact that it increases Fannie Mae’s share from 46 percent (pre-FTHB) to 52 percent, which is consistent with the historical graph supplied by Fannie Mae (see Figure 2, above). Our FTHB approach implies that 92 percent of Ginnie Mae purchase loans meet one or more of the Index criteria. One could reasonably contend that Ginnie Mae FTHB loans might be more likely than Ginnie purchase loans overall to satisfy other social criteria (i.e., that 92 percent is a bit rich), in which case the 66 percent share for Ginnie Mae in 2020 might be overstated. Even if we mute this FTHB impact on Ginnie, however, layering FTHB loans on top of a rising purchase-loan environment would likely put today’s Ginnie Mae SCS in the low 80s.




[1] The chart is organized by acquisition month, while our analysis of HMDA looks at 2020 originations, so we have tried to push the box slightly to the right to reflect the 1–3-month lag between origination and acquisition. Additionally, we believe the chart and numbers throughout Fannie’s document reflect only fixed-rate 30-year loans, whereas our analysis includes all loans. We did investigate what our numbers would look like if filtered to fixed-rate 30-year loans; doing so would only increase the SCS slightly across the board.

[2] As noted above, we are unable to discern first-time homebuyer information from the HMDA data.

[3] We can compare the Fannie numbers for each factor to the rates published in its documentation covering 2017 forward. The only metric where we appear meaningfully off is the percentage of loans in minority census tracts. We took this flag from FHFA’s Low-Income Area File for 2020, which defines a minority census tract as having a “…minority population of at least 30 percent and a median income of less than 100 percent of the AMI.” It is not 100% clear that this is what Fannie Mae is using in its definition.


It’s time to move to DaaS — Why it matters for loan and MSR investors

Data as a service, or DaaS, for loans and MSR investors is fast becoming the difference between profitable trades and near misses.

Granularity of data is creating differentiation among investors. To win at investing in loans and mortgage servicing rights requires effectively managing a veritable ocean of loan-level data. Buried within every detailed tape of borrower, property, loan and performance characteristics lies the key to identifying hidden exposures and camouflaged investment opportunities. Understanding these exposures and opportunities is essential to proper bidding during the acquisition process and effective risk management once the portfolio is onboarded.

Investors know this. But knowing that loan data conceals important answers is not enough. Even knowing which specific fields and relationships are most important is not enough. Investors also must be able to get at that data. And because mortgage data is inherently messy, investors often run into trouble extracting the answers they need from it.

For investors, it boils down to two options. They can compel analysts to spend 75 percent of their time wrangling unwieldy data – plugging holes, fixing outliers, making sure everything is mapped right. Or they can just let somebody else worry about all that so they can focus on more analytical matters.

Don’t get left behind — DaaS for loan and MSR investors

It should go without saying that the “let somebody else worry about all that” approach only works if “somebody else” possesses the requisite expertise with mortgage data. Self-proclaimed data experts abound. But handing the process over to an outside data team lacking the right domain experience risks creating more problems than it solves.

Ideally, DaaS for loan and MSR investors consists of a data owner handing off these responsibilities to a third party that can deliver value in ways that go beyond simply maintaining, aggregating, storing and quality controlling loan data. All these functions are critically important. But a truly comprehensive DaaS provider is one whose data expertise is complemented by an ability to help loan and MSR investors understand whether portfolios are well conceived. A comprehensive DaaS provider helps investors ensure that they are not taking on hidden risks (for which they are not being adequately compensated in pricing or servicing fee structure).

True DaaS frees up loan and MSR investors to spend more time on higher-level tasks consistent with their expertise. The more “blocking and tackling” aspects of data management that every institution that owns these assets needs to deal with can be handled in a more scalable and organized way. Cloud-native DaaS platforms are what make this scalability possible.

Scalability — stop reinventing the wheel with each new servicer

One of the most challenging aspects of managing a portfolio of loans or MSRs is the need to manage different types of investor reporting data pipelines from different servicers. What if, instead of having to “reinvent the wheel” to figure out data intake every time a new servicer comes on board, “somebody else” could take care of that for you?

An effective DaaS provider is one that is not only well versed in building and maintaining loan data pipes from servicers to investors but also has already established a library of existing servicer linkages. An ideal provider is one already set up to onboard servicer data directly onto its own DaaS platform. Investors achieve enormous economies of scale by having to integrate with a single platform as opposed to a dozen or more individual servicer integrations. Ultimately, as more investors adopt DaaS, the number of centralized servicer integrations will increase, and greater economies will be realized across the industry.

Connectivity is only half the benefit. The DaaS provider not only intakes, translates, maps, and hosts the loan-level static and dynamic data coming over from servicers. The DaaS provider also takes care of QC, cleaning, and managing it. DaaS providers see more loan data than any one investor or servicer. Consequently, the AI tools an experienced DaaS provider uses to map and clean incoming loan data have had more opportunities to learn. Loan data that has been run through a DaaS provider’s algorithms will almost always be more analytically valuable than the same loan data processed by the investor alone.  

Investors seeking to increase their footprint in the loan and MSR space obviously do not wish to see their data management costs rise in proportion to the size of their portfolios. Outsourcing to a DaaS provider that specializes in mortgages, like RiskSpan, helps investors build their book while keeping data costs contained.

Save time and money – Make better bids

For all these reasons, DaaS is unquestionably the future (and, increasingly, the present) of loan and MSR data management. Investors are finding that a decision to delay DaaS migration comes with very real costs, particularly as data science labor becomes increasingly (and often prohibitively) expensive.

The sooner an investor opts to outsource these functions to a DaaS provider, the sooner that investor will begin to reap the benefits of an optimally cost-effective portfolio structure. One RiskSpan DaaS client reported a 50 percent reduction in data management costs alone.

Investors continuing to make do with in-house data management solutions will quickly find themselves at a distinct bidding disadvantage. DaaS-aided bidders have the advantage of being able to bid more competitively based on their more profitable cost structure. Not only that, but they are able to confidently hone and refine their bids based on having a better, cleaner view of the portfolio itself.

Rethink your mortgage data. Contact RiskSpan to talk about how DaaS can simultaneously boost your profitability and make your life easier.


Senior Home Equity Rises Again to $11.12 Trillion

Senior home equity rises again. Homeowners 62 and older saw their housing wealth grow by an estimated 4.9 percent ($520 billion) during the first quarter of 2022 to a record $11.1 trillion according to the latest quarterly release of the NRMLA/RiskSpan Reverse Mortgage Market Index.

Historical Changes in Aggregate Senior Home Values Q1 2000 - Q1 2022

The NRMLA/RiskSpan Reverse Mortgage Market Index (RMMI) rose to 388.83, another all-time high since the index was first published in 2000. The increase in older homeowners’ wealth was mainly driven by an estimated $563 billion (4.4 percent) increase in home values, offset by a $43 billion (2.1 percent) increase in senior-held mortgage debt.

For a comprehensive commentary, please see NRMLA’s press release.


How RiskSpan Computes the RMMI

To calculate the RMMI, RiskSpan developed an econometric tool to estimate senior housing value, mortgage balances, and equity using data gathered from various public resources. These resources include the American Community Survey (ACS), Federal Reserve Flow of Funds (Z.1), and FHFA housing price indexes (HPI). The RMMI represents the senior equity level at time of measure relative to that of the base quarter in 2000.[1] 

A limitation of the RMMI relates to non-consecutive data, such as census population. We use a smoothing approach to estimate data between the observable periods and continue to look for ways to improve our methodology and find more robust data to improve the precision of the results. Until then, the RMMI and its underlying metrics (values, mortgages, home equity) are best analyzed at a trending macro level rather than at more granular levels, such as MSA.
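As a simplified illustration of the two ideas above (indexing senior equity to the 2000 base quarter and smoothing non-consecutive inputs), consider the following sketch. The numbers are invented, and normalizing the base quarter to 100 is an assumption about the index’s scaling:

```python
import pandas as pd

quarters = pd.period_range("2000Q1", "2000Q4", freq="Q")
data = pd.DataFrame({
    "senior_home_value": [4.0, 4.1, 4.2, 4.3],        # $ trillions (invented)
    "senior_mortgage_debt": [1.0, 1.0, 1.1, 1.1],     # $ trillions (invented)
    "senior_population": [25.0, None, None, 26.0],    # observed non-consecutively
}, index=quarters)

# Smooth inputs that are not observed every quarter
data["senior_population"] = data["senior_population"].interpolate()

# Senior equity, indexed to the base quarter (normalized to 100 here)
equity = data["senior_home_value"] - data["senior_mortgage_debt"]
rmmi = 100 * equity / equity.iloc[0]
print(rmmi.round(2))
```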


[1] There was a change in RMMI methodology in Q3 2015 mainly to calibrate senior homeowner population and senior housing values observed in 2013 American Community Survey (ACS).


Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities

Issuing a security requires a lot of paperwork. Much of this paperwork consists of legal disclosures that inform potential investors about the collateral backing the bonds they are buying. Generating, reviewing, and approving these detailed disclosures is hard and time-consuming – hours and sometimes days. RiskSpan has developed an easy-to-use legal disclosure generator application that reduces the process to minutes.

RiskSpan’s Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities automates the generation of prospectus-supplements, pooling and servicing agreements, and other legal disclosure documents. These documents contain a combination of static and dynamic legal language, data, tables, and images.  

The Disclosure Generator draws from a collection of data files. These files contain collateral-, bond-, and deal-specific information. The Disclosure Generator dynamically converts the contents of these files into legal disclosure language based on predefined rules and templates. In addition to generating interim and final versions of the legal disclosure documents, the application provides a quick and easy way of making and tracking manual edits to the documents. In short, the Disclosure Generator is an all-inclusive, seamless, end-to-end system for creating, editing and tracking changes to legal documents for mortgage and asset-backed securities.   

The Legal Disclosure Generator’s user interface supports:  

  1. Simultaneous uploading of multiple data files.
  2. Instantaneous production of the first (and subsequent) drafts of legal documents, adhering to the associated template(s).
  3. A user-friendly editor allowing manual, user-level language and data changes. Users apply these edits either directly to a specific document or to the underlying data template itself. Template updates carry forward to the language of all subsequently generated disclosures. 
  4. A version control feature that tracks and retains changes from one document version to the next.
  5. An archiving feature allowing access to previously generated documents without the need for the original data files.
  6. Editing access controls based on pre-defined user level privileges.

Overview

RiskSpan’s Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities enables issuers of securitized assets to create legal disclosures efficiently and quickly from raw data files.

The Legal Disclosure Generator is easy and intuitive to use. After setting up a deal in the system, the user selects the underlying collateral- and bond-level data files to create the disclosure document. In addition to the raw data related to the collateral and bonds, these data files also contain relevant waterfall payment rules. The data files can be in any format — Excel, CSV, text, or even custom file extensions. Once the files are uploaded, the first draft of the disclosures can be easily generated in just a few seconds. The system takes the underlying data files and creates a draft of the disclosure document seamlessly and on the fly.  In addition, the Legal Disclosure Generator reads custom scripts related to waterfall models and converts them into waterfall payment rules.
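Conceptually, the generation step merges deal data into predefined legal language templates. The following is a purely illustrative sketch of that pattern, not RiskSpan’s actual implementation, using a generic templating library:

```python
from jinja2 import Template

# One template sentence with dynamic fields; a real prospectus supplement
# would combine many such fragments plus tables and images.
template = Template(
    "The certificates are backed by a pool of {{ loan_count }} mortgage loans "
    "with an aggregate principal balance of "
    "${{ '%0.2f' % (total_upb / 1e6) }} million as of the cut-off date."
)

deal_data = {"loan_count": 1250, "total_upb": 412_500_000}  # from uploaded files
print(template.render(**deal_data))
```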

Here is a sample of a disclosure document created from the system.



Blackline Version(s)

In addition to creating draft disclosure documents, the Legal Disclosure Generator enables users to make edits and changes to the disclosures on the fly through an embedded editor. The Disclosure Generator saves these edits and applies them to the next version. The tool creates blackline versions with a single integrated view for managing multiple drafts.

The following screenshot of a sample blackline version illustrates how users can view changes from one version to the next.

Tracking of Drafts

The Legal Disclosure Generator keeps track of a disclosure’s entire version history. The system enables emailing draft versions directly to the working parties and retains timestamps of these emails for future reference.

The screenshot below shows the entire lifecycle of a document, from original creation to print, with all interim drafts along the way. 


Automated QC System

The Legal Disclosure Generator’s automated QC system creates a report that compares the underlying data file(s) to the data that is contained in the legal disclosure. The automated QC process ensures that data is accurate and reconciled.

Downstream Consumption

The Legal Disclosure Generator creates a JSON data file. This consolidated file consists of collateral and bond data, including waterfall payment rules. The data files are made available for downstream consumption and can also be sent to Intex, Bloomberg, and other data vendors. One such vendor noted that this JSON data file has enabled them to model deals in one-third the time it took previously.
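The exact schema of this JSON file is not published here, so the following Python sketch uses purely hypothetical field names to illustrate the general shape of a consolidated deal file:

```python
import json

# Hypothetical field names only; the Disclosure Generator's actual schema
# is not published here.
deal = {
    "deal": {"name": "SAMPLE 2022-1", "closing_date": "2022-06-30"},
    "collateral": [
        {"loan_id": "0001", "upb": 350000.0, "note_rate": 0.0425},
    ],
    "bonds": [
        {"class": "A-1", "original_balance": 250_000_000.0, "coupon": 0.035},
    ],
    "waterfall": [
        {"step": 1, "rule": "Pay Class A-1 interest at the coupon rate"},
        {"step": 2, "rule": "Pay Class A-1 principal until its balance is zero"},
    ],
}

with open("deal_disclosure.json", "w") as f:
    json.dump(deal, f, indent=2)  # ready for downstream consumers
```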

Self-Serve System

The Legal Disclosure Generator was designed with the end-user in mind. Users can set up the disclosure language by themselves and edit as needed, with little or no outside help.

The ‘System’ Advantage

  • Removes unnecessary, manual, and redundant processes
  • Huge time efficiency – 24 hours vs. 2 minutes (actual time savings for a current client of the system)
  • Better-managed processes and systems
  • Better resource management – cost-effective solutions
  • Greater flexibility
  • Better data management – built-in QC



Why Accurate Loan Pool and MSR Cost Forecasting Requires Loan-by-Loan Analytics

When it comes to forecasting loan pool and MSR cash flows, the practice of creating “rep lines,” or cohorts, of loans with similar characteristics for analytical purposes has its roots in the Agency MBS market. One of the most attractive and efficient features of Agencies is the TBA market, which allows originators and issuers to sell large pools of mortgages that have not even been originated yet. This is possible because all parties understand what these future loans will look like: they will have enough in common as to be effectively interchangeable with one another.

Institutions that perform the servicing on such loans may reasonably feel they can extend the TBA logic to their own analytics. Instead of analyzing a hundred similar loans individually, why not just lump them into one giant meta-loan? Sum the balances, weight-average the rates, terms, and other features, and you’re good to go. 
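That aggregation step is easy to sketch. The following illustrative example sums balances and balance-weights the remaining features, which is also why individual outlier loans disappear from view:

```python
import pandas as pd

loans = pd.DataFrame({
    "balance": [200_000, 300_000, 250_000],
    "rate":    [0.0350, 0.0375, 0.0400],
    "term":    [360, 360, 360],
})

def rep_line(df: pd.DataFrame) -> dict:
    """Collapse similar loans into one meta-loan via balance weighting."""
    w = df["balance"] / df["balance"].sum()
    return {
        "balance": df["balance"].sum(),     # sum of balances
        "rate": (w * df["rate"]).sum(),     # balance-weighted average coupon
        "term": (w * df["term"]).sum(),     # balance-weighted average term
    }

print(rep_line(loans))
# Any single loan's unusual characteristics are averaged away in this view.
```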

Why the industry still resorts to loan cohorting when forecasting loan pool and MSR cash flows

The most obvious appeal of cohort-level analytics lies in its simplicity. Rep lines amount to giant simplifying assumptions. They generate fewer technological constraints than a loan-by-loan approach does. Condensing an entire loan portfolio down to a manageable number of rows requires less computational capacity, which takes on added importance when dealing with on-premise software and servers. It also facilitates the process of assigning performance and cost assumptions.

What is more, as OAS modeling has evolved to dominate the loans and MSR landscape, the stratification approach necessary to run Monte Carlo and other simulations lends itself to cohorting. Lumping loans into like groups also greatly simplifies the process of computing hedging requirements. 

Advantages of loan-level over cohorting when forecasting cash flows

Treating loan and MSR portfolios like TBA pools, however, has become increasingly problematic as these portfolios have grown more heterogeneous. Every individual loan has a story. Even loans that resemble each other in terms of rate, credit score, LTV, DTI, and documentation level have unique characteristics. Some of these characteristics – climate risk, for example – are not easy to bucket. Lumping similar loans into cohorts also runs the risk of underestimating tail risk. Extraordinarily high servicing/claims costs on just one or two outlier loans on a bid tape can be enough to adversely affect the yield of an entire deal. 

Conversely, looking at each loan individually facilitates the analysis of portfolios with expanded credit boxes. Non-banks, which do not usually have the benefit of “knowing” their servicing customers through depository or other transactional relationships, are particularly reliant on loan-level data to understand individual borrower risks, particularly credit risks. Knowing the rate, LTV, and credit score of a bundled group of loans may be sufficient for estimating prepayment risk. But only a more granular, loan-level analysis can produce the credit analytics necessary to forecast reliably and granularly what a servicing portfolio is really going to cost in terms of collections, loss mitigation, and claims expenses.  

Loan-level analysis also eliminates the limitations imposed by stratification and facilitates portfolio composition analysis. Slicing and dicing techniques are much more simply applied to individual loans than to cohorts. Looking at individual loans also reduces the risk of overrides and lost visibility into convexity pockets.


Potential challenges and other considerations 

So why hasn’t everyone jumped onto the loan-level bandwagon when forecasting loan pool and MSR cash flows? In short, it’s harder. Resistance to any new process can be expected when existing aggregation regimes appear to be working fine. Loan-level data management requires more diligence in automated processes. It also requires the data related to each individual loan to be subjected to QC and monitoring. Daily hedging and scenario runs tend to focus more on speed than on accuracy at the macro level. Some may question whether the benefits of the granular, case-by-case analysis required to identify the most significant loan-level pickups actually justify the cost of such a regime.

Rethink. Why now? 

Notwithstanding these challenges, there has never been a better time for loan and MSR investors to abandon cohorting and fully embrace loan-level analytics when forecasting cash flows. The emergence of cloud-native technology and enhanced database and warehouse infrastructure, along with the ability to outsource hosting and computational requirements to third parties, creates practically limitless scalability.

The barriers between loan and MSR experts and IT professionals have never been lower. This, combined with the emergence of a big data culture in an increasing number of organizations, has brought the granular daily analysis promised by loan-level analytics tantalizingly within reach.  

 

For a deeper dive into loan and MSR cost forecasting, view our webinar, “How Much Will That MSR Portfolio Really Cost You?”

 


Striking a Proper Balance: ESG for Structured Finance

The securitization market continues to wrestle with a myriad of approaches and a lack of standards for identifying and reporting ESG factors in transactions and asset classes. But much-needed guidance is on the way as industry leaders work toward a consensus on the best way to report ESG for structured finance.

RiskSpan gathered with other key industry players tackling these challenges at this month’s third annual Structured Finance Association ESG symposium in New York City. The event highlighted a number of significant strides taken toward shaping an industry-standard ESG framework and guidelines.

Robust and engaging discussions across a variety of topics illustrated the critical need for a thoughtful approach to framework development. We observed a broad consensus around the notion that market acceptance would require any solution to be data supported and fully transparent. 

Much of the discussion revolved around three recurring themes: 1) finding a workable balance between the institutional desire for portfolio-specific measures based on raw data and the market need for a standardized scoring mechanism that everybody understands, 2) maintaining data privacy, and 3) assessing tradeoffs between the societal benefits of ESG investing and the added risk it can pose to a portfolio.

Striking the Right Balance: Institution-Specific Measures vs. Industry-Standard Asset Scoring 

When it comes to disclosure and reporting, one point on a spectrum does not fit all. Investors and asset managers vary in their ultimate reporting needs and approach to assessing ESG and impact investing. On the one hand, having raw data to apply their own analysis or specific standards can be more worthwhile to individual institutions. On the other, having well defined standards or third-party ESG scoring systems for assets provides greater certainty and understanding to the market as a whole.  

Both approaches have value.

Everyone wants access to data and control over how they view the assets in their portfolio. But the need for guidance on what ESG impacts are material and relevant to structured finance remains prominent. Scores, labels, methodologies, and standards can give investors assurance a security contributes to meeting their ESG goals. Investors want to know where their money is going and if it is meaningful.

Methodologies also have to be explainable. Though there was agreement that labeled transactions are not always necessary (or achievable), integration of ESG factors in the decision process is. Reporting systems will need to link underlying collateral to external data sources to calculate key metrics required by a framework while giving users the ability to drill down to meet specific and granular analytical needs.    

Data Privacy

Detailed analysis of underlying asset data, however, highlights a second key issue: the tradeoff between transparency and privacy, particularly for consumer-related assets. Fiduciary and regulatory responsibility to protect disclosure of non-public personally identifiable information limits investor ability to access loan-level data.

While property addresses provide the greatest insight to climate risk and other environmental factors, concerns persist over methods that allow data providers to triangulate and match data from various sources to identify addresses. This in turn makes it possible to link sensitive credit information to specific borrowers.

The responsibility to summarize and disclose metrics required by the framework falls to issuers. The largest residential issuers already appreciate this burden. These issuers have expressed a desire to solve these issues and are actively looking at what they can do to help the market without sacrificing privacy. Data providers, reporting systems, and users will all need to consider the guardrails needed to adhere to source data terms of use.   

Assessing Impact versus Risk

Another theme arising in nearly all discussions centered on assessing ESG investment decisions from the two sometimes competing dimensions of impact and risk and considering whether tradeoffs are needed to meet a wide variety of investment goals. Knowing the impact the investment is making—such as funding affordable housing or the reduction of greenhouse gas emissions—is fundamental to asset selection or understanding the overall ESG position.

But what risks/costs does the investment create for the portfolio? What is the likely influence on performance?

The credit aspect of a deal is distinct from its ESG impact. For example, a CMBS may be socially positive but rent regulation can create thin margins. Ideally, all would like to maximize positive impact but not at the cost of performance, a strategy that may be contributing now to an erosion in greeniums. Disclosures and reporting capabilities should be able to support investment analyses on these dimensions.  

A disclosure framework vetted and aligned by industry stakeholders, combined with robust reporting and analytics and access to as much underlying data as possible, will give investors and asset managers certainty as well as flexibility to meet their ESG goals.   


Webinar: Tailoring Stress Scenarios to Changing Risk Environments

July 13th | 1:00 p.m. ET

Designing market risk stress scenarios is challenging because of the disparate ways in which various risk factors impact different asset classes. No two events are exactly alike, and the Covid-19 pandemic and the Russian invasion of Ukraine each provide a case study for risk managers seeking to incorporate events without precise precedents into existing risk frameworks.
 
Join RiskSpan’s Suhrud Dagli and Martin Kindler on Wednesday, June 15th at 1 p.m. ET as they illustrate an approach for correlating rates, spreads, commodity prices, and other risk factors to analogous historical geopolitical disruptions and other major market events. Market risk managers will receive an easily digestible tutorial on the math behind how to create probability distributions and reliably model how such events are most likely to impact a portfolio.
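As a flavor of the math involved (and emphatically not the presenters’ actual methodology), here is a minimal sketch of drawing correlated risk-factor scenarios from a distribution fitted to an analogous historical episode:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for observed daily changes in (rate, spread, commodity price)
# over a chosen historical episode, e.g., a past geopolitical disruption
hist_shocks = rng.normal(size=(250, 3))

mu = hist_shocks.mean(axis=0)              # average daily shock
cov = np.cov(hist_shocks, rowvar=False)    # co-movement of the factors

# Draw 10,000 correlated scenarios consistent with that episode
scenarios = rng.multivariate_normal(mu, cov, size=10_000)
print(scenarios.shape)  # (10000, 3)
```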

 

Featured Speakers

Suhrud Dagli

Co-Founder and CIO, RiskSpan


Martin Kindler

Managing Director, RiskSpan


Why Climate Risk Matters for Mortgage Loan & MSR Investors 

The time has come for mortgage investors to start paying attention to climate risk.

Until recently, mortgage loan and MSR investors felt that they were largely insulated from climate risk. Notwithstanding the inherent risk natural hazard events pose to housing and the anticipated increased frequency of these events due to climate change, it seemed safe to assume that property insurers and other parties in higher loss position were bearing those risks. 

In reality, these risks are often underinsured. And even in cases where property insurance is adequate, the fallout has the potential to hit investor cash flows in a variety of ways. Acute climate events like hurricanes create short-term delinquency and prepayment spikes in affected areas. Chronic risks such as sea level rise and increased wildfire risk can depress housing values in areas most susceptible to these events. Potential impacts to property insurance costs, utility costs (water and electricity in areas prone to excessive heat and drought, for example) and property taxes used to fund climate-mitigating infrastructure projects all contribute to uncertainty in loan and MSR modeling. 

Moreover, dismissing climate risk “because we are in fourth loss position” should be antithetical to any investor claiming to espouse ESG principles. After all, consider who is almost always in the first loss position – the borrower. Any mortgage investment strategy purporting to be ESG friendly must necessarily take borrower welfare into account. Dismissing climate risk because borrowers will bear most of the impact is hardly a socially responsible mindset. This is particularly true when a disproportionate number of borrowers prone to natural hazard risk are disadvantaged to begin with.

Hazard and flood insurers typically occupy the loss positions between borrowers and investors. Few tears are shed when insurers absorb losses. But society at large ultimately pays the price when losses invariably lead to higher premiums for everybody.    

Evaluating Climate Exposure

For these and other reasons, natural hazards pose a systemic risk to the entire housing system. For mortgage loan and MSR investors, this raises a host of questions, among them:

  1. What percentage of the loans in my portfolio are susceptible to flood risk but uninsured because flood maps are out of date? 
  2. How geographically concentrated is my portfolio? What percentage of my portfolio is at risk of being adversely impacted by just one or two extreme events? 
  3. What would the true valuation of my servicing portfolio be if climate risk were factored into the modeling?  
  4. What will the regulatory landscape look like in coming years? To what extent will I be required to disclose the extent to which my portfolio is exposed to climate risk? Will I even know how to compute it, and if so, what will it mean for my balance sheet? 

 

Incorporating Climate Data into Investment Decision Making

Forward-thinking mortgage servicers are at the forefront of efforts to get their arms around the necessary data and analytics. Once servicers have acquired a portfolio, they assess and triage their loans to identify which properties are at greatest risk. Servicers also contemplate how to work with borrowers to mitigate their risk.  

For investors seeking to purchase MSR portfolios, climate assessment is making its way into the due diligence process. This helps would-be investors ensure that they are not falling victim to adverse selection. As investors increasingly do this, climate assessment will eventually make its way further upstream, into appraisal and underwriting processes. 

Reliably modeling climate risk first requires getting a handle on how frequently natural hazard events are likely to occur and how severe they are likely to be. 

In a recent virtual industry roundtable co-hosted by RiskSpan and Housing Finance Strategies, representatives of Freddie Mac, Mr. Cooper, and Verisk Analytics (a leading data and analytics firm that models a wide range of natural and man-made perils) gathered to discuss why understanding climate risk should be top of mind for mortgage investors and introduced a framework for approaching it.

WATCH THE ENTIRE ROUNDTABLE

Building the Framework

The framework begins by identifying the specific hazards relevant to individual properties, building catalogs of thousands of years’ worth of simulated events, simulating the damage each event would cause based on property construction, and calculating likely losses. These forecasted property losses are then factored into mortgage performance scenarios and used to model default risk, prepayment speeds, and home price impacts.
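The following schematic sketch, with purely invented parameters, illustrates the general shape of such a framework: a simulated event catalog, damage conditioned on an event occurring, and an expected annual loss that can then feed performance assumptions. It is not Verisk’s or RiskSpan’s actual model:

```python
import numpy as np

rng = np.random.default_rng(7)

years = 10_000                           # length of simulated event catalog
event_rate = 0.02                        # assumed hazard frequency per year
events = rng.poisson(event_rate, years)  # events occurring in each year

# Damage ratio when an event occurs (depends on property construction);
# the lognormal shape here is an assumption, capped at total loss
damage = np.minimum(rng.lognormal(-2.5, 1.0, years), 1.0) * (events > 0)

property_value = 400_000
expected_annual_loss = (damage * property_value).mean()
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
# This loss estimate then feeds default, prepayment, and HPI scenarios.
```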

Connecting to Mortgage Performance Analysis

 

Responsibility to Borrowers

One member of the panel, Kurt Johnson, CRO of mega-servicer Mr. Cooper, spoke specifically of the operational complexities presented by climate risk. He cited as one example the need to speak daily with borrowers as catastrophic events are increasingly impacting borrowers in ways for which they were not adequately prepared. He also referred to the increasing number of borrowers incurring flood damage in areas that do not require flood insurance and spoke to how critical it is for servicers to know how many of their borrowers are in a similar position.

Johnson likened the concept of credit risk layering to climate risk exposure. The risk of one event happening on the heels of another event can cause the second event to be more devastating than it would have been had it occurred in a vacuum. As an example, he mentioned how the spike in delinquencies at the beginning of the COVID-19 pandemic was twice as large among borrowers who had just recovered from Hurricane Harvey 15 months earlier than it was among borrowers who had not been affected by the storm. He spoke of the responsibility he feels as a servicer to educate borrowers about what they can do to protect their properties in adverse scenarios.


FHFA Prepayment Monitoring Reports (Q1 2022) Powered by RiskSpan’s Edge Platform

To help enforce alignment of Agency prepayments across Fannie’s and Freddie’s Uniform MBS, the Federal Housing Finance Agency publishes a quarterly monitoring report comparing prepayment speeds of UMBS issued by the two Agencies. The objective is to help ensure that prepayment performance remains consistent, so that market expectations of a Fannie-issued UMBS remain fundamentally indistinguishable from those of a Freddie-issued UMBS and the two Agencies’ UMBS remain interchangeably deliverable into passthrough “TBA” trades.

This week, the FHFA released the Q1 2022 version of this report. The charts in the FHFA’s publication, which it generates using RiskSpan’s Edge Platform, compare Fannie and Freddie UMBS prepayment rates (1-month and 3-month CPRs) across a variety of coupons and vintages.

[Charts: Fannie vs. Freddie 30-year UMBS CPR comparisons, 1-month and 3-month CPR, across coupons and vintages]
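For readers less familiar with the metric, the CPR figures in these charts are annualized from monthly prepayment rates. A quick sketch of the standard conversion (the SMM value below is purely illustrative):

```python
# Standard conversion from a single monthly mortality rate (SMM, the share of
# remaining balance that prepays in one month) to an annualized CPR.

def cpr_from_smm(smm: float) -> float:
    """Annualize a monthly prepayment rate."""
    return 1 - (1 - smm) ** 12

# Example: a pool prepaying 1% of its balance in a month runs at ~11.4% CPR
print(f"{cpr_from_smm(0.01):.1%}")  # -> 11.4%

# A 3-month CPR applies the same idea to three months of survival:
# 1 - ((1 - smm1) * (1 - smm2) * (1 - smm3)) ** 4
```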

Relying on RiskSpan’s Edge Platform for this sort of analysis is fitting in that it is precisely the type of comparative analysis for which Edge was developed.

Edge allows traders, portfolio managers, and analysts to compare performance across a virtually unlimited number of loan subgroups. Users can cohort on multiple loan characteristics, including servicer, vintage, loan size, geography, LTV, FICO, channel, or any other borrower characteristic.

Edge’s easy-to-navigate user interface makes it accessible to traders and PMs seeking to set up queries and tweak constraints on the fly without having to write SQL code. Edge also offers an API for users that want programmatic access to the data. This is useful for generating customized reporting and systematic analysis of loan sectors.

Comparing Fannie’s and Freddie’s prepay speeds only scratches the surface of Edge’s analytical capabilities. Schedule a demo to see more of what the platform can do.


Senior Home Equity Rises Again to $10.6 Trillion

Homeowners 62 and older saw their housing wealth grow by some $405 billion (3.8 percent) during the fourth quarter of 2021 to a record $10.6 trillion according to the latest quarterly release of the NRMLA/RiskSpan Reverse Mortgage Market Index.

Historical Changes in Aggregate Senior Home Values Q1 2000 - Q4 2021

The NRMLA/RiskSpan Reverse Mortgage Market Index (RMMI) rose to 370.56, another all-time high since the index was first published in 2000. The increase in older homeowners’ wealth was mainly driven by an estimated $452 billion (3.7 percent) increase in home values, offset by a $44 billion (2.3 percent) increase in senior-held mortgage debt.

For a comprehensive commentary, please see NRMLA’s press release.


How RiskSpan Computes the RMMI

To calculate the RMMI, RiskSpan developed an econometric tool to estimate senior housing value, mortgage balances, and equity using data gathered from various public resources. These resources include the American Community Survey (ACS), Federal Reserve Flow of Funds (Z.1), and FHFA housing price indexes (HPI). The RMMI represents the senior equity level at time of measure relative to that of the base quarter in 2000.[1] 

A limitation of the RMMI relates to non-consecutive data, such as census population. We use a smoothing approach to estimate data between the observable periods and continue to look for ways to improve our methodology and find more robust data to improve the precision of the results. Until then, the RMMI and its underlying metrics (values, mortgages, home equity) are best analyzed at a trending macro level rather than at more granular levels, such as MSA.


[1] There was a change in RMMI methodology in Q3 2015 mainly to calibrate senior homeowner population and senior housing values observed in 2013 American Community Survey (ACS).

