Blog Archives

RiskSpan Unveils New “Reverse ETL” Mortgage Data Mapping and Extract Functionality

ARLINGTON, Va., October 19, 2022 – Subscribers to RiskSpan’s Mortgage Data Management product can now not only leverage machine learning to streamline the intake of loan data from any format, but also define any target format for data extraction and sharing.

A recent enhancement to RiskSpan’s award-winning Edge Platform enables users to take in unformatted datasets from mortgage servicers, sellers and other counterparties and convert them into their preferred data format on the fly for sharing with accounting, client, and other downstream systems.

Analysts, traders, and portfolio managers have long used Edge to take in and store datasets, enabling them to analyze historical performance of custom cohorts using limitless combinations of mortgage loan characteristics and run predictive analytics on segments defined on the fly. With Edge’s novel “Reverse ETL” data extract functionality, these Platform users can now also easily design an export format for their data, creating the functional equivalent of a full integration node for sharing data with virtually any system on or off the Edge Platform.
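To make the “Reverse ETL” concept concrete (this is a generic illustration, not the Edge Platform’s actual interface or API), the sketch below shows what mapping an internal loan dataset onto a counterparty-defined target layout might look like in Python/pandas terms. All field names and transforms are hypothetical.

```python
# Illustrative sketch only -- not RiskSpan's actual API. It shows the general
# "reverse ETL" idea: project an internal loan dataset onto a target layout
# defined by a downstream (e.g., accounting) system before export.
import pandas as pd

# Hypothetical target layout: target column -> (internal column, optional transform)
TARGET_LAYOUT = {
    "LoanID":      ("loan_id", None),
    "UPB":         ("current_upb", None),
    "NoteRatePct": ("note_rate", lambda r: round(r * 100, 3)),  # decimal -> percent
    "OrigDate":    ("origination_date", lambda d: pd.to_datetime(d).strftime("%Y%m%d")),
}

def export_to_target(loans: pd.DataFrame, path: str) -> None:
    """Project internal loan data onto the target layout and write a pipe-delimited file."""
    out = pd.DataFrame()
    for target_col, (source_col, transform) in TARGET_LAYOUT.items():
        col = loans[source_col]
        out[target_col] = col.map(transform) if transform else col
    out.to_csv(path, sep="|", index=False)
```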

Market participants tout the revolutionary technology as the end of having to share cumbersome and unformatted CSV files with counterparties. Now, the same smart mapping technology that for years has facilitated the ingestion of mortgage data onto the Edge Platform makes extracting and sharing mortgage data with downstream users just as easy.   

Comprehensive details of this and other new capabilities using RiskSpan’s Edge Platform are available by requesting a no-obligation live demo at riskspan.com.


This new functionality is the latest in a series of enhancements that is making the Edge Platform’s Data as a Service increasingly indispensable for mortgage loan and MSR traders and investors.

### 

About RiskSpan, Inc. 

RiskSpan is a leading technology company and the most comprehensive source for data management and analytics for residential mortgage and structured products. The company offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics.

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments.

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com.

Media contact: Timothy Willis


Bumpy Road Ahead for GNMA MBS?

In a recent webinar, RiskSpan’s Fowad Sheikh engaged in a robust discussion with two of his fellow industry experts, Mahesh Swaminathan of Hilltop Securities and Mike Ortiz of DoubleLine Group, to address the likely road ahead for Ginnie Mae securities performance.


The panel sought to address the following questions:

  • How will the forthcoming, more stringent originator/servicer financial eligibility requirements affect origination volumes, buyouts, and performance?
  • Who will fill the vacuum left by Wells Fargo’s exiting the market?
  • What role will falling prices play in delinquency and buyout rates?
  • What will be the impact of potential Fed MBS sales?

This post summarizes some of the group’s key conclusions. A recording of the webinar in its entirety is available here.


Wells Fargo’s Departure

To understand the likely impact of Wells Fargo’s exit, it is first instructive to understand the declining market share of banks overall in the Ginnie Mae universe. As the following chart illustrates, banks as a whole account for just 11 percent of Ginnie Mae originations, down from 39 percent as recently as 2015.

Drilling down further, the chart below plots Wells Fargo’s Ginnie Mae share (the green line) relative to the rest of the market. As the chart shows, Wells Fargo accounts for just 3 percent of Ginnie Mae originations today, compared to 15 percent in 2015. This trend of Wells Fargo’s declining market share extends all the way back to 2010, when it accounted for some 30 percent of Ginnie originations.

As the second chart below indicates, Wells Fargo’s market share, even among banks, has also been on a steady decline.


Three percent of the overall market is meaningful but not likely to be a game changer either in terms of origination trends or impact on spreads. Wells Fargo, however, continues to have an outsize influence in the spec pool market. The panel hypothesized that Wells’s departure from this market could open the door to other entities claiming that market share. This could potentially affect prepayment speeds – especially if Wells is replaced by non-bank servicers, which the panel felt was likely given the current non-bank dominance of the top 20 (see below) – since Wells prepays have traditionally been slightly better than the broader market.

The panel raised the question of whether the continuing bank retreat from Ginnie Mae originations would adversely affect loan quality. As a basis for this concern, they cited the generally lower FICO scores and higher LTVs that characterize non-bank-originated Ginnie Mae mortgages (see below).

These data notwithstanding, the panel asserted that any changes to credit quality would be restricted to the margins. Non-bank servicers originate a higher percentage of lower-credit-quality loans (relative to banks) not because non-banks are actively seeking those borrowers out and eschewing higher-credit-quality borrowers. Rather, banks tend to restrict themselves to borrowers with higher credit profiles. Non-banks will be more than happy to lend to these borrowers as banks continue to exit the market.

Effect of New Eligibility Requirements

The new capital requirements, which take effect a year from now, are likely to be less punitive than they appear at first glance. With the exception of certain monoline entities – say, those with almost all of their assets concentrated in MSRs – the overwhelming majority of Ginnie Mae issuers (banks and non-banks alike) are going to be able to meet them with little if any difficulty.

Ginnie Mae has stated that, even if the new requirements went into effect tomorrow, 95 percent of its non-bank issuers would qualify. Consequently, the one-year compliance period should open the door for a fairly smooth transition.

To the extent Ginnie Mae issuers are unable to meet the requirements, a consolidation of non-bank entities is likely in the offing. Given that these institutions will likely be significant MSR investors, the potential increase in MSR sales could impact MSR multiples and potentially disrupt the MSR market, at least marginally.

Potential Impacts of Negative HPA

Ginnie Mae borrowers tend to be more highly leveraged than conventional borrowers. FHA borrowers can start with LTVs as high as 97.5 percent. VA borrowers, once the VA guarantee fee is rolled in, often have LTVs in excess of 100 percent. Similar characteristics apply to USDA loans. Consequently, borrowers whose loans were originated in the past two years are more likely to default as they watch their properties go underwater. This is potentially good news for investors in discount coupons (i.e., investors who benefit from faster prepay speeds) because these delinquent loans will be bought out quite early in their expected lives.

More seasoned borrowers, in contrast, have experienced considerable positive HPA in recent years. The coming forecasted decline should not materially impact these borrowers’ performance. Similarly, if home price depreciation (HPD) in 2023 proves to be mild, then a sharp uptick in delinquencies is unlikely, regardless of loan vintage or LTV. Most homeowners make mortgage payments because they wish to continue living in their house and do not seriously consider strategic defaults. During the financial crisis, most borrowers continued making good on their mortgage obligations even as their LTVs went as high as the 150s.

Further, the HPD we are likely to encounter next year should not have the same devastating effect as the HPD wave that accompanied the financial crisis. Loans on the books today are markedly different from loans then. Ginnie Mae loans that went bad during the crisis disproportionately included seller-financed, down-payment-assistance loans and other programs lacking in robust checks and balances. Ginnie Mae has instituted more stringent guidelines in the years since to minimize the impact of bad actors in these sorts of programs.

This all assumes, however, that the job market remains robust. Should the looming recession lead to widespread unemployment, that would have a far more profound impact on delinquencies and buyouts than would HPD.

Fed Sales

The Fed’s holdings (as of 9/21, see chart below) are concentrated around 2 percent and 2.5 percent coupons. This raises the question of what the Fed’s strategy is likely to be for unwinding its Ginnie Mae position.

Word on the street is that Fed sales are highly unlikely to happen in 2022. Any sales in 2023, if they happen at all, are not likely before the second half of the year. The panel opined that the composition of these sales is likely to resemble the composition of the Fed’s existing book – i.e., mostly 2s, 2.5s, and some 3s. They have the capacity to take a more sophisticated approach than a simple pro-rata unwinding. Whether they choose to pursue that is an open question.

The Fed was a largely non-economic buyer of mortgage securities. There is every reason to believe that it will be a non-economic seller, as well, when the time comes. The Fed’s trading desk will likely reach out to the Street, ask for inquiry, and seek to pursue an approach that is least disruptive to the mortgage market.

Conclusion

On closer consideration, many of the macro conditions that would seem to portend an uncertain and bumpy road for Ginnie Mae investors (Wells’s exit, HPD, enhanced eligibility requirements, and pending Fed sales) may turn out to be more benign than feared.

Conditions remain unsettled, however, and these and other factors certainly bear watching as Ginnie Mae market participants seek to plot a prudent course forward.


Optimizing Analytics Computational Processing 

We met with RiskSpan’s Head of Engineering and Development, Praveen Vairavan, to understand how his team set about optimizing analytics computational processing for a portfolio of 4 million mortgage loans using a cloud-based compute farm.

This interview dives deeper into a case study we discussed in a recent interview with RiskSpan’s co-founder, Suhrud Dagli.

Here is what we learned from Praveen. 



Could you begin by summarizing for us the technical challenge this optimization was seeking to overcome? 

PV: The main challenge related to an investor’s MSR portfolio, specifically the volume of loans we were trying to run. The client has close to 4 million loans spread across nine different servicers. This presented two related but separate sets of challenges. 

The first set of challenges stemmed from needing to consume data from different servicers whose file formats not only differed from one another but also often lacked internal consistency. By that, I mean even the file formats from a single given servicer tended to change from time to time. This required us to continuously update our data mapping and (because the servicer reporting data is not always clean) modify our QC rules to keep up with evolving file formats.  

The second challenge relates to the sheer volume of compute power necessary to run stochastic paths of Monte Carlo rate simulations on 4 million individual loans and then discount the resulting cash flows based on option adjusted yield across multiple scenarios. 

And so you have 4 million loans times multiple paths times one basic cash flow, one basic option-adjusted case, one up case, and one down case, and you can see how quickly the workload adds up. And all this needed to happen on a daily basis. 

To help minimize the computing workload, our client had been running all these daily analytics at a rep-line level—stratifying and condensing everything down to between 70,000 and 75,000 rep lines. This alleviated the computing burden but at the cost of decreased accuracy because they couldn’t look at the loans individually. 
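For readers unfamiliar with rep-line stratification, the sketch below shows the general idea in generic pandas terms: bucket loans on a handful of characteristics and collapse each bucket into a single representative line. The column names and bucket breaks are illustrative assumptions, not the client’s actual stratification rules.

```python
# Illustrative rep-line stratification: bucket loans on a few characteristics and
# condense each bucket to one representative line. Names and breaks are assumptions.
import pandas as pd

def build_rep_lines(loans: pd.DataFrame) -> pd.DataFrame:
    strat = loans.assign(
        coupon_bucket=(loans["note_rate"] // 0.25) * 0.25,  # 25 bp coupon buckets (rate in %)
        fico_bucket=(loans["fico"] // 20) * 20,
        ltv_bucket=(loans["ltv"] // 5) * 5,
    )
    return (
        strat.groupby(["servicer", "coupon_bucket", "fico_bucket", "ltv_bucket"])
        .agg(
            loan_count=("loan_id", "size"),
            upb=("current_upb", "sum"),
            avg_note_rate=("note_rate", "mean"),  # a UPB-weighted average is more typical
            avg_loan_age=("loan_age", "mean"),
        )
        .reset_index()
    )
```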

What technology enabled you to optimize the computational process of running 50 paths and 4 scenarios for 4 million individual loans?

PV: With the cloud, you have the advantage of spawning a bunch of servers on the fly (just long enough to run all the necessary analytics) and then shutting them down once the analytics are done.

This sounds simple enough. But in order to use that many compute servers, we needed to figure out how to distribute the 4 million loans across them so the jobs could run in parallel (and then get the results back so we could aggregate them). We did this using what is known as a MapReduce approach.

Say we want to run a particular cohort of this dataset with 50,000 loans in it. If we were using a single server, it would run them one after the other – generate all the cash flows for loan 1, then for loan 2, and so on. As you would expect, that is very time-consuming. So, we decided to break down the loans into smaller chunks. We experimented with various chunk sizes. We started with 1,000 – we ran 50 chunks of 1,000 loans each in parallel across the AWS cloud and then aggregated all those results.  

That was an improvement, but the 50 parallel jobs were still taking longer than we wanted. And so, we experimented further before ultimately determining that the “sweet spot” was something closer to 5,000 parallel jobs of 100 loans each. 

Only in the cloud is it practical to run 5,000 servers in parallel. But this of course raises the question: Why not just go all the way and run 50,000 parallel jobs of one loan each? Well, as it happens, running an excessively large number of jobs carries overhead burdens of its own. And we found that the extra time needed to manage that many jobs more than offset the compute time savings. And so, using a fair bit of trial and error, we determined that 100-loan jobs maximized the runtime savings without creating an overly burdensome number of jobs running in parallel.  
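The chunk-and-aggregate (“map” and “reduce”) pattern Praveen describes can be sketched as follows, using a local Python process pool as a stand-in for a farm of cloud servers. The run_cashflows function is a hypothetical placeholder for the actual analytics engine.

```python
# Minimal sketch of the MapReduce-style chunking described above. A local process
# pool stands in for the cloud compute farm; run_cashflows() is a placeholder.
from concurrent.futures import ProcessPoolExecutor

CHUNK_SIZE = 100  # the "sweet spot" chunk size described above

def run_cashflows(loan_chunk):
    # Placeholder: in practice each loan is priced across rate paths and scenarios.
    return [{"loan_id": loan["loan_id"], "pv": 0.0} for loan in loan_chunk]

def chunk(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

def run_portfolio(loans):
    results = []
    with ProcessPoolExecutor() as pool:                      # "map": chunks run in parallel
        for chunk_result in pool.map(run_cashflows, chunk(loans, CHUNK_SIZE)):
            results.extend(chunk_result)                     # "reduce": aggregate results
    return results
```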


You mentioned the challenge of having to manage a large number of parallel processes. What tools do you employ to work around these and other bottlenecks? 

PV: The most significant bottleneck associated with this process is finding the “sweet spot” number of parallel processes I mentioned above. As I said, we could theoretically break it down into 4 million single-loan processes all running in parallel. But managing this amount of distributed computation, even in the cloud, invariably creates a degree of overhead which ultimately degrades performance. 

And so how do we find that sweet spot – how do we optimize the number of servers on the distributed computation engine? 

As I alluded to earlier, the process involved an element of trial and error. But we also developed some home-grown tools (and leveraged some tools available in AWS) to help us. These tools enable us to visualize computation server performance – how much of a load they can take, how much memory they use, etc. These helped eliminate some of the optimization guesswork.   

Is this optimization primarily hardware based?

PV: AWS provides essentially two “flavors” of machines. One “flavor” is memory optimized, enabling you to keep a whole lot of loans in memory so the run is faster. The other flavor of hardware is more processor based (compute intensive). These machines provide a lot of CPU power so that you can run a lot of processes in parallel on a single machine and still get the required performance.

We have done a lot of R&D on this hardware. We experimented with many different instance types to determine which works best for us and optimizes our output: lots of memory but smaller CPUs vs. CPU-intensive machines with less (but still a reasonable amount of) memory.

We ultimately landed on a machine with 96 cores and about 240 GB of memory. This was the balance that enabled us to run portfolios at speeds consistent with our SLAs. For us, this translated to a server farm of 50 machines running 70 processes each, which works out to 3,500 workers helping us to process the entire 4-million-loan portfolio (across 50 Monte Carlo simulation paths and 4 different scenarios) within the established SLA.  

What software-based optimization made this possible? 

PV: Even optimized in the cloud, hardware can get pricey – on the order of $4.50 per hour in this example. And so, we supplemented our hardware optimization with some software-based optimization as well. 

We were able to optimize our software to a point where we could use a machine with just 30 cores (rather than 96) and 64 GB of RAM (rather than 240). Using 80 of these machines running 40 processes each gives us 2,400 workers (rather than 3,500). Software optimization enabled us to run the same number of loans in roughly the same amount of time (slightly faster, actually) but using fewer hardware resources. And our cost to use these machines was just one-third what we were paying for the more resource-intensive hardware. 

All this, and our compute time actually declined by 10 percent.  
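The back-of-the-envelope arithmetic behind those two configurations looks roughly like this. Only the $4.50 hourly rate for the larger instances and the “one-third the cost” relationship come from the discussion above; the rest is simple multiplication.

```python
# Worker counts and farm costs implied by the two configurations described above.
configs = {
    "96-core / 240 GB": {"machines": 50, "processes_per_machine": 70},
    "30-core / 64 GB":  {"machines": 80, "processes_per_machine": 40},
}
for name, cfg in configs.items():
    workers = cfg["machines"] * cfg["processes_per_machine"]
    print(f"{name}: {workers:,} workers")        # -> 3,500 vs. 2,400 workers

big_farm_hourly = 50 * 4.50                      # ~$225/hour for the larger-instance farm
small_farm_hourly = big_farm_hourly / 3          # "one-third what we were paying"
```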

The software optimization that made this possible has two parts: 

The first part (as we discussed earlier) is using the MapReduce methodology to break down jobs into optimally sized chunks. 

The second part involved optimizing how we read loan-level information into the analytical engine. Reading in loan-level data (especially for 4 million loans) is a huge bottleneck. We got around this by implementing a “pre-processing” procedure. For each individual servicer, we created a set of optimized loan files that can be read and rendered “analytics ready” very quickly. This enables the loan-level data to be quickly consumed and immediately used for analytics without having to read all the loan tapes and convert them into a format that the analytics engine can understand. Because we have “pre-processed” all this loan information, it is immediately available in a format that the engine can easily digest and run analytics on.
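Conceptually, that pre-processing step might be sketched as follows: each servicer’s raw tape is normalized once into a common, analytics-ready columnar file, so the engine never parses raw tapes at run time. The servicer names, field mappings, and choice of Parquet as the output format are assumptions for illustration.

```python
# Illustrative pre-processing sketch: normalize each servicer's raw tape into a common,
# analytics-ready columnar file. Field mappings and names are hypothetical.
import pandas as pd

SERVICER_MAPS = {
    "servicer_a": {"LOAN_NO": "loan_id", "CURR_BAL": "current_upb", "INT_RATE": "note_rate"},
    "servicer_b": {"LoanIdentifier": "loan_id", "UPB": "current_upb", "NoteRate": "note_rate"},
}

def preprocess_tape(servicer: str, raw_path: str, out_path: str) -> None:
    raw = pd.read_csv(raw_path)
    normalized = raw.rename(columns=SERVICER_MAPS[servicer])[["loan_id", "current_upb", "note_rate"]]
    normalized = normalized.dropna(subset=["loan_id"])  # basic QC; real rules evolve with the tapes
    normalized.to_parquet(out_path, index=False)        # fast, typed reads at analytics time
```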

This software-based optimization is what ultimately enabled us to optimize our hardware usage (and save time and cost in the process).  

Contact us to learn more about how we can help you optimize your mortgage analytics computational processing.


Rethink Analytics Computational Processing – Solving Yesterday’s Problems with Today’s Technology and Access 

We sat down with RiskSpan’s co-founder and chief technology officer, Suhrud Dagli, to learn more about how one mortgage investor successfully overhauled its analytics computational processing. The investor migrated from a daily pricing and risk process that relied on tens of thousands of rep lines to one capable of evaluating each of the portfolio’s more than three-and-a-half million loans individually (and actually saved money in the process).

Here is what we learned. 


Could you start by talking a little about this portfolio — what asset class and what kind of analytics the investor was running? 

SD: Our client was managing a large investment portfolio of mortgage servicing rights (MSR) assets, residential loans and securities.  

The investor runs a battery of sophisticated risk management analytics that rely on stochastic modeling. Option-adjusted spread, duration, convexity, and key rate durations are calculated based on more than 200 interest rate simulations. 
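As a simplified point of reference (not the investor’s actual stochastic framework), effective duration and convexity can be backed out of rate-bumped portfolio values along the following lines once loan-level cash flows have been run. The PVs and bump size in the example are placeholders.

```python
# Effective duration and convexity from rate-bumped portfolio values (simplified;
# the full framework described above uses 200+ stochastic rate simulations).
def effective_duration_convexity(pv_base, pv_up, pv_down, bump=0.0025):
    """bump: parallel rate shock in decimal terms (25 bps here)."""
    duration = (pv_down - pv_up) / (2 * pv_base * bump)
    convexity = (pv_down + pv_up - 2 * pv_base) / (pv_base * bump ** 2)
    return duration, convexity

# Hypothetical example: portfolio worth 100.0 at base, 98.2 up 25 bps, 101.6 down 25 bps
dur, conv = effective_duration_convexity(100.0, 98.2, 101.6)  # ~6.8 duration, negative convexity
```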


Why was the investor running their analytics computational processing using a rep line approach? 

SD: They used rep lines for one main reason: They needed a way to manage computational loads on the server and improve calculation speeds. Secondarily, organizing the loans in this way simplified their reporting and accounting requirements to a degree (loans financed by the same facility were grouped into the same rep line).  

This approach had some downsides. Pooling loans by finance facility sometimes caused loans with different balances, LTVs, credit scores, etc., to get grouped into the same rep line. As a result, the prepayment and default assumptions applied to a given rep line often differed from the assumptions that would have been applied had its loans been evaluated individually.

The most obvious solution to this would seem to be one that disassembles the finance facility groups into their individual loans, runs all those analytics at the loan level, and then re-aggregates the results into the original rep lines. Is this sort of analytics computational processing possible without taking all day and blowing up the server? 

SD: That is effectively what we are doing. The process is not as speedy as we’d like it to be (and we are working on that). But we have worked out a solution that does not overly tax computational resources.

The analytics computational processing we are implementing ignores the rep line concept entirely and just runs the loans. The scalability of our cloud-native infrastructure enables us to take the three-and-a-half million loans and bucket them equally for computation purposes. We run a hundred loans on each processor and get back loan-level cash flows and then generate the output separately, which brings the processing time down considerably. 
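The roll-up step can be pictured along these lines: every loan is priced individually, and the loan-level outputs are re-aggregated to the original groupings (for example, the financing-facility rep lines) purely for reporting. The column names and UPB-weighting below are illustrative.

```python
# Illustrative re-aggregation of loan-level results back to reporting groups.
import pandas as pd

def aggregate_to_groups(loan_results: pd.DataFrame) -> pd.DataFrame:
    """loan_results: one row per loan with its facility/group key and loan-level outputs."""
    def roll_up(g: pd.DataFrame) -> pd.Series:
        w = g["current_upb"]
        return pd.Series({
            "upb": w.sum(),
            "price": (g["price"] * w).sum() / w.sum(),        # UPB-weighted average
            "duration": (g["duration"] * w).sum() / w.sum(),
        })
    return loan_results.groupby("facility_id").apply(roll_up).reset_index()
```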


So we have a proof of concept that this approach to analytics computational processing works in practice for running pricing and risk on MSR portfolios. Is it applicable to any other asset classes?

SD: The underlying principles that make analytics computational processing possible at the loan level for MSR portfolios apply equally well to whole loan investors and MBS investors. In fact, the investor in this example has a large whole-loan portfolio alongside its MSR portfolio. And it is successfully applying these same tactics on that portfolio.   

An investor in any mortgage asset benefits from the ability to look at and evaluate loan characteristics individually. The results may need to be rolled up and grouped for reporting purposes. But being able to run the cash flows at the loan level ultimately makes the aggregated results vastly more meaningful and reliable. 

A loan-level framework also affords whole-loan and securities investors the ability to be sure they are capturing the most important loan characteristics and are staying on top of how the composition of the portfolio evolves with each day’s payoffs. 

ESG factors are an important consideration for a growing number of investors. Only a loan-level approach makes it possible for these investors to conduct the kind of property- and borrower-level analyses needed to know whether they are working toward meeting their ESG goals. It also makes it easier to spot areas of geographic concentration risk, which simplifies climate risk management to some degree.

Say I am a mortgage investor who is interested in moving to loan-level pricing and risk analytics. How do I begin? 

 SD: Three things: 

  1. It begins with having the data. Most investors have access to loan-level data. But it’s not always clean. This is especially true of origination data. If you’re acquiring a pool – be it a seasoned pool or a pool right after origination – you don’t have the best origination data to drive your model. You also need a data store that can generate loan-level output to drive your analytics and models.
  2. The second factor is having models that work at the loan level – models that have been calibrated using loan-level performance and that are capable of generating loan-level output. One of the constraints of several existing modeling frameworks developed by vendors is that they were created to run at a rep line level and don’t necessarily work very well for loan-level projections.
  3. The third thing you need is a compute farm. It is virtually impossible to run loan-level analytics if you’re not on the cloud because you need to distribute the computational load. And your computational distribution requirements will change from portfolio to portfolio based on the type of analytics that you are running, based on the types of scenarios that you are running, and based on the models you are using. 

The cloud is needed not just for CPU power but also for storage. This is because once you go to the loan level, every loan’s data must be made available to every processor that’s performing the calculation. This is where having the kind of shared databases, which are native to a cloud infrastructure, becomes vital. You simply can’t replicate it using an on-premises setup of computers in your office or in your own data center.

So, 1) get your data squared away, 2) make sure you’re using models that are optimized for loan-level analysis, and 3) max out your analytics computational processing power by migrating to cloud-native infrastructure.

Thank you, Suhrud, for taking the time to speak with us.


New Refinance Lag Functionality Affords RiskSpan Users Flexibility in Higher Rate Environments 

ARLINGTON, Va., September 29, 2022 — RiskSpan, a leading technology company and the most comprehensive source for data management and analytics for residential mortgage and structured products, has announced that users of its award-winning Edge Platform can now fine-tune the assumed time lag between a rate-incentivized borrower’s decision to refinance and ultimate payoff. Getting this time lag right yields a more accurate understanding of the rate incentive that borrowers actually responded to and thus better predictions of coming prepayments.

The recent run-up in interest rates has caused the number of rate-incentivized mortgage refinancings to fall precipitously. Newfound operational capacity at many lenders, created by this drop in volume, means that new mortgages can now be closed in fewer days than were necessary at the height of the refi boom. This “lag time” between when a mortgage borrower becomes in-the-money to refinance and when the loan actually closes is an important consideration for MBS traders and analysts seeking to model and predict prepayment performance. 

Rather than confining MBS traders to a single, pre-set lag time assumption of 42 days, users of the Edge Platform’s Historical Performance module can now adjust the lag assumption when building their S-curves to better reflect their view of current market conditions. Using the module’s new Input section for Agency datasets, traders and analysts can further refine their approach to computing refi incentive by selecting the prevailing mortgage rate measure for any given sector (e.g., FH 30Y PMMS, MBA FH 30Y, FH 15Y PMMS and FH 5/1 PMMS) and adjusting the lag time to anywhere from zero to 99 days.   
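In simplified form, the lag assumption feeds the incentive calculation behind an S-curve as sketched below: each loan’s note rate is compared to the prevailing mortgage rate as of the observation date minus the lag. This is a generic illustration, not the Edge Platform’s implementation; the series and column names are assumptions.

```python
# Generic sketch of refi incentive with a configurable lag (not the Edge implementation).
import pandas as pd

def refi_incentive(loans: pd.DataFrame, prevailing_rate: pd.Series, lag_days: int = 42) -> pd.Series:
    """
    loans: one row per loan-observation with 'note_rate' and 'as_of' (observation date).
    prevailing_rate: weekly prevailing mortgage rate (e.g., a 30Y survey rate), indexed by date.
    """
    decision_dates = loans["as_of"] - pd.Timedelta(days=lag_days)
    filled = prevailing_rate.reindex(prevailing_rate.index.union(decision_dates)).sort_index().ffill()
    return pd.Series(loans["note_rate"].values - filled.loc[decision_dates].values, index=loans.index)
```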

Comprehensive details of this and other new capabilities are available by requesting a no-obligation live demo at riskspan.com.


This new functionality is the latest in a series of enhancements that is making the Edge Platform increasingly indispensable for Agency MBS traders and investors.  

###

About RiskSpan, Inc. 

RiskSpan offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics. 

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments. 

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com. 

Media contact: Timothy Willis


Quantifying the Impact of Climate Risk on Housing Finance 

When people speak of the risk climate poses to housing, they typically do so in qualitative and relative terms. A Florida home is at greater risk of hurricane damage than an Iowa home. Wildfires generally threaten homes in northern California more than they threaten homes in New Hampshire. And because of climate change, the risk these and other perils pose to any individual geographical area is largely viewed as higher than it was 25 years ago.

People feel comfortable speaking in these general terms. But qualitative estimates are of little practical use to mortgage investors seeking to fine-tune their pricing, prepayment, and default models. These analytical frameworks require not just reliable data but the means to translate them into actionable risk metrics.   

Physical risks and transition risks

Broadly speaking, climate risk manifests itself as a combination of physical risks and transition risks. Physical risks include “acute” disaster events, such as hurricanes, tornadoes, wildfires, and floods. Chronic risks, such as sea level rise, extreme temperatures, and drought, are experienced over a longer period. Transition risks relate to costs resulting from regulations promulgated to combat climate change and from the need to invest in new technologies designed either to combat climate change directly or mitigate its effects.

Some of the ways in which these risks impact mortgage assets are self-evident. Acute events that damage or destroy homes have an obvious effect on the performance of the underlying mortgages. Other mechanisms are more latent but no less real. Increasing costs of homeownership, caused by required investment in climate-change-mitigating technologies, can be a source of financial stress for some borrowers and affect mortgage performance. Likewise, as flood and other hazard insurance premiums adjust to better reflect the reality of certain geographies’ increasing exposure to natural disaster risk, demand for real estate in these areas could decrease, increasing the pressure on existing homeowners who may not have much cushion in their LTVs to begin with.

Mortgage portfolio risk management

At the individual loan level, these risks translate to higher delinquency risks, probability of default, loss given default, spreads, and advance expenses. At the portfolio level, the impact is felt in asset valuation, concentration risk (what percentage of homes in the portfolio are located in high-risk areas), VaR, and catastrophic tail risk.

VaR can be computed using natural hazard risk models designed to forecast the probability of individual perils for a given geography and using that probability to compute the worst property loss (total physical loss and loss net of insurance proceeds) that can be expected during the portfolio’s expected life at the 99 percent (or 95 percent) confidence level. The following figure illustrates how this works for a portfolio covering multiple geographies with varying types and likelihoods of natural hazard risk.

[Figure: Climate risk dashboard – acute risk across a multi-geography portfolio]

These analyses can look at the exposure of an entire portfolio to all perils combined:    

[Figure: Climate risk dashboard – U.S. portfolio exposure to all perils combined]

Or they can look at the exposure of a single geographic area to one peril in particular:

[Figure: Climate risk dashboard – Florida exposure to a single peril]
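A stripped-down sketch of the VaR calculation described above might look like the following: simulate peril losses for each property over the portfolio’s expected life and read off the 99th (or 95th) percentile of total portfolio loss. The event frequencies, severities, and property values below are placeholders, not output from a calibrated natural hazard model.

```python
# Toy peril-loss VaR: simulate event counts and losses per property, then take a percentile.
import numpy as np

rng = np.random.default_rng(0)

def simulate_portfolio_loss(values, annual_event_prob, severity, life_years=8, n_sims=10_000):
    """values, annual_event_prob, severity: one entry per property (placeholder inputs)."""
    losses = np.zeros(n_sims)
    for v, p, s in zip(values, annual_event_prob, severity):
        events = rng.binomial(life_years, p, size=n_sims)  # damaging events over the holding period
        losses += events * v * s                           # loss per event = value x severity
    return losses

losses = simulate_portfolio_loss(
    values=np.array([400_000, 250_000, 600_000]),
    annual_event_prob=np.array([0.02, 0.005, 0.01]),   # e.g., hurricane / wildfire / flood
    severity=np.array([0.30, 0.50, 0.25]),             # fraction of value lost per event
)
var_99 = np.percentile(losses, 99)   # worst expected loss at the 99 percent confidence level
```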

Accounting for climate risk when bidding on whole loans

The risks quantified above pertain to properties that secure mortgages and therefore only indirectly to the mortgage assets themselves. Investors seeking to build whole-loan portfolios that are resilient to climate risk should consider climate risk in the context of other risk factors. Such a “property-level climate risk” approach takes into account factors such as:

  • Whether the property is insured against the peril in question
  • The estimated expected risk (and tail risk) of property damage by the peril in question
  • Loan-to-value ratio

The most prudent course of action involves a screening mechanism that includes pricing and concentration limits tied to LTV ratios. Investors may choose to invest in areas of high climate risk but only in loans with low LTV ratios. Bids should be adjusted to account for climate risk, but the amount of the adjustment can be a function of the LTV. Concentration limits should be adjusted accordingly:

[Figure: Climate risk pricing adjustments and concentration limits]
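As a purely illustrative example of such a screen (the thresholds and adjustment values below are placeholders, not recommendations), a bid adjustment and pass/fail concentration decision might be tied to a climate risk score and LTV as follows:

```python
# Hypothetical climate-aware bid screen: pricing adjustment (in bps) as a function of
# climate risk score and LTV, with a hard screen-out for high-risk, high-LTV loans.
from typing import Optional

def bid_adjustment_bps(climate_risk_score: int, ltv: float) -> Optional[float]:
    """Return a price adjustment in basis points, or None if the loan fails the screen."""
    if climate_risk_score >= 4 and ltv > 0.80:
        return None          # fails screen: high climate risk with little equity cushion
    if climate_risk_score >= 4:
        return -50.0         # high risk but low LTV: bid, at a meaningful discount
    if climate_risk_score >= 2:
        return -15.0 if ltv > 0.80 else -5.0
    return 0.0               # low climate risk: no adjustment
```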

Conclusion

When assessing the impact of climate risk on a mortgage portfolio, investors need to consider and seek to quantify not just how natural hazard events will affect home values but also how they will affect borrower behavior, specifically in terms of prepayments, delinquencies, and defaults.

We are already beginning to see climate factors working their way into the secondary mortgage markets via pricing adjustments and concentration screening. It is only a matter of time before these considerations move further up into the origination process and begin to manifest themselves in pricing and underwriting policy (as flood insurance requirements already have today).

Investors looking for a place to start can begin by incorporating a climate risk score into their existing credit box/pricing grid, as illustrated above. This will help provide at least a modicum of comfort to investors that they are being compensated for these hidden risks and (at least as important) will ensure that portfolios do not become overly concentrated in at-risk areas.


Industry Veteran Patricia Black Named RiskSpan Chief Client Officer

ARLINGTON, Va., Sept. 19, 2022 — RiskSpan, a leading technology company and the most comprehensive source for data management and analytics for residential mortgage and structured products, has appointed Patricia Black as its Chief Client Officer.  

Black takes over responsibility for managing client success across the full array of RiskSpan’s Edge Platform and services offerings. She brings more than twenty years of diversified experience as a senior financial services executive. Her expertise ranges from enterprise risk management, compliance, finance, program management, audit and controls to operations and technology, regulatory requirements, and corporate governance.

As a senior leader at Fannie Mae between 2005 and 2016, Black served in a number of key roles, including as Chief Audit Executive in the aftermath of the 2008 financial crisis, Head of Strategic Initiatives, and Head of Financial Controls and SOX while the firm underwent an extensive earnings restatement process.  

More recently, Black headed operations at SoFi Home Loans where she expanded the company’s partner relationships, technological capabilities, and risk management practices. Prior to SoFi, as Chief of Staff at Caliber Home Loans, she was an enterprise leader focusing on transformation, strategy, technology and operations. 

“Tricia’s reputation throughout the mortgage industry for building collaborative relationships in challenging environments and working across organizational boundaries to achieve targeted outcomes is second to none,” said Bernadette Kogler, CEO of RiskSpan. “Her astounding breadth of expertise will contribute to the success of our clients by helping ensure we are optimally structured to serve them.”  

“I feel it a privilege to be able to serve RiskSpan’s impressive and growing clientele in this new capacity,” said Black. “I look forward to helping these forward-thinking institutions rethink their mortgage and structured finance data and analytics and fully maximize their investment in RiskSpan’s award-winning platform and services.” 


About RiskSpan, Inc.  

RiskSpan offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics. 

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments. 

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com. 


Webinar Recording: Bumpy Road Ahead for GNMA MBS?

Recorded: Thursday, September 29th | 3:30 p.m. EDT

The panel discusses the likely impact of recent, and potential future, market events on GNMA MBS. Topics discussed include:

  • How will the forthcoming, more stringent originator/servicer financial eligibility requirements affect origination volumes, buyouts, and performance?
  • Who will fill the vacuum left by Wells Fargo?
  • What role will falling prices play in delinquency and buyout rates?
  • What will be the impact of potential Fed MBS sales?

Presenters

Mahesh Swaminathan, CFA

Managing Director, MBS/ABS Strategist, Hilltop Securities

Fowad Sheikh

Senior Managing Director, RiskSpan

Mike Ortiz

Agency MBS Analyst, DoubleLine Group LP

 


Rising Rates; Rising Temperatures: What Higher Interest Rates Portend for Mortgage Climate Risk — An interview with Janet Jozwik  

Janet Jozwik leads RiskSpan’s sustainability analytics (climate risk and ESG) team. She is also an expert in mortgage credit risk and a recognized industry thought leader on incorporating climate risk into credit modeling. We sat down with Janet to get her views on whether the current macroeconomic environment should impact how mortgage investors prioritize their climate risk mitigation strategies.


You contend that higher interest rates are exposing mortgage lenders and investors to increased climate risk. Why is that?

JJ: My concern is primarily around the impact of higher rates on credit risk overall, of which climate risk is merely a subset – a largely overlooked and underappreciated subset, to be sure, and one with potentially devastating consequences, but ultimately one of many. The simple reason is that, because interest rates are up, loans are going to remain on your books longer. The MBA’s recent announcement of refinance applications (and mortgage originations overall) hitting their lowest levels since 2000 is stark evidence of this.

And because these loans are going to be lasting longer, borrowers will have more opportunities to get into trouble (be it a loss of income or a natural disaster) and everybody should be taking credit risk more seriously. One of the biggest challenges posed by a high-rate environment is that borrowers don’t have a lot of the “outs” available to them that they do when they encounter stress during more favorable macroeconomic environments. They can no longer simply refi into a lower rate. Modification options become more complicated. They might have no option other than to sell the home – and even that isn’t going to be as easy as it was, say, a year ago. So, we’ve entered this phase where credit risk analytics, both at origination and life of loan, really need to be taken seriously. And credit risk includes climate risk.

So longer durations mean more exposure to credit risk – more time for borrowers to run into trouble and experience credit events. What does climate have to do with it? Doesn’t homeowners’ insurance mitigate most of this risk anyway?

JJ: Each additional month or year that a mortgage loan remains outstanding is another month or year that the underlying property is exposed to some form of natural disaster risk (hurricane, flood, wildfire, earthquake, etc.). When you look at a portfolio in aggregate – one whose weighted average life has suddenly ballooned from four years to, say, eight years – it is going to experience more events, more things happening to it. Credit risk is the risk of a borrower failing to make contractual payments. And having a home get blown down or flooded by a hurricane tends to have a dampening effect on timely payment of principal and interest.
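A simple back-of-the-envelope illustration of the point (the 2 percent annual event probability is purely an assumption): doubling the weighted average life roughly doubles both the expected number of damaging events and the probability of experiencing at least one.

```python
# Longer weighted average life -> more exposure to natural-hazard events (illustrative only).
annual_event_prob = 0.02  # assumed probability of a damaging event in any given year
for wal_years in (4, 8):
    expected_events = annual_event_prob * wal_years
    p_at_least_one = 1 - (1 - annual_event_prob) ** wal_years
    print(f"WAL {wal_years}y: expected events = {expected_events:.2f}, "
          f"P(at least one) = {p_at_least_one:.1%}")
```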

As for insurance, yes, insurance mitigates portfolio exposure to catastrophic loss to some degree. But remember that not everyone has flood insurance, and many loans don’t require it. Hurricane-specific policies often come with very high deductibles and don’t always cover all the damage. Many properties lack wildfire insurance or the coverage may not be adequate. Insurance is important and valuable but should not be viewed as a panacea or a substitute for good credit-risk management or taking climate into account when making credit decisions.

But the disaster is going to hit when the disaster is going to hit, isn’t it? How should I be thinking about this if I am a lender who recaptures a considerable portion of my refis? Haven’t I just effectively replaced three shorter-lived assets with a single longer-lived one? Either way, my portfolio’s going to take a hit, right?

JJ: That is true as far as it goes. In the steady state you are envisioning – one where you’re just churning through your portfolio, prepaying existing loans with refis that look exactly like the loans they’re replacing – the risk will indeed be similar, irrespective of expected duration.

But do not forget that each time a loan turns over, a lender is afforded an opportunity to reassess pricing (or even reassess the whole credit box). Every refi is an opportunity to take climate and other credit risks into account and price them in. But in a high-rate environment, you’re essentially stuck with your credit decisions for the long haul.

Do home prices play any role in this?

JJ: Near-zero interest rates fueled a run-up in home prices like nothing we’ve ever seen before. This arguably made disciplined credit-risk management less important because, worst case, all the new equity in a property served as a buffer against loss.

But at some level, we all had to know that these home prices were not universally sustainable. And now that interest rates are back up, existing home prices are suddenly starting to look a little iffy. Suddenly, with cash-out refis off the table and virtually no one in the money for rate and term refis, weighted average lives have nowhere to go but up. This is great, of course, if your only exposure is prepayment risk. But credit risk is a different story.

And so, extremely low interest rates over an extended period played a significant role in unsustainably high home values. But the pandemic had a lot to do with it, as well. It’s well documented that the mass influx of home buyers into cities like Boise from larger, traditionally more expensive markets drove prices in those smaller cities to astronomical levels. Some of these markets (like Boise) have not only reached an equilibrium point but are starting to see property values decline. Lenders with excessive exposure to these traditionally smaller markets that experienced the sharpest home price increases during the pandemic will need to take a hard look at their credit models’ HPI assumptions (in addition to those properties’ climate risk exposure).

What actions should lenders and investors be considering today?

JJ: If you are looking for a silver lining in the fact that origination volumes have fallen off a cliff, it has afforded the market an opportunity to catch its breath and reassess where it stands risk-wise. Resources that had been fully deployed in an effort simply to keep up with the volume can now be reallocated to taking a hard look at where the portfolio stands in terms of credit risk generally and climate risk in particular.

This includes assessing where the risks and concentrations are in mortgage portfolios and, first, making sure not to further exacerbate existing concentration risks by continuing to acquire new assets in overly exposed geographies. Investors may even be wise to go so far as to consider selling certain assets if they feel they have too much risk in problematic areas.

Above all, this is a time when lenders need to be taking a hard look at the fundamentals underpinning their underwriting standards. We are coming up on 15 years since the start of the “Great Recession” – the last time mortgage underwriting was really “tight.” For the past decade, the industry has had nothing but calm waters – rising home values and historically low interest rates. It’s been like tech stocks in the ‘90s. Lenders couldn’t help but make money.

I am concerned that this has allowed complacency to take hold. We’re in a new world now. One with shaky home prices and more realistic interest rates. The temptation will be to loosen underwriting standards in order to wring whatever volume might be available out of the economy. But in reality, lenders need to be doing precisely the opposite. Underwriting standards are going to have to tighten a bit in order to effectively manage the increased credit (and climate) risks inherent to longer-duration lending.

It’s okay for lenders and investors to be taking these new risks on. They just need to be doing it with their eyes wide open and they need to be pricing for it.


Live Demo of RiskSpan’s Award-Winning Edge Platform

Wednesday, August 24th | 1:00 p.m. EDT

Live Demo of RiskSpan’s award-winning Edge Platform. Learn more and ask questions at our bi-weekly, 45-minute demo.

Historical Performance Tool: Slice and dice historical loan performance in the Agency and PLRMBS universe to find outperforming cohorts.

Predictive Loan-Level Pricing and Risk Analytics: Produce loan-level pricing and risk on loans, MSRs, and structured products in minutes – with behavioral models applied at the loan-level, and other assumptions applied conveniently to inform bids and hedging.

Loan Data Management: Let RiskSpan’s data scientists consolidate and enhance your data across origination and servicing platforms, make it analytics-ready, and maintain it for ongoing trend analysis.


About RiskSpan:

RiskSpan offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics.

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments.

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com.

Presenters

Joe Makepeace

Director, RiskSpan

Jordan Parker

Sales Executive, RiskSpan

