
Industry Veteran Patricia Black Named RiskSpan Chief Client Officer

ARLINGTON, Va., Sept. 19, 2022 — RiskSpan, a leading technology company and the most comprehensive source for data management and analytics for residential mortgage and structured products, has appointed Patricia Black as its Chief Client Officer.  

Black takes over responsibility for managing client success across the full array of RiskSpan's Edge Platform and services offerings. She brings more than twenty years of diversified experience as a senior financial services executive. Her expertise ranges from enterprise risk management, compliance, finance, program management, audit and controls to operations and technology, regulatory requirements, and corporate governance.

As a senior leader at Fannie Mae between 2005 and 2016, Black served in a number of key roles, including as Chief Audit Executive in the aftermath of the 2008 financial crisis, Head of Strategic Initiatives, and Head of Financial Controls and SOX while the firm underwent an extensive earnings restatement process.  

More recently, Black headed operations at SoFi Home Loans where she expanded the company’s partner relationships, technological capabilities, and risk management practices. Prior to SoFi, as Chief of Staff at Caliber Home Loans, she was an enterprise leader focusing on transformation, strategy, technology and operations. 

“Tricia’s reputation throughout the mortgage industry for building collaborative relationships in challenging environments and working across organizational boundaries to achieve targeted outcomes is second to none,” said Bernadette Kogler, CEO of RiskSpan. “Her astounding breadth of expertise will contribute to the success of our clients by helping ensure we are optimally structured to serve them.”  

“I feel it a privilege to be able to serve RiskSpan’s impressive and growing clientele in this new capacity,” said Black. “I look forward to helping these forward-thinking institutions rethink their mortgage and structured finance data and analytics and fully maximize their investment in RiskSpan’s award-winning platform and services.” 


About RiskSpan, Inc.  

RiskSpan offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics. 

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments. 

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com. 


“Reject Inference” Methods in Credit Modeling: What are the Challenges?

Reject inference is a popular concept that has been used in credit modeling for decades. Yet, we observe in our work validating credit models that the concept is still dynamically evolving. The appeal of reject inference, whose aim is to develop a credit scoring model utilizing all available data, including that of rejected applicants, is easy enough to grasp. But the technique also introduces a number of fairly vexing challenges.

The technique seeks to rectify a fundamental shortcoming in traditional credit modeling: Models predicting the probability that a loan applicant will repay the loan can be trained on historical loan application data with a binary variable representing whether a loan was repaid or charged off. This information, however, is only available for accepted applications. And many of these applications are not particularly recent. This limitation results in a training dataset that may not be representative of the broader loan application universe.

Credit modelers have devised several techniques for getting around this data representativeness problem and increasing the number of observations by inferring the repayment status of rejected loan applications. These techniques, while well intentioned, are often treated empirically and lack a deeper theoretical basis. They often result in "hidden" modeling assumptions, the reasonableness of which is not fully investigated. Additionally, no theoretical properties of the coefficient estimates or predictions are guaranteed.

This article summarizes the main challenges of reject inference that we have encountered in our model validation practice.


Selecting the Right Reject Inference Method

Many approaches exist for reject inference, none of which is clearly and universally superior to all the others. Empirical studies have been conducted to compare methods and pick a winner, but the conclusions of these studies are often contradictory. Some authors argue that reject inference cannot improve scorecard models [1] and flatly recommend against its use. Others posit that certain techniques can outperform others [2] based on empirical experiments. The results of these experiments, however, tend to be data dependent. Some of the most popular approaches include the following:

  • Ignoring rejected applications: The simplest approach is to develop a credit scoring model based only on accepted applications. The underlying assumption is that rejected applications can be ignored and that the “missingness” of this data from the training dataset can be classified as missing at random. Supporters of this method point to the simplicity of the implementation, clear assumptions, and good empirical results. Others argue that the rejected applications cannot be dismissed simply as random missing data and thus should not be ignored.
  • Hard cut-off method: In this method, a model is first trained using only accepted application data. This trained model is then used to predict the probabilities of charge-off for the rejected applications. A cut-off value is then chosen. Hypothetical loans from rejected applications with probabilities higher than this cut-off value are considered charged off. Hypothetical loans from the remaining applications are assumed to be repaid. The specified model is then re-trained using a dataset including both accepted and rejected applications. (A minimal code sketch of this method follows this list.)
  • Fuzzy augmentation: Similar to the hard cut-off method, fuzzy augmentation begins by training the model on accepted applications only. The resulting model with estimated coefficients is then used to predict charge-off probabilities for rejected applications. Data from rejected applications is then duplicated and a repaid or charged-off status is assigned to each. The specified model is then retrained on the augmented dataset—including accepted applications and the duplicated rejects. Each rejected application is weighted by either a) the predicted probability of charge-off if its assigned status is “charged-off,” or b) the predicted probability of it being repaid if its assigned status is “repaid.”
  • Parceling: The parceling method resembles the hard cut-off method. However, rather than classifying all rejects above a certain threshold as charged-off, this method classifies the repayment status in proportion to the expected "bad" rate (charge-off frequency) at that score. The predicted charge-off probabilities are partitioned into k intervals. Then, for each interval, an assumption is made about the bad rate, and loan applications in each interval are assigned a repayment status randomly according to the bad rate. Bad rates are assumed to be higher in the reject dataset than among the accepted loans. This method considers the missingness to be not at random (MNAR), which requires the modeler to supply additional information about the distribution of charge-offs among rejects.
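To make the mechanics concrete, below is a minimal sketch of the hard cut-off method described above. It is an illustration under stated assumptions, not a prescribed implementation: the column names, the feature list, and the 0.5 cut-off are all hypothetical, and the same logistic regression specification is reused for the preliminary and final models.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = ["fico", "dti", "ltv"]   # hypothetical predictor set
CUTOFF = 0.5                        # hypothetical charge-off probability cut-off


def hard_cutoff_reject_inference(accepted: pd.DataFrame,
                                 rejected: pd.DataFrame) -> LogisticRegression:
    """Hard cut-off reject inference sketch.

    'charged_off' is an assumed 1/0 outcome column, observed only for accepted loans.
    """
    # Step 1: train a preliminary scorecard on accepted applications only
    prelim = LogisticRegression(max_iter=1000)
    prelim.fit(accepted[FEATURES], accepted["charged_off"])

    # Step 2: score the rejects and assign a hypothetical repayment status
    rejected = rejected.copy()
    p_bad = prelim.predict_proba(rejected[FEATURES])[:, 1]
    rejected["charged_off"] = (p_bad > CUTOFF).astype(int)

    # Step 3: retrain the specified model on the augmented dataset
    augmented = pd.concat([accepted, rejected], ignore_index=True)
    final_model = LogisticRegression(max_iter=1000)
    final_model.fit(augmented[FEATURES], augmented["charged_off"])
    return final_model
```

The fuzzy augmentation and parceling methods differ only in steps 2 and 3: rather than applying a single hard threshold, the rejects are either duplicated and weighted by their predicted probabilities or assigned statuses randomly in proportion to an assumed bad rate within each score interval.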

Proportion of Accepted Applications to Rejects

An institution with a relatively high percentage of rejected applications will necessarily end up with an augmented training dataset whose quality is heavily dependent on the quality of the selected reject inference method and its implementation. One might argue it is best to limit the proportion of rejected applications to acceptances. The level at which such a cap is established should reflect the “confidence” in the method used. Estimating such a confidence level, however, is a highly subjective endeavor.

The Proportion of Bad Rates for Accepts and Rejects

It is reasonable to assume that the “bad rate,” i.e., proportion of charged-off loans to repaid loans, will be higher among rejected applications. Some modelers set a threshold based on their a priori belief that the bad rate among rejects is at least p-times the bad rate among acceptances. If the selected reject inference method produces a dataset with a bad rate that is perceived to be artificially low, actions are taken to increase the bad rate above some threshold. Identifying where to establish this threshold is notoriously difficult to justify.
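Purely for illustration, a check of this kind might look like the following sketch. The multiplier p, the column name, and the idea of flagging (rather than automatically adjusting) the inferred dataset are assumptions made for the example.

```python
import pandas as pd


def reject_bad_rate_ok(accepted: pd.DataFrame,
                       inferred_rejects: pd.DataFrame,
                       p: float = 2.0) -> bool:
    """Return True if the inferred rejects satisfy the a priori belief that
    their bad rate is at least p times the bad rate among accepted loans.
    'charged_off' (1/0) is an assumed column name; p = 2.0 is illustrative."""
    bad_rate_accepted = accepted["charged_off"].mean()
    bad_rate_rejects = inferred_rejects["charged_off"].mean()
    return bad_rate_rejects >= p * bad_rate_accepted
```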

Variable Selection

As outlined above, most approaches begin by estimating a preliminary model based on accepted applications only. This model is then used to infer how rejected loans would have performed. The preliminary model is then retrained on a dataset consisting both of actual data from accepted applications and of the inferred data from rejects. This means that the underlying variables themselves are selected based only on the actual loan performance data from accepted applications. The statistical significance of the selected variables might change, however, when moving to the complete dataset. Variable selection is sometimes redone using the complete data. This, however, can lead to overfitting.

Measuring Model Performance

From a model validator’s perspective, an ideal solution would involve creating a control group in which applications would not be scored and filtered and every application would be accepted. Then the discriminating power of a credit model could be assessed by comparing the charge-off rate of the control group with the charge-off rate of the loans accepted by the model. This approach of extending credit indiscriminately is impractical, however, as it would require the lender to engage in some degree of irresponsible lending.

Another approach is to create a test set. The dilemma here is whether to include only accepted applications. A test set that includes only accepted applications will not necessarily reflect the population for which the model will be used. Including rejected applications, however, obviously necessitates the use of reject inference. For all the reasons laid out above, this approach risks overstating the model’s performance due to the fact that a similar model (trained only on the accepted cases) was used for reject inference.

A third approach that avoids both of these problems involves using information criteria such as AIC and BIC. This, however, is useful only when comparing different models (for model or variable selection). The values of information criteria cannot be interpreted as an absolute measure of performance.

A final option is to run several models in production (a main model and one or more challenger models). Under this scenario, each application would be evaluated by a model selected at random. The models can then be compared retroactively by calculating their bad rates on accepted applications after the financed loans mature. Provided that the accept rates are similar, the model with the lowest bad rate is the best.
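Assuming two models in production, applications routed to them at random, and outcomes observed once the financed loans mature, the retrospective comparison could be as simple as the following sketch (the column names are hypothetical):

```python
import pandas as pd


def compare_models(applications: pd.DataFrame) -> pd.DataFrame:
    """Champion/challenger comparison sketch.

    Assumed columns:
      'model'       - which model scored the application
      'accepted'    - 1 if the application was approved, else 0
      'charged_off' - 1 if the financed loan ultimately charged off, else 0
                      (meaningful only for accepted, matured loans)
    """
    funded = applications[applications["accepted"] == 1]
    summary = funded.groupby("model").agg(
        funded_loans=("accepted", "size"),
        bad_rate=("charged_off", "mean"),
    )
    summary["accept_rate"] = summary["funded_loans"] / applications.groupby("model").size()
    return summary
```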

Conclusion

Reject inference remains an evolving area of credit modeling. Its ability to improve model performance is still the subject of intense debate. Current results suggest that while reject inference can improve model performance, its application can also lead to overfitting, thus worsening the ability to generalize. The lack of a strong theoretical basis for reject inference methods means that applications of reject inference need to rely on empirical results. Thus, if reject inference is used, key model stakeholders need to possess a deep understanding of the modeled population, have strong domain knowledge, emphasize conducting experiments to justify the applied modeling techniques, and, above all, adopt and follow a solid ongoing monitoring plan.

Doing this will result in a modeling methodology that is most likely to produce reliable outputs for the institution while also satisfying MRM and validator requirements.


[1] https://www.sciencedirect.com/science/article/abs/pii/S0378426603002036

[2] https://economix.fr/pdf/dt/2016/WP_EcoX_2016-10.pdf


Rising Rates; Rising Temperatures: What Higher Interest Rates Portend for Mortgage Climate Risk — An interview with Janet Jozwik  

Janet Jozwik leads RiskSpan’s sustainability analytics (climate risk and ESG) team. She is also an expert in mortgage credit risk and a recognized industry thought leader on incorporating climate risk into credit modeling. We sat down with Janet to get her views on whether the current macroeconomic environment should impact how mortgage investors prioritize their climate risk mitigation strategies.


You contend that higher interest rates are exposing mortgage lenders and investors to increased climate risk. Why is that?

JJ: My concern is primarily around the impact of higher rates on credit risk overall, of which climate risk is merely a subset – a largely overlooked and underappreciated subset, to be sure, and one with potentially devastating consequences, but ultimately one of many. The simple reason is that, because interest rates are up, loans are going to remain on your books longer. The MBA’s recent announcement of refinance applications (and mortgage originations overall) hitting their lowest levels since 2000 is stark evidence of this.

And because these loans are going to be lasting longer, borrowers will have more opportunities to get into trouble (be it a loss of income or a natural disaster) and everybody should be taking credit risk more seriously. One of the biggest challenges posed by a high-rate environment is that borrowers don't have a lot of the "outs" available to them that they have when they encounter stress during more favorable macroeconomic environments. They can no longer simply refi into a lower rate. Modification options become more complicated. They might have no option other than to sell the home – and even that isn't going to be as easy as it was, say, a year ago. So, we've entered this phase where credit risk analytics, both at origination and life of loan, really need to be taken seriously. And credit risk includes climate risk.

So longer durations mean more exposure to credit risk – more time for borrowers to run into trouble and experience credit events. What does climate have to do with it? Doesn’t homeowners’ insurance mitigate most of this risk anyway?

JJ: Each additional month or year that a mortgage loan remains outstanding is another month or year that the underlying property is exposed to some form of natural disaster risk (hurricane, flood, wildfire, earthquake, etc.). When you look at a portfolio in aggregate – one whose weighted average life has suddenly ballooned from four years to, say, eight years – it is going to experience more events, more things happening to it. Credit risk is the risk of a borrower failing to make contractual payments. And having a home get blown down or flooded by a hurricane tends to have a dampening effect on timely payment of principal and interest.

As for insurance, yes, insurance mitigates portfolio exposure to catastrophic loss to some degree. But remember that not everyone has flood insurance, and many loans don’t require it. Hurricane-specific policies often come with very high deductibles and don’t always cover all the damage. Many properties lack wildfire insurance or the coverage may not be adequate. Insurance is important and valuable but should not be viewed as a panacea or a substitute for good credit-risk management or taking climate into account when making credit decisions.

But the disaster is going to hit when the disaster is going to hit, isn’t it? How should I be thinking about this if I am a lender who recaptures a considerable portion of my refis? Haven’t I just effectively replaced three shorter-lived assets with a single longer-lived one? Either way, my portfolio’s going to take a hit, right?

JJ: That is true as far as it goes. And if you are in the steady state you are envisioning, one where you're just churning through your portfolio, prepaying existing loans with refis that look exactly like the loans they're replacing, then, yes, the risk will be similar, irrespective of expected duration.

But do not forget that each time a loan turns over, a lender is afforded an opportunity to reassess pricing (or even reassess the whole credit box). Every refi is an opportunity to take climate and other credit risks into account and price them in. But in a high-rate environment, you’re essentially stuck with your credit decisions for the long haul.

Do home prices play any role in this?

JJ: Near-zero interest rates fueled a run-up in home prices like nothing we’ve ever seen before. This arguably made disciplined credit-risk management less important because, worst case, all the new equity in a property served as a buffer against loss.

But at some level, we all had to know that these home prices were not universally sustainable. And now that interest rates are back up, existing home prices are suddenly starting to look a little iffy. Suddenly, with cash-out refis off the table and virtually no one in the money for rate and term refis, weighted average lives have nowhere to go but up. This is great, of course, if your only exposure is prepayment risk. But credit risk is a different story.

And so, extremely low interest rates over an extended period played a significant role in unsustainably high home values. But the pandemic had a lot to do with it, as well. It’s well documented that the mass influx of home buyers into cities like Boise from larger, traditionally more expensive markets drove prices in those smaller cities to astronomical levels. Some of these markets (like Boise) have not only reached an equilibrium point but are starting to see property values decline. Lenders with excessive exposure to these traditionally smaller markets that experienced the sharpest home price increases during the pandemic will need to take a hard look at their credit models’ HPI assumptions (in addition to those properties’ climate risk exposure).

What actions should lenders and investors be considering today?

JJ: If you are looking for a silver lining in the fact that origination volumes have fallen off a cliff, it has afforded the market an opportunity to catch its breath and reassess where it stands risk-wise. Resources that had been fully deployed in an effort simply to keep up with the volume can now be reallocated to taking a hard look at where the portfolio stands in terms of credit risk generally and climate risk in particular.

This includes assessing where the risks and concentrations are in mortgage portfolios and, first, making sure not to further exacerbate existing concentration risks by continuing to acquire new assets in overly exposed geographies. Investors may even be wise to go so far as to consider selling certain assets if they feel they have too much risk in problematic areas.

Above all, this is a time when lenders need to be taking a hard look at the fundamentals underpinning their underwriting standards. We are coming up on 15 years since the start of the “Great Recession” – the last time mortgage underwriting was really “tight.” For the past decade, the industry has had nothing but calm waters – rising home values and historically low interest rates. It’s been like tech stocks in the ‘90s. Lenders couldn’t help but make money.

I am concerned that this has allowed complacency to take hold. We're in a new world now. One with shaky home prices and more realistic interest rates. The temptation will be to loosen underwriting standards in order to wring whatever volume might be available out of the economy. But in reality, lenders need to be doing precisely the opposite. Underwriting standards are going to have to tighten a bit in order to effectively manage the increased credit (and climate) risks inherent to longer-duration lending.

It’s okay for lenders and investors to be taking these new risks on. They just need to be doing it with their eyes wide open and they need to be pricing for it.


It’s time to move to DaaS — Why it matters for loan and MSR investors

Data as a service, or DaaS, for loans and MSR investors is fast becoming the difference between profitable trades and near misses.

Granularity of data is creating differentiation among investors. To win at investing in loans and mortgage servicing rights requires effectively managing a veritable ocean of loan-level data. Buried within every detailed tape of borrower, property, loan and performance characteristics lies the key to identifying hidden exposures and camouflaged investment opportunities. Understanding these exposures and opportunities is essential to proper bidding during the acquisition process and effective risk management once the portfolio is onboarded.

Investors know this. But knowing that loan data conceals important answers is not enough. Even knowing which specific fields and relationships are most important is not enough. Investors also must be able to get at that data. And because mortgage data is inherently messy, investors often run into trouble extracting the answers they need from it.

For investors, it boils down to two options. They can compel analysts to spend 75 percent of their time wrangling unwieldy data – plugging holes, fixing outliers, making sure everything is mapped right. Or they can just let somebody else worry about all that so they can focus on more analytical matters.

Don’t get left behind — DaaS for loan and MSR investors

It should go without saying that the “let somebody else worry about all that” approach only works if “somebody else” possesses the requisite expertise with mortgage data. Self-proclaimed data experts abound. But handing the process over to an outside data team lacking the right domain experience risks creating more problems than it solves.

Ideally, DaaS for loan and MSR investors consists of a data owner handing off these responsibilities to a third party that can deliver value in ways that go beyond simply maintaining, aggregating, storing and quality controlling loan data. All these functions are critically important. But a truly comprehensive DaaS provider is one whose data expertise is complemented by an ability to help loan and MSR investors understand whether portfolios are well conceived. A comprehensive DaaS provider helps investors ensure that they are not taking on hidden risks (for which they are not being adequately compensated in pricing or servicing fee structure).

True DaaS frees up loan and MSR investors to spend more time on higher-level tasks consistent with their expertise. The more “blocking and tackling” aspects of data management that every institution that owns these assets needs to deal with can be handled in a more scalable and organized way. Cloud-native DaaS platforms are what make this scalability possible.

Scalability — stop reinventing the wheel with each new servicer

One of the most challenging aspects of managing a portfolio of loans or MSRs is the need to manage different types of investor reporting data pipelines from different servicers. What if, instead of having to “reinvent the wheel” to figure out data intake every time a new servicer comes on board, “somebody else” could take care of that for you?

An effective DaaS provider is one that is not only well versed in building and maintaining loan data pipes from servicers to investors but has also established a library of existing servicer linkages. An ideal provider is one already set up to onboard servicer data directly onto its own DaaS platform. Investors achieve enormous economies of scale by having to integrate with a single platform as opposed to a dozen or more individual servicer integrations. Ultimately, as more investors adopt DaaS, the number of centralized servicer integrations will increase, and greater economies will be realized across the industry.

Connectivity is only half the benefit. The DaaS provider not only intakes, translates, maps, and hosts the loan-level static and dynamic data coming over from servicers. The DaaS provider also takes care of QC, cleaning, and managing it. DaaS providers see more loan data than any one investor or servicer. Consequently, the AI tools an experienced DaaS provider uses to map and clean incoming loan data have had more opportunities to learn. Loan data that has been run through a DaaS provider’s algorithms will almost always be more analytically valuable than the same loan data processed by the investor alone.  

Investors seeking to increase their footprint in the loan and MSR space obviously do not wish to see their data management costs rise in proportion to the size of their portfolios. Outsourcing to a DaaS provider that specializes in mortgages, like RiskSpan, helps investors build their book while keeping data costs contained.

Save time and money – Make better bids

For all these reasons, DaaS is unquestionably the future (and, increasingly, the present) of loan and MSR data management. Investors are finding that a decision to delay DaaS migration comes with very real costs, particularly as data science labor becomes increasingly (and often prohibitively) expensive.

The sooner an investor opts to outsource these functions to a DaaS provider, the sooner that investor will begin to reap the benefits of an optimally cost-effective portfolio structure. One RiskSpan DaaS client reported a 50 percent reduction in data management costs alone.

Investors continuing to make do with in-house data management solutions will quickly find themselves at a distinct bidding disadvantage. DaaS-aided bidders have the advantage of being able to bid more competitively based on their more profitable cost structure. Not only that, but they are able to confidently hone and refine their bids based on having a better, cleaner view of the portfolio itself.

Rethink your mortgage data. Contact RiskSpan to talk about how DaaS can simultaneously boost your profitability and make your life easier.


RiskSpan Introduces Media Effect Measure for Prepayment Analysis, Predictive Analytics for Managed Data 

ARLINGTON, Va., July 14, 2022

RiskSpan, a leading provider of residential mortgage  and structured product data and analytics, has announced a series of new enhancements in the latest release of its award-winning Edge Platform.

Comprehensive details of these new capabilities are available by requesting a no-obligation demo at riskspan.com.


Media Effect – It has long been accepted that prepayment speeds see an extra boost as media coverage alerts borrowers to refinancing opportunities. Now, Edge lets traders and modelers measure the media effect present in any active pool of Agency loans—highlighting borrowers most prone to refinance in response to news coverage—and plot the empirical impact on any cohort of loans. Developed in collaboration with practitioners, it measures rate novelty by comparing the rate environment at a given time to rates over the trailing five years. Mortgage portfolio managers and traders who subscribe to Edge have always been able to easily stratify mortgage portfolios by refinance incentive. With the new Media Effect filter/bucket, market participants can fine-tune expectations by analyzing cohorts with like media effects.
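RiskSpan's media effect measure itself is proprietary, but the underlying idea of "rate novelty" can be sketched with a simple, assumption-laden proxy: how unusual the current rate is relative to the trailing five years of prevailing rates. The function below illustrates that idea only; it is not the Edge Platform's calculation.

```python
import pandas as pd


def rate_novelty(mortgage_rates: pd.Series, window_months: int = 60) -> pd.Series:
    """Illustrative 'rate novelty' proxy (not RiskSpan's proprietary measure).

    For each month, returns the share of the trailing five years during which
    prevailing rates exceeded the current rate. Values near 1.0 indicate rates
    at or near a five-year low, the environment in which media coverage of
    refinancing opportunities tends to peak.
    """
    return mortgage_rates.rolling(window_months, min_periods=window_months).apply(
        lambda w: (w[:-1] > w[-1]).mean(), raw=True
    )
```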

Predictive Analytics for Managed Data – Edge subscribers who leverage RiskSpan’s Data Management service to aggregate and prep monthly loan and MSR data can now kick off predictive analytics for any filtered snapshot of that data. Leveraging RiskSpan’s universe of forward-looking analytics, subscribers can generate valuations, market risk metrics to inform hedging, credit loss accounting estimates and credit stress test outputs, and more. Sharing portfolio snapshots and analytics results across teams has never been easier.

These capabilities and other recently released Edge Platform functionality will be on display at next week’s SFVegas 2022 conference, where RiskSpan is a sponsor. RiskSpan will be featured at Booth 38 in the main exhibition hall. RiskSpan professionals will also be available to respond to questions on July 19th following their panels, “Market Beat: Mortgage Servicing Rights” and “Technology Trends in Securitization.”


About RiskSpan, Inc. 

RiskSpan offers cloud-native SaaS analytics for on-demand market risk, credit risk, pricing and trading. With our data science experts and technologists, we are the leader in data as a service and end-to-end solutions for loan-level data management and analytics.

Our mission is to be the most trusted and comprehensive source of data and analytics for loans and structured finance investments.

Rethink loan and structured finance data. Rethink your analytics. Learn more at www.riskspan.com.


Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities

Issuing a security requires a lot of paperwork. Much of this paperwork consists of legal disclosures that inform potential investors about the collateral backing the bonds they are buying. Generating, reviewing, and approving these detailed disclosures is difficult and time consuming – a process that can take hours and sometimes days. RiskSpan has developed an easy-to-use legal disclosure generator application that reduces the process to minutes.

RiskSpan's Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities automates the generation of prospectus supplements, pooling and servicing agreements, and other legal disclosure documents. These documents contain a combination of static and dynamic legal language, data, tables, and images.

The Disclosure Generator draws from a collection of data files. These files contain collateral-, bond-, and deal-specific information. The Disclosure Generator dynamically converts the contents of these files into legal disclosure language based on predefined rules and templates. In addition to generating interim and final versions of the legal disclosure documents, the application provides a quick and easy way of making and tracking manual edits to the documents. In short, the Disclosure Generator is an all-inclusive, seamless, end-to-end system for creating, editing and tracking changes to legal documents for mortgage and asset-backed securities.   

The Legal Disclosure Generator’s user interface supports:  

  1. Simultaneous uploading of multiple data files.
  2. Instantaneous production of the first (and subsequent) drafts of legal documents, adhering to the associated template(s).
  3. A user-friendly editor allowing manual, user-level language and data changes. Users apply these edits either directly to a specific document or to the underlying data template itself. Template updates carry forward to the language of all subsequently generated disclosures. 
  4. A version control feature that tracks and retains changes from one document version to the next.
  5. An archiving feature allowing access to previously generated documents without the need for the original data files.
  6. Editing access controls based on pre-defined user level privileges.

Overview

RiskSpan’s Automated Legal Disclosure Generator for Mortgage and Asset-Backed Securities enables issuers of securitized assets to create legal disclosures efficiently and quickly from raw data files.

The Legal Disclosure Generator is easy and intuitive to use. After setting up a deal in the system, the user selects the underlying collateral- and bond-level data files to create the disclosure document. In addition to the raw data related to the collateral and bonds, these data files also contain relevant waterfall payment rules. The data files can be in any format — Excel, CSV, text, or even custom file extensions. Once the files are uploaded, the first draft of the disclosures can be easily generated in just a few seconds. The system takes the underlying data files and creates a draft of the disclosure document seamlessly and on the fly.  In addition, the Legal Disclosure Generator reads custom scripts related to waterfall models and converts them into waterfall payment rules.

Here is a sample of a disclosure document created from the system.



Blackline Version(s)

In addition to creating draft disclosure documents, the Legal Disclosure Generator enables users to make edits and changes to the disclosures on the fly through an embedded editor. The Disclosure Generator saves these edits and applies them to the next version. The tool creates blackline versions with a single integrated view for managing multiple drafts.

The following screenshot of a sample blackline version illustrates how users can view changes from one version to the next.

Tracking of Drafts

The Legal Disclosure Generator keeps track of a disclosure's entire version history. The system enables draft versions to be emailed directly to the working parties and retains timestamps of these emails for future reference.

The screenshot below shows the entire lifecycle of a document, from original creation to print, with all interim drafts along the way. 


Automated QC System

The Legal Disclosure Generator’s automated QC system creates a report that compares the underlying data file(s) to the data that is contained in the legal disclosure. The automated QC process ensures that data is accurate and reconciled.

Downstream Consumption

The Legal Disclosure Generator creates a JSON data file. This consolidated file consists of collateral and bond data, including waterfall payment rules. The data files are made available for downstream consumption and can also be sent to Intex, Bloomberg, and other data vendors. One such vendor noted that this JSON data file has enabled them to model deals in one-third the time it took previously.
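The schema of the consolidated file is deal-specific, but its general shape, collateral records, bond records, and waterfall payment rules in a single JSON document, can be pictured with the following hypothetical sketch. Every field name here is illustrative and is not RiskSpan's actual schema.

```python
import json

# Hypothetical structure only; the actual file produced by the Disclosure
# Generator is deal-specific and its schema is not reproduced here.
deal = {
    "deal_name": "SAMPLE-TRUST-2022-1",
    "collateral": [
        {"loan_id": "0001", "balance": 350000.00, "rate": 5.125, "fico": 760, "ltv": 80},
    ],
    "bonds": [
        {"class": "A-1", "original_balance": 250000000, "coupon": 4.50},
    ],
    "waterfall_rules": [
        {"step": 1, "rule": "Pay accrued interest to Class A-1 at the Class A-1 coupon"},
    ],
}

with open("deal_disclosure_data.json", "w") as f:
    json.dump(deal, f, indent=2)
```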

Self-Serve System

The Legal Disclosure Generator was designed with the end-user in mind. Users can set up the disclosure language by themselves and edit as needed, with little or no outside help.

The ‘System’ Advantage

  • Removes unnecessary, manual, and redundant processes
  • Significant time savings – 24 hours reduced to 2 minutes (actual time savings for a current client of the system)
  • Better-managed processes and systems
  • Better resource management – cost-effective solutions
  • Greater flexibility
  • Better data management – built-in QC checks



Why Accurate Loan Pool and MSR Cost Forecasting Requires Loan-by-Loan Analytics

When it comes to forecasting loan pool and MSR cash flows, the practice of creating "rep lines," or cohorts, of loans with similar characteristics for analytical purposes has its roots in the Agency MBS market. One of the most attractive and efficient features of Agencies is the TBA market. This market allows originators and issuers to sell large pools of mortgages that have not even been originated yet. This is possible because all parties understand what these future loans will look like. These loans will have enough in common to be effectively interchangeable with one another.

Institutions that perform the servicing on such loans may reasonably feel they can extend the TBA logic to their own analytics. Instead of analyzing a hundred similar loans individually, why not just lump them into one giant meta-loan? Sum the balances, weight-average the rates, terms, and other features, and you’re good to go. 
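The mechanics of building a rep line are simple, which is much of the appeal. A minimal sketch, with assumed column names, of collapsing a group of similar loans into one balance-weighted row:

```python
import pandas as pd


def build_rep_line(loans: pd.DataFrame) -> pd.Series:
    """Collapse a group of similar loans into a single 'rep line':
    sum the balances and balance-weight the other attributes.
    Column names ('balance', 'rate', 'term', 'fico', 'ltv') are illustrative."""
    weights = loans["balance"] / loans["balance"].sum()
    return pd.Series({
        "loan_count": len(loans),
        "balance": loans["balance"].sum(),
        "wa_rate": (loans["rate"] * weights).sum(),
        "wa_term": (loans["term"] * weights).sum(),
        "wa_fico": (loans["fico"] * weights).sum(),
        "wa_ltv": (loans["ltv"] * weights).sum(),
    })


# e.g., one rep line per note-rate/vintage bucket (bucket columns assumed):
# rep_lines = loans.groupby(["rate_bucket", "vintage"]).apply(build_rep_line)
```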

Why the industry still resorts to loan cohorting when forecasting loan pool and MSR cash flows

The primary appeal of cohort-level analytics lies in its simplicity. Rep lines amount to giant simplifying assumptions. They impose fewer technological constraints than a loan-by-loan approach does. Condensing an entire loan portfolio down to a manageable number of rows requires less computational capacity. This takes on added importance when dealing with on-premise software and servers. It also facilitates the process of assigning performance and cost assumptions.

What is more, as OAS modeling has evolved to dominate the loans and MSR landscape, the stratification approach necessary to run Monte Carlo and other simulations lends itself to cohorting. Lumping loans into like groups also greatly simplifies the process of computing hedging requirements. 

Advantages of loan-level over cohorting when forecasting cash flows

Treating loan and MSR portfolios like TBA pools, however, has become increasingly problematic as these portfolios have grown more heterogeneous. Every individual loan has a story. Even loans that resemble each other in terms of rate, credit score, LTV, DTI, and documentation level have unique characteristics. Some of these characteristics – climate risk, for example – are not easy to bucket. Lumping similar loans into cohorts also runs the risk of underestimating tail risk. Extraordinarily high servicing/claims costs on just one or two outlier loans on a bid tape can be enough to adversely affect the yield of an entire deal. 

Conversely, looking at each loan individually facilitates the analysis of portfolios with expanded credit boxes. Non-banks, which do not usually have the benefit of “knowing” their servicing customers through depository or other transactional relationships, are particularly reliant on loan-level data to understand individual borrower risks, particularly credit risks. Knowing the rate, LTV, and credit score of a bundled group of loans may be sufficient for estimating prepayment risk. But only a more granular, loan-level analysis can produce the credit analytics necessary to forecast reliably and granularly what a servicing portfolio is really going to cost in terms of collections, loss mitigation, and claims expenses.  

Loan-level analysis also eliminates the limitations imposed by stratification. It facilitates portfolio composition analysis. Slicing and dicing techniques are much more simply applied to individual loans than to cohorts. Looking at individual loans also reduces the risk of overrides and lost visibility into convexity pockets.


Potential challenges and other considerations 

So why hasn't everyone jumped onto the loan-level bandwagon when forecasting loan pool and MSR cash flows? In short, it's harder. Resistance to any new process can be expected when existing aggregation regimes appear to be working fine. Loan-level data management requires more diligence in automated processes. It also requires the data related to each individual loan to be subjected to QC and monitoring. Daily hedging and scenario runs tend to focus more on speed than on accuracy at the macro level. Some may question whether the benefits of the granular, case-by-case analysis required to identify the most significant loan-level pickups actually justify the cost of such a regime.

Rethink. Why now? 

Notwithstanding these challenges, there has never been a better time for loan and MSR investors to abandon cohorting and fully embrace loan-level analytics when forecasting cash flows. The emergence of cloud-native technology and enhanced database and warehouse infrastructure, along with the ability to outsource hosting and computational requirements to third parties, creates practically limitless scalability.

The barriers between loan and MSR experts and IT professionals have never been lower. This, combined with the emergence of a big data culture in an increasing number of organizations, has brought the granular daily analysis promised by loan-level analytics tantalizingly within reach.  

 

For a deeper dive into loan and MSR cost forecasting, view our webinar, “How Much Will That MSR Portfolio Really Cost You?”

 


RiskSpan Introduces Proprietary Measure for Plotting Burnout Effect on Prepays, Adds RPL/NPL Forecasting

ARLINGTON, Va., June 22, 2022 —

RiskSpan, a leading provider of residential mortgage and structured product data and analytics, has announced a series of new enhancements in the latest release of its award-winning Edge Platform.  

Comprehensive details of these new capabilities are available by requesting a no-obligation demo at riskspan.com.

  • Burnout Metrics – MBS traders and investors can now look up a proprietary, cumulative burnout metric that quantifies the extent to which a defined pool of mortgages has continued to pay coupons above refinance rates over time. The metric goes beyond simple comparisons of note rates to historic prevailing rates by also tracking the number of times borrowers have ignored the "media effect" of repeatedly seeing rates reach record lows. Edge users can plot empirical prepay speeds as a function of burnout to help project performance of pools with various degrees of burnout. A virtual walk-through of this functionality is available here.
  • Reperforming Loans – Investors in nonperforming and reperforming loans – particularly RPLs that have recently emerged from COVID forbearance – can now project performance and cash flows of loans with deferred balances. Edge reads in the total debt owed (TDO) recovery method and has added key output fields like prepaid principal percent reduction and total debt owed to its cash flow report.
  • Hedge Ratios – The Edge Platform now enables traders and portfolio managers to easily compute, in a single step, the quantity of 2yr, 5yr, 10yr, or 30yr treasuries (or any combination of these or other hedges) that must be sold to offset the effective duration of assets in a given portfolio; the basic arithmetic is sketched below. Swaps, swaptions and other hedges are also supported. Clearly efficient and useful for any portfolio of interest-rate-sensitive assets, the functionality is proving particularly valuable to commercial banks with MSR holdings and others who require daily transparency into hedging ratios.
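The Edge Platform's implementation is not described here, but the arithmetic of a simple one-instrument duration hedge is standard and can be sketched as follows. The sketch ignores convexity, key-rate exposures, and futures conversion factors; the example numbers are hypothetical.

```python
def hedge_face_to_sell(portfolio_value: float,
                       portfolio_eff_duration: float,
                       hedge_price: float,
                       hedge_duration: float) -> float:
    """Face amount of a hedge instrument (e.g., a Treasury note) to sell so that
    its dollar duration offsets the portfolio's dollar duration.

    portfolio_value        - market value of the assets being hedged
    portfolio_eff_duration - effective duration of those assets (years)
    hedge_price            - hedge price per 100 of face
    hedge_duration         - duration of the hedge instrument (years)
    """
    portfolio_dollar_duration = portfolio_value * portfolio_eff_duration
    hedge_dollar_duration_per_100_face = hedge_price * hedge_duration
    return 100 * portfolio_dollar_duration / hedge_dollar_duration_per_100_face


# Hypothetical example: $500mm of assets with 4.2yr duration, hedged with a
# 10yr note priced at 98.5 with an 8.1yr duration:
# face = hedge_face_to_sell(500_000_000, 4.2, 98.5, 8.1)
```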

### 

About RiskSpan, Inc. 

RiskSpan offers end-to-end solutions for data management, historical performance, predictive analytics and portfolio risk management on a secure, fast, and scalable platform that has earned the trust of the industry’s largest firms. Combining the strength of subject matter experts, quantitative analysts, and technologists, RiskSpan’s Edge platform integrates a range of datasets – structured and unstructured – and off-the-shelf analytical tools to provide you with powerful insights and a competitive advantage. Learn more at www.riskspan.com.  


RiskSpan Announces Cloud-Native Mortgage Servicing Rights Application

ARLINGTON, Va. — Mortgage fintech leader RiskSpan announced today that it has added a Mortgage Servicing Rights (MSR) application to its award-winning on-demand analytics Edge Platform.

The application expands RiskSpan’s unparalleled loan-level mortgage analytics to MSRs, an asset class whose cash flows have previously been challenging to forecast at the loan level. Unlike conventional MSR tools that assume large numbers of loans bucketed into “rep lines” will perform identically, the Edge Platform’s granular approach makes it possible to forecast MSR portfolio net cash flows and run valuation and scenario analyses with unprecedented precision.   

RiskSpan’s MSR platform integrates RiskSpan’s proprietary prepayment and credit models to calculate option-adjusted risk metrics while also incorporating the full range of client-configurable input parameters (costs and recapture assumptions, for example) necessary to fully characterize the cash flows arising from servicing. Further, its integrated data warehouse solution enables easy access to time-series loan and collateral performance. 

“Our cloud-native platform has enabled us to achieve something that has long eluded our industry – on-demand, loan-level cash flow forecasting,” observed RiskSpan CEO Bernadette Kogler. “This has been an absolute game changer for our clients.”

Loan-level projections enable MSR investors to re-combine and re-aggregate loan-level cash flow results on the fly, opening the door to a host of additional, scenario-based analytics – including climate risk and responsible ESG analysis. The flexibility afforded by RiskSpan’s parallel computing framework allows for complex net cash flow calculations on hundreds of thousands of individual mortgage loans simultaneously. The speed and scalability this affords makes the Edge Platform ideally suited for pricing even the largest portfolios of MSR assets and making timely trading decisions with confidence.

About RiskSpan 
RiskSpan offers end-to-end solutions for data management, trading risk management analytics, and visualization on a highly secure, fast, and fully scalable platform that has earned the trust of the industry’s largest firms. Combining the strength of subject matter experts, quantitative analysts, and technologists, RiskSpan’s Edge platform integrates a range of data-sets – structured and unstructured – and off-the-shelf analytical tools to provide you with powerful insights and a competitive advantage. Learn more at www.riskspan.com. 


Surge in Cash-Out Refis Pushes VQI Sharply Higher

A sharp uptick in cash-out refinancing pushed RiskSpan’s Vintage Quality Index (VQI) to its highest level since the first quarter of 2019.

RiskSpan’s Vintage Quality Index computes and aggregates the percentage of Agency originations each month with one or more “risk factors” (low-FICO, high DTI, high LTV, cash-out refi, investment properties, etc.). Months with relatively few originations characterized by these risk factors are associated with lower VQI ratings. As the historical chart above shows, the index maxed out (i.e., had an unusually high number of loans with risk factors) leading up to the 2008 crisis.
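RiskSpan's exact definitions are proprietary, but the basic mechanics, flagging loans that carry one or more risk factors and aggregating the share by origination month, can be sketched as follows. The thresholds and column names are illustrative assumptions, not the VQI's actual specification.

```python
import pandas as pd

# Illustrative thresholds and column names, not RiskSpan's proprietary definitions
RISK_FLAGS = {
    "low_fico": lambda df: df["fico"] < 660,
    "high_dti": lambda df: df["dti"] > 45,
    "high_ltv": lambda df: df["ltv"] > 95,
    "cash_out_refi": lambda df: df["loan_purpose"] == "cash_out",
    "investor_property": lambda df: df["occupancy"] == "investor",
}


def vintage_quality_index(originations: pd.DataFrame) -> pd.Series:
    """Percentage of each month's originations carrying at least one risk factor."""
    flagged = pd.concat(
        {name: rule(originations) for name, rule in RISK_FLAGS.items()}, axis=1
    ).any(axis=1)
    return flagged.groupby(originations["origination_month"]).mean() * 100
```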

RiskSpan uses the index principally to fine-tune its in-house credit and prepayment models by accounting for shifts in loan composition by monthly cohort.

Rising Rates Mean More Cash-Out Refis (and more risk)

As the following charts plotting the individual VQI components illustrate, a spike in cash-out refinance activity (as a percentage of all originations) accounted for more of the rise in overall VQI than did any other risk factor.

This comes as little surprise given the rising rate environment that has come to define the first quarter of 2022, a trend that is likely to persist for the foreseeable future.

As we demonstrated in this recent post, the quickly vanishing number of borrowers who are in the money for a rate-and-term refinance means that the action will increasingly turn to so-called “serial cash-out refinancers” who repeatedly tap into their home equity even when doing so means refinancing into a mortgage with a higher rate. The VQI can be expected to push ever higher to the extent this trend continues.

An increase in the percentage of loans with high debt-to-income ratios (over 45) and low credit scores (under 660) also contributed to the rising VQI, as did continued upticks in loans on investment and multi-unit properties as well as mortgages with only one borrower.

Population assumptions:

  • Monthly data for Fannie Mae and Freddie Mac.
  • Loans originated more than three months prior to issuance are excluded because the index is meant to reflect current market conditions.
  • Loans likely to have been originated through the HARP program, as identified by LTV, MI coverage percentage, and loan purpose, are also excluded. These loans do not represent credit availability in the market as they likely would not have been originated today but for the existence of HARP.

Data assumptions:

  • Freddie Mac data goes back to 12/2005; Fannie Mae data goes back only to 12/2014.
  • Certain fields for Freddie Mac data were missing prior to 6/2008.

The GSE historical loan performance data released in support of GSE Risk Transfer activities was used to help back-fill data where it was missing.

An outline of our approach to data imputation can be found in our VQI Blog Post from October 28, 2015.

Data Source: Fannie Mae PoolTalk®-Loan Level Disclosure

