Blog Archives

Automate Your Data Normalization and Validation Processes

Robotic Process Automation (RPA) is the solution for automating mundane, business-rule-based processes so that an organization's high-value business users can be deployed to more valuable work. 

McKinsey defines RPA as “software that performs redundant tasks on a timed basis and ensures that they are completed quickly, efficiently, and without error.” RPA has enormous savings potential. In RiskSpan’s experience, RPA reduces staff time spent on the target-state process by an average of 95 percent. On recent projects, RiskSpan RPA clients on average saved more than 500 staff hours per year through simple automation. That calculation does not include the potential additional savings gained from the improved accuracy of source data and downstream data-driven processes, which greatly reduces the need for rework. 

The tedious, error-ridden, and time-consuming process of data normalization is familiar to almost all organizations. Complex data systems and downstream analytics are ubiquitous in today’s workplace. Staff that are tasked with data onboarding must verify that source data is complete and mappable to the target system. For example, they might ensure that original balance is expressed as dollar currency figures or that interest rates are expressed as percentages with three decimal places. 
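Checks like these are straightforward to script. The sketch below (Python, with illustrative column names, not drawn from any particular client system) normalizes a balance field to dollar figures and expresses rates as percentages with three decimal places:

```python
import pandas as pd

def normalize_loan_fields(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Coerce original balance to a numeric dollar figure, stripping "$" and commas
    out["orig_balance"] = (
        out["orig_balance"].astype(str)
        .str.replace(r"[$,]", "", regex=True)
        .astype(float)
    )
    # Express interest rates as percentages with three decimal places
    out["interest_rate"] = out["interest_rate"].astype(float).round(3)
    return out

df = pd.DataFrame({"orig_balance": ["$250,000", "100000"],
                   "interest_rate": [4.12345, 3.5]})
print(normalize_loan_fields(df))
```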

Effective data visualizations sometimes require additional steps, such as adding calculated columns or resorting data according to custom criteria. Staff must match the data formatting requirements with the requirements of the analytics engine and verify that the normalization allows the engine to interact with the dataset. When completed manually, all of these steps are susceptible to human error or oversight. This often results in a need for rework downstream and even more staff hours. 

Recently, a client with a proprietary datastore approached RiskSpan with the challenge of normalizing and integrating irregular datasets to comply with their data engine. The non-standard original format and the size of the data made normalization difficult and time consuming. 

After ensuring that the normalization process was optimized for automation, RiskSpan set to work automating data normalization and validation. Expert data consultants automated the process of restructuring data in the required format so that it could be easily ingested by the proprietary engine.  

Our consultants built an automated process that normalized and merged disparate datasets, compared internal and external datasets, and added calculated columns to the data. The processed dataset comprised more than 100 million loans and more than 4 billion records. To optimize for speed, our team programmed a highly resilient validation process that included automated validation checks, error logging (for client staff review), and data correction routines for post-processing and post-validation. 
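A validation pass of this kind might look like the following sketch, where the field names, rules, and correction logic are purely illustrative:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("validation")

def validate_record(rec: dict) -> list:
    """Return a list of rule violations for one loan record."""
    errors = []
    if rec.get("balance") is None or rec["balance"] < 0:
        errors.append("balance missing or negative")
    if not 0 <= rec.get("rate", -1) <= 25:
        errors.append("rate outside plausible range")
    return errors

def process(records: list) -> tuple:
    """Split records into clean and rejected, applying a simple correction routine."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs and rec.get("balance") is not None and rec["balance"] < 0:
            # Correction routine: a negative balance keyed in with a sign error
            rec["balance"] = abs(rec["balance"])
            errs = validate_record(rec)  # re-validate after correction
        if errs:
            log.warning("rejected %s: %s", rec.get("loan_id"), errs)
            rejected.append(rec)  # logged for client staff review
        else:
            clean.append(rec)
    return clean, rejected
```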

This custom solution reduced time spent onboarding data from one month of staff work down to two days of staff work. The end result is a fully functional, normalized dataset that can be trusted for use with downstream applications. 

RiskSpan’s experience automating routine business processes reduced redundancies, eliminated errors, and saved staff time. This solution reduced resources wasted on rework and its associated operational risk and key-person dependencies. Routine tasks were automated with customized validations. This customization effectively eliminated the need for staff intervention until certain error thresholds were breached. The client determined and set these thresholds during the design process. 
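The threshold gate can be as simple as comparing an error rate against the client-set limit; the value below is an illustrative placeholder, not a client figure:

```python
ERROR_RATE_THRESHOLD = 0.02  # illustrative; set by the client during design

def needs_staff_review(n_errors: int, n_records: int,
                       threshold: float = ERROR_RATE_THRESHOLD) -> bool:
    """Automation runs unattended until the error rate breaches the threshold."""
    if n_records == 0:
        return False
    return n_errors / n_records > threshold

print(needs_staff_review(5, 1000))   # 0.5% error rate: continue unattended
print(needs_staff_review(50, 1000))  # 5% error rate: escalate to staff
```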

RiskSpan data and analytics consultants are experienced in helping clients develop robotic process automation solutions for normalizing and aggregating data, creating routine, reliable data outputs, executing business rules, and automating quality control testing. Automating these processes addresses a wide range of business challenges and is particularly useful in routine reporting and analysis. 

Talk to RiskSpan today about how custom solutions in robotic process automation can save time and money in your organization. 


CRT Deal Monitor: April 2019 Update

Loans with Less than Standard MI Coverage

CRT Deal Monitor: Understanding When Credit Becomes Risky 

This analysis tracks several metrics related to deal performance and credit profile, putting them into a historical context by comparing the same metrics for recent-vintage deals against those of ‘similar’ cohorts in the time leading up to the 2008 housing crisis.  

Some of the charts in this post have interactive features, so click around! We’ll be tweaking the analysis and adding new metrics in subsequent months. Please shoot us an email if you have an idea for other metrics you’d like us to track. 

Monthly Highlights: 

The seasonal nature of recoveries is an easy-to-spot trend in our delinquency outcome charts (loan performance 6 months after being 60 days past due). Viewed from a very high level, both Fannie Mae and Freddie Mac display this trend, with visible oscillations in the split between loans that end up current and those that become more delinquent (move to 90+ days past due (DPD)). This trend is also consistent both before and after the crisis; the shares of loans that stay 60 DPD or move to 30 DPD are relatively stable. You can explore the full history of the FNMA and FHLMC Historical Performance Datasets by clicking the 6-month roll links below and then clicking the “Autoscale” button in the top-right of the graph.

This trend is salient in April of 2019, as both Fannie Mae Connecticut Avenue Securities (CAS) and Freddie Mac Structured Agency Credit Risk (STACR) have seen 6 months of steady decreases in loans curing, and a steady increase in loans moving to 90+ DPD. While both CAS and STACR hit lows for recovery to current – similar to lows at the beginning of 2018 – it is notable that both CAS and STACR saw multi-year highs for recovery to current in October of 2018 (see Delinquency Outcome Monitoring links below). While continued US economic strength is likely responsible for the improved performance in October, it is not exactly clear why the oscillation would move the recoveries to current back to the same lows experienced in early 2018.  

Current Performance and Credit Metrics

Delinquency Trends:

The simplest metric we track is the share of loans across all deals that is 60+ days past due (DPD). The charts below compare STACR (Freddie) vs. CAS (Fannie), with separate charts for high-LTV deals (G2 for CAS and HQA for STACR) vs. low-LTV deals (G1 for CAS and DNA for STACR).

For comparative purposes, we include a historical time series of the share of loans 60+ DPD for each LTV group. These charts are derived from the Fannie Mae and Freddie Mac loan-level performance datasets. Comparatively, today’s deal performance is much better than even the pre-2006 era.
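As a minimal illustration, the 60+ DPD share can be computed from a loan-level snapshot in a few lines (the column name is an illustrative stand-in for the actual dataset layout):

```python
import pandas as pd

def share_60_plus_dpd(df: pd.DataFrame) -> float:
    # days_past_due >= 60 counts both 60-89 DPD and 90+ DPD loans
    return (df["days_past_due"] >= 60).mean()

snapshot = pd.DataFrame({"days_past_due": [0, 0, 30, 60, 90, 0, 0, 120, 0, 0]})
print(f"{share_60_plus_dpd(snapshot):.1%}")  # 3 of 10 loans are 60+ DPD
```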

Low LTV Deals 60 DPD

High LTV Deals 60 DPD

Delinquency Outcome Monitoring:

The tables below track the status of loans that were 60+ DPD. Each bar in the chart represents the population of loans that were 60+ DPD exactly 6 months prior to the x-axis date.  

The choppiness and high default rates in the first few observations of the data are related to the very low counts of delinquent loans as the CRT program ramped up.  
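A 6-month roll of this kind can be sketched as follows; the layout of the performance table and the status buckets are simplified stand-ins for the actual datasets:

```python
import pandas as pd

def six_month_roll(perf: pd.DataFrame, asof: str) -> pd.Series:
    """perf has one row per loan per month: loan_id, month ('YYYY-MM'), days_past_due.
    Returns the share of loans 60+ DPD at `asof` in each status bucket 6 months later."""
    start = perf[(perf["month"] == asof) & (perf["days_past_due"] >= 60)]
    later_month = (pd.Period(asof, freq="M") + 6).strftime("%Y-%m")
    later = perf[perf["month"] == later_month].set_index("loan_id")
    outcomes = later.loc[later.index.intersection(start["loan_id"]), "days_past_due"]
    # Bucket the 6-months-later status: current, 30 DPD, 60 DPD, or 90+ DPD
    buckets = pd.cut(outcomes, [-1, 0, 30, 60, 10_000],
                     labels=["current", "30 DPD", "60 DPD", "90+ DPD"])
    return buckets.value_counts(normalize=True)
```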

STACR 6 Month Roll

CAS 6 Month Roll

The table below repeats the 60-DPD delinquency analysis for the Freddie Mac Loan Level Performance dataset leading up to and following the housing crisis. (The Fannie Mae loan level performance set yields a nearly identical chart.) Note how many more loans in these cohorts remained delinquent (rather than curing or defaulting) relative to the more recent CRT loans.

Fannie Performance 6 Month Roll

Freddie Performance 6 Month Roll

Deal Profile Comparison:

The tables below compare the credit profiles of recently issued deals. We focus on the key drivers of credit risk, highlighting the comparatively riskier features of a deal. Each table separates the high-LTV deals (80%+) from the low-LTV deals (60%-80%). We add two additional columns for comparison purposes. The first is the ‘Coming Cohort,’ which is meant to give an indication of what upcoming deal profiles will look like. The data in this column is derived from the most recent three months of MBS issuance loan-level data, controlling for the LTV group. These loans are newly originated and newly acquired by the GSEs; considering that CRT deals are generally issued with an average loan age between 6 and 15 months, these are the loans that will most likely wind up in future CRT transactions. The second comparison cohort consists of 2006 originations in the historical performance datasets (Fannie and Freddie combined), controlling for the LTV group. We supply this comparison as context for the level of risk associated with one of the worst-performing cohorts. 

Credit Profile LLTV – Click to see all deals

Credit Profile HLTV – Click to see all deals

Deal Tracking Reports:

Please note that defaults are reported on a delay for both GSEs, and so while we have CPR numbers available for the most recent month, CDR numbers are not provided because they are not fully populated yet. Fannie Mae CAS default data is delayed an additional month relative to STACR. We’ve left loss and severity metrics blank for fixed-loss deals.

STACR Performance – Click to see all deals

CAS Performance – Click to see all deals


Robotic Process Automation – Warehouse Line Reporting

Robotic Process Automation (RPA) is the solution for automating mundane, business-rule-based processes so that your high-value business users can be deployed to more valuable work.

McKinsey defines RPA as “software that performs redundant tasks on a timed basis and ensures that they are completed quickly, efficiently, and without error.” RPA has enormous savings potential. In RiskSpan’s experience, RPA reduces staff time spent on the target-state process by an average of 95 percent. On recent projects, RiskSpan RPA clients on average saved more than 500 staff hours per year through simple automation. That calculation does not include the potential additional savings gained from the improved accuracy of source data and downstream data-driven processes, which greatly reduces the need for rework.

Managing warehouse lines of credit poses a unique set of challenges to both lending and borrowing institutions. These lines revolve based on frequent, periodic transactions. The loan-level data underlying these transactions, while similar from one transaction to the next, are sufficiently nuanced to require individual review. These reviews are painstaking and can take an inordinate amount of time.

Recently, a consumer financing provider approached RiskSpan with the challenge of tracking its requests to a warehouse lender, so that it could better manage its warehouse loan portfolio. This client had a series of manual reporting processes that it ran upon each request to the warehouse lender to inform oversight of its portfolio. It needed assistance improving the accuracy of these reports and reducing the resource burden required to produce them.

RiskSpan responded to the challenge by completing a rapid RPA readiness assessment and by implementing automation to solve for the data challenges it uncovered. In the readiness assessment, RiskSpan deployed a consultant to ensure that the existing reports were sufficient to meet the needs of the organization; that source data was sufficient for the desired reporting; and that data transformation processes (people and systems) were maintaining data quality from input to output.

Once these processes were analyzed and a target-state was confirmed, RiskSpan consultants quickly got to work. We automated ingestion of data for two of the existing reports, automated high-value parts of the data normalization processes and created automated quality control tests for each report.

This custom solution reduced the cycle time at each warehouse lender request from one hour of staff work to five minutes. This saved more than two full weeks of staff time over the course of the year and dramatically increased the scalability of this valuable process.

RiskSpan’s experience automating routine business processes reduced redundancies, eliminated errors, and saved staff time. Our solution reduced resources wasted on rework and its associated operational risk and key-person dependencies. Routine tasks were automated with customized validations. This customization effectively eliminated the need for staff intervention until certain error thresholds were breached. The client determined and set these thresholds during the design process.

RiskSpan data and analytics consultants are experienced in helping clients develop robotic process automation solutions for normalizing and aggregating data, creating routine, reliable data outputs, executing business rules, and automating quality control testing. Automating these processes addresses a wide range of business challenges and is particularly useful in routine reporting and analysis.

Talk to RiskSpan today about how custom solutions in robotic process automation can save time and money in your organization.


FHFA 1Q2019 Prepayment Monitoring Report

FHFA’s 2014 Strategic Plan for the Conservatorships of Fannie Mae and Freddie Mac includes the goal of improving the overall liquidity of Fannie Mae’s and Freddie Mac’s (the Enterprises) securities through the development of a common mortgage-backed security.

This report provides insight into how FHFA monitors the consistency of prepayment rates across cohorts of the Enterprises’ TBA-eligible MBS.

Download Report


GSE: Datamart Design and Build

The Problem

A government-sponsored enterprise needed a centralized data solution for its forecasting process, which involved cross-functional teams from different business lines.

The firm also sought a cloud-based data warehouse to host forecasting outputs for reporting purposes, with faster querying and processing speeds.

Finally, the firm needed assistance migrating data from legacy data sources to new datamarts. The input and output files and datasets came from different sources and were often in different formats. Analysis and transformation were required prior to designing, developing, and loading tables.

The Solution

RiskSpan built and now maintains a new centralized datamart (in both Oracle and Amazon Web Services) for the client’s revenue and loss forecasting processes. This includes data modeling, historical data upload, and the monthly recurring data process.

The Deliverables

  • Analyzed the end-to-end data flow and data elements
  • Designed data models satisfying business requirements
  • Processed and mapped forecasting input and output files
  • Migrated data from legacy databases to the new datamarts
  • Built an Oracle datamart and a cloud-based data warehouse (Amazon Web Services)
  • Led the development team in building schemas, tables, and views; process scripts to maintain data updates; and table partitioning logic
  • Resolved data issues with the source and assisted in reconciliation of results


GSE: ETL Solutions

The Problem

The client needed ETL solutions for handling data of any complexity or size, in a variety of formats, and from different upstream sources.

The client’s data management team extracted and processed data from different sources and different types of databases (e.g., Oracle, Netezza, Excel files, SAS datasets) and needed to load the data into its Oracle and AWS datamarts for its revenue and loss forecasting processes.

The client’s forecasting process used very complex, large-scale datasets in different formats, which needed to be consumed and loaded in an automated and timely manner.

The Solution

RiskSpan was engaged to design, develop, and implement ETL (Extract, Transform and Load) solutions for handling input and output data for the client’s revenue and loss forecasting processes. This included dealing with large volumes of data and multiple source systems, and transforming and loading data to and from datamarts and data warehouses.

The Deliverables

  • Analyzed data sources and developed ETL strategies for different data types and sources
  • Performed source-to-target mapping in support of report and warehouse technical designs
  • Implemented business-driven requirements using Informatica
  • Collaborated with cross-functional business and development teams to document ETL requirements and turn them into ETL jobs
  • Optimized, developed, and maintained integration solutions as necessary to connect legacy data stores and the data warehouses


Case Study: Web Based Data Application Build

The Client

Government Sponsored Enterprise (GSE)

The Problem

The Structured Transactions group of a GSE needed to offer broker-dealers a simpler way to create new restructured securities (improved ease of use), one that provided the flexibility to do business at any hour and reduced dependence on Structured Transactions team members’ availability.

The Solution

RiskSpan led the development of a customer-facing web-based application for a GSE. The GSE’s structured transactions clients use the application to independently create pools of pools and re-combinable REMIC exchanges (RCRs) in accordance with existing pooling and pricing requirements.

RiskSpan delivered the complete end-to-end technical implementation of the new portal.

The Deliverables

  • Developed a self-service web portal that provides RCR and pool-of-pool exchange capabilities and reporting features
  • Managed data flows from various internal sources to the portal, providing real-time calculations
  • Technology stack included Angular 2.0 and Java for web services
  • Development, testing, and configuration-control methodology featured DevOps practices, a CI/CD pipeline, and 100% automated testing with Cucumber and Selenium
  • Used Git, JIRA, Gherkin, Jenkins, Fisheye/Crucible, and SauceLabs for configuration control, testing, and deployment


RiskSpan Edge & CRT Data

For participants in the credit risk transfer (CRT) market, managing the massive quantity of data to produce clear insights into deal performance can be difficult and demanding on legacy systems. Complete analysis of the deals involves bringing together historical data, predictive models, and deal cash flow logic, often leading to a complex workflow in multiple systems. RiskSpan’s Edge platform (RS Edge) solves these challenges, bringing together all aspects of CRT analysis. RiskSpan is the only vendor to bring together everything a CRT analyst needs:  

  • Normalized, clean, enhanced data across programs (STACR/CAS/ACIS/CIRT),
  • Historical Fannie/Freddie performance data normalized to a single standard,
  • Ability to load loan-level files related to private risk transfer deals,
  • An Agency-specific, loan-level, credit model,
  • Seamless Intex integration for deal and portfolio analysis,
  • Scalable scenario analysis at the deal or portfolio level, and
  • Vendor and client model integration capabilities.

Deal Comparison Table

All of these features are built into RS Edge, a cloud-native data and analytics platform for loans and securities. The RS Edge user interface is accessible via any web browser, and the processing engine is accessible via an application programming interface (API). Accessing RS Edge via the API exposes the full functionality of the platform, with direct integration into existing workflows in legacy systems such as Excel, Python, and R. To tailor RS Edge to the specific needs of a CRT investor, RiskSpan is rolling out a series of Excel tools, built using our APIs, which allow for powerful loan-level analysis from the tool everyone knows and loves. Accessing RS Edge via our new Excel templates, users can:

  • Track deal performance,
  • Compare deal profiles,
  • Research historical performance of the full GSE population,
  • Project deal and portfolio performance with our Agency-specific credit model or with user-defined CPR/CDR/severity vectors, and
  • Analyze various macro scenarios across deals or a full portfolio
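For workflows outside Excel, the same analyses can be scripted against an API directly. The sketch below shows the generic pattern only; the host, endpoint path, and response shape are hypothetical placeholders, not RiskSpan’s actual API:

```python
import json
import urllib.request

BASE_URL = "https://api.example.com/v1"  # placeholder host, not a real endpoint

def performance_url(deal_id: str) -> str:
    """Build the (hypothetical) deal-performance endpoint URL."""
    return f"{BASE_URL}/deals/{deal_id}/performance"

def get_deal_performance(deal_id: str, token: str) -> dict:
    """Fetch deal performance JSON, authenticating with a bearer token."""
    req = urllib.request.Request(
        performance_url(deal_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

The returned JSON could then be loaded into a DataFrame, Excel sheet, or R session as part of an existing workflow.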

Loan Attribute Distributions

The web-based user interface allows for on-demand analytics, giving users specific insights on deals as needs arise. The Excel template built with our API allows for a targeted view tailored to the specific needs of a CRT investor.

For teams that prefer to focus their time on outcomes rather than the build, RiskSpan’s data team can build custom templates around specific customer processes. RiskSpan offers support from premier data scientists who work with clients to understand their unique concerns and objectives and to integrate our analytics with their legacy system of choice.

Loan Performance History

The images are examples of a RiskSpan template for CRT deal comparison: profile comparison, loan credit score distribution, and delinquency performance for five Agency credit risk transfer deals, pulled via the RiskSpan Data API and rendered in Excel.

Get a Demo


Fannie Mae’s New CAS REMIC: Why REITs Are Suddenly Interested in CRT Deals

Fannie Mae has been issuing credit-risk-transfer (CRT) deals under its Connecticut Avenue Securities (CAS) program since 2013. The investor base for these securities has traditionally been a diverse group of asset managers, hedge funds, private equity firms, and insurance companies. The deals had been largely ignored by Real Estate Investment Trusts (REITs), however. The following pie charts illustrate the investor breakdown of Fannie Mae’s CAS 2018-C06 deal, issued in October 2018. Note that REITs accounted for only 11 percent of the investor base of the Group 1 and Group 2 M-2 tranches (see note below for information on how credit risk is distributed across tranches), and just 4 percent of the Group 1 B-1 tranche.

Things began to change in November 2018, however, when Fannie Mae began to structure CAS offerings as notes issued by trusts that qualify as Real Estate Mortgage Investment Conduits (REMICs). The first such REMIC offering, CAS 2018-R07, brought about a substantial shift in the investor distribution, with REITs now accounting for a significantly higher share. As the pie charts below illustrate, REITs now account for some 22 percent of the M-2 tranche investor base and nearly 20 percent of the B-1 tranche.

What Could Be Driving This Trend?
It seems reasonable to assume that REITs are flocking to the more favorable tax treatment of REMIC-based structures, which, as Fannie Mae points out, are now simpler and better aligned with other mortgage-related securities. Additionally, the new CAS REMIC notes meet all the REIT income and asset tests for tax purposes, and tax withholding restrictions have been removed for non-U.S. investors in all tranches.

The REMIC structure offers additional benefits to REITs and other investors. Unlike previous CAS issues, the securities are issued by the CAS REMIC, a bankruptcy-remote trust, which receives the cash proceeds from investors. Fannie Mae makes monthly payments to the trust in exchange for credit protection, and the trust is responsible for paying interest to investors and repaying principal, less any credit losses. Because this new third-party trust issues the CAS REMIC securities, investors are shielded from exposure to any future counterparty risk with Fannie Mae.

The introduction of the REMIC structure represents an exciting development for the CAS program and for CRT securities overall. It makes them more attractive to REITs and offers these and other traditional mortgage investors a new avenue into credit risk previously available only in the private-label market.

End Note: How Are CAS Notes Structured?
Notes issued prior to 2016 as part of the CAS program are aligned to a structure of six classes of reference tranches, as illustrated below:
Two mezzanine tranches of debt are offered for sale to investors. The structure also consists of four hypothetical reference tranches, retained by Fannie Mae and used for the allocation of cash flows. When credit events occur, write-downs are first applied to Fannie Mae’s retained first-loss position. Only after the entire first-loss position is written down are losses passed on to investors in the mezzanine tranche debt: first M2, then M1. Loan prepayment is allocated along the opposite trajectory. As loans prepay, principal is first returned to the investors in M1 notes. Only after the full principal balance of the M1 notes has been repaid do M2 note holders receive principal payments.

Beginning with the February 2016 CAS issuance (2016-C01), notes follow a new structure of seven classes of reference tranches, as illustrated below:
In addition to the two mezzanine tranches, a portion of the bottom layer is also sold to investors. This allows Fannie Mae to transfer a portion of the initial expected loss. When credit events occur, both Fannie Mae and investors incur losses. Additionally, beginning with this issuance, the size of the B tranche was increased to 100 bps, effectively increasing the credit support offered to the mezzanine tranches.

Beginning with the January 2017 CAS issuance (2017-C01), notes follow a structure of eight classes of reference tranches, as illustrated below:
Fannie Mae split the B tranche horizontally into two equal tranches, retaining the first-loss position. The size of the B1 tranche is 50 bps, and Fannie Mae retains a vertical slice of it.
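The loss and prepayment allocation rules described in this note can be illustrated with a toy waterfall. Balances and loss amounts below are made up, and real deals allocate at the reference-pool level with additional mechanics; this sketch shows only the ordering of write-downs and principal returns in the pre-2016 structure:

```python
def allocate_losses(loss, tranches):
    """Write down balances in first-loss-first order: retained piece, then M2, then M1."""
    for name in ["first_loss", "M2", "M1"]:
        writedown = min(loss, tranches[name])
        tranches[name] -= writedown
        loss -= writedown
    return tranches

def allocate_prepayment(prepay, tranches):
    """Return prepaid principal senior-first: M1 before M2 (retained piece gets none)."""
    for name in ["M1", "M2"]:
        paid = min(prepay, tranches[name])
        tranches[name] -= paid
        prepay -= paid
    return tranches

t = {"first_loss": 50.0, "M1": 100.0, "M2": 100.0}
allocate_losses(60.0, t)       # wipes out the first-loss piece, writes M2 down by 10
allocate_prepayment(120.0, t)  # repays all of M1, then 20 of M2
print(t)
```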

