Navigating the Challenges of CECL
CONTRIBUTORS
David Andrukonis, CFA
Director
John Vandermeulen, CPA, CFA, FRM, CIA, CAMS
Managing Director
Janet Jozwik
Co-Head of Quantitative Analytics
Kishore Konuri
Database and Data Applications Lead
Chapter 1
Introduction
The Current Expected Credit Loss (CECL) model has gone through final deliberations at the Financial Accounting Standards Board (FASB); the dramatic impact of this change on loan accounting for banks of all sizes will be far reaching.
A Roadmap Forward
BACKGROUND
Properly accounting for troubled loans has always posed a challenge for banks. The 2008 financial crisis may have been just jarring enough for regulators to do something about it.
The crisis prompted the Financial Accounting Standards Board (FASB) as well as U.S. and international regulatory bodies to enact a better process for the estimation of potential credit loss within a portfolio of loans. Hence, in 2012, the FASB set out on a course to remedy loan accounting treatment via a project related to the impairment of financial instruments.
The FASB teamed up with the International Accounting Standards Board (IASB) on a joint project aimed at assisting financial institutions with estimations around allowance for loan losses.
This proposed accounting change has had a dramatic impact throughout the industry, requiring significant operational changes, as well as data management and systems enhancements that are still being worked out, as banks seek to understand all the ramifications of implementing CECL models.
CECL is perhaps the largest change to bank accounting in decades. While everyone would agree that improved methodologies for the recognition and measurement of credit losses for loans and debt securities are a good idea in theory, there is industry angst around how sweeping the operational and process changes will be, and how costly a proposition this can become.
HOW DID WE GET HERE?
Technically known as Accounting for Financial Instruments – Credit Losses (Subtopic 825-15), CECL came about as a result of a desire to improve investors’ ability to comprehend bank financial statements.
The FASB stated their objectives as follows:
“The objective of this project is to significantly improve the decision usefulness of financial instrument reporting for users of financial statements.
The Boards believe that simplification of the accounting requirements for financial instruments should be an outcome of this improvement.
The Boards’ goal is to develop a single credit loss model for financial assets that enables more timely recognition of credit losses.”
WHAT ARE THE BASICS?
CECL is a fundamental change to the way banks estimate losses within their loan portfolios.
Existing bank methods for loss accounting are based on an “incurred loss” model. Said another way, banks currently do not have to estimate potential losses on a loan unless the losses are probable and reasonably estimable.
The new approach, in contrast, is an “expected loss” methodology.
In its simplest terms, CECL requires a credit loss to be booked for accounting purposes at the origination of a loan, based upon what is expected to happen many years in the future.
For practical purposes, both models can be performed at the portfolio level, using historical loss performance as an anchor point.
Given the scale of the change, the FASB has tried to manage some of the many misunderstandings that come with such a dramatic adjustment in approach. Of primary concern among industry participants, in addition to the operational changes necessary, is the potential for large increases to banks’ allowance for loan and lease losses.
WHAT IS THE TIMING?
Originally proposed to the industry in December of 2012, the CECL rollout encountered numerous delays and industry discussions, as the accounting board sought input to ensure that its intent would be realized through implementation.
The new standard underwent multiple iterations and was put out for industry feedback, as well as roundtable discussions. FASB received dozens of comment letters from industry participants regarding concerns with implementation.
With the final standard now in place, the formal implementation timetable varies slightly, depending on a bank’s SEC registrant status. For SEC registrants, the implementation of this new standard is set to commence during the 2020 financial statement period. For non-SEC registrants, it will be implemented during the 2021 financial statement period.
WHAT ARE THE MARKET IMPACTS?
Data Management is Pivotal
CECL will require organizations to be more focused on data management than ever before. Entities of all sizes will need to assess required field-level information and, to the extent that data collection, assimilation, cleansing, and organization need improvement, put a well-thought-out program in place to enhance their ability to turn data into information.
Forecasting Methodologies Will Need to Change
Loss estimation methodologies for financial services firms of all sizes will need to be modified.
While in some respects the new requirement will simplify accounting treatment (for instance, impaired assets and non-impaired assets will no longer be treated separately), methodologies for life-of-loan loss estimation will take on a new dimension.
Banks that are suitably equipped to assess historical loan-loss performance (overall and by origination year cohort) and those with the most efficient methods for predictive analytics will have a distinct competitive advantage over less-sophisticated competitors.
Process, Process, Process
Like most paradigm shifts of this magnitude, a well-documented game plan for conversion is essential. A thorough, documented, supportable, auditable, and repeatable process will be critical.
Requirements, methodologies, systems inputs, roles and responsibilities, and approval methods will all need to be transparent throughout the organization in order to appease financial statement auditors and regulators alike.
Top 10 Organizational Impacts
#1 – ESTIMATES OF IMPACT ARE DIVERSE
Portfolio composition clearly plays a significant role in determining how CECL will impact loss reserves. While the general consensus holds that reserves will need to increase, estimates range from as little as 2-3% to as much as 50-60%. Balance sheets with longer-term assets will generally see a larger impact. Prepayment assumptions, which are factored into CECL, take on new importance for longer-lived assets such as 30-year mortgages. Certain loan types, such as unconditionally cancellable lines of credit and renewable loans, may see allowance levels drop, because CECL guidance permits no losses to be recognized beyond the contractual end date, regardless of how unlikely the bank is to cancel a line of credit or how likely it is to renew the loan. Additionally, methodologies will have to take multiple asset classes into account, including commercial real estate and consumer loans.
It is never too early to begin assessing the impact and evaluating the various modeling options from an operational perspective. Internal and external parties are sure to be interested in the estimated impact, and the methodology employed could significantly affect the overall impact of adoption.
#2 – CECL GUIDANCE DOES NOT PRESCRIBE A LOSS METHODOLOGY, BUT THE CURRENT CONSENSUS LEANS TOWARD IMPLEMENTING A DISCOUNTED CASH FLOW MODEL
While many methodologies can be utilized, including vintage analysis, loss rate method, roll-rate method or a probability of default method, the discounted cash flow methodology looks to be the most reasonable approach based on CECL guidance, particularly for portfolios with longer-term assets. While a discounted cash flow methodology may be more complex to implement, this methodology appears to yield an allowance that most closely reflects the true economics of the financial instrument. This is primarily due to the present value discounting inherent in this methodology, which is not explicitly considered in the other methodologies.
Overall, the FASB expects the sophistication of the methodology employed to be commensurate with the complexity of the institution and to reasonably reflect its expectations of future credit losses. The various modeling types and segmentation methods should be evaluated prior to implementation to determine which methodology is most suitable.
#3 – LOL MODELS ALREADY IN PRODUCTION WILL LIKELY NEED TO BE MODIFIED TO BE UTILIZED FOR CECL
Banks currently employing life-of-loan (LOL) models will likely need to modify them to comply with CECL. Many of these models include forecasts of new production as well as anticipated loan renewals, neither of which is included in the CECL calculation. Banks should perform a gap analysis for any model being considered for CECL. Future model validation requirements should also be taken into account.
#4 – ORGANIZATIONS NEED TO PREPARE DATA AND MODELS FOR SOX AND MODEL VALIDATION REVIEW
Datasets and models that were not previously subject to SOX and financial reporting control testing will now need to be reviewed. Data storage needs will be significant, and current databases should be evaluated for auditability and scalability. Controls around the databases supporting the life of loan calculation may need to be enhanced to meet financial reporting expectations.
#5 – ACCOUNTING CLOSE PROCESSES WILL NEED TO BE ENHANCED
ALLL processes have historically been able to largely ignore originations occurring near the end of the accounting period. CECL closes this loophole by requiring lenders to book a day-one loss upon origination. Banks have traditionally had very short closing cycles. Systems will need to provide the data necessary to book the lifetime loss, potentially requiring origination systems to be enhanced to capture that data in real time.
#6 – DATA NEEDS TO BE ENHANCED TO SUPPORT LOL LOSS CALCULATION
Portfolio data covering a full business cycle will be needed to support CECL calculations. Twelve years is a reasonable starting point to cover a full business cycle, but this could vary depending on asset type. Twelve years will capture results leading up to, during, and after the financial crisis. Additional history could reduce volatility in CECL calculations. To be operationally ready for CECL, banks should assess their data gaps in the near term and develop project plans based on that assessment.
#7 – CREDIT DISCLOSURES NEED TO INCREASE SIGNIFICANTLY
Because no single prescribed methodology accompanies CECL, disclosures on how banks actually perform the calculation need to be robust and address how the forecast was derived, the time period for which a “reasonable and supportable” forecast was determined, and when historical loss rates were utilized.
The historic relationship between loan-level credit performance indicators (current delinquencies, defaults, LTV ratios, etc.) and the overall allowance level will not necessarily continue to hold in the future. Period-over-period improvements may occur (e.g., delinquency rates decrease), but these could be more than offset by a worsening change in the forecast. Disclosures need to bridge the gap when this occurs.
Credit-quality-indicators disclosures by year of origination (vintage) are required for SEC filers and are optional for non-public companies. Overall, the level and sophistication of these disclosures needs to be commensurate with the complexity and size of the institution.
#8 – ALLOWANCE WILL LIKELY BE MUCH MORE VOLATILE, POTENTIALLY NECESSITATING ADDITIONAL REGULATORY CAPITAL BUFFERS
Small changes to future forecasts will typically have significant impacts on the reserve. This could make the allowance significantly more volatile, and regulators may impose additional capital buffers to absorb this volatility. Operationally speaking, banks need to assess capital impacts to determine whether additional regulatory capital is needed to compensate for the initial impact and the increased volatility going forward.
#9 – ASSUMPTIONS WILL NEED TO ALIGN WITH OTHER PROCESSES SUCH AS ALM AND FORECASTING
Assumptions used to calculate the LOL loss are expected to be aligned with those used for other forecasts within the institution (e.g., income forecast, ALM, etc.). Auditors and regulators will not look favorably on utilizing one forecast for CECL purposes and a different forecast for other purposes.
#10 – ENTITIES NEED TO MONITOR CHANGES FROM THE TRANSITIONS RESOURCE GROUP (TRG)
The FASB established the TRG, made up of bankers, auditors, and regulators, to address issues and questions associated with CECL implementation. While the LOL concept is unlikely to change, banks would do well to follow the activities of the TRG as specific implementation guidance is likely to be issued over the coming years.
Chapter 2
Data Requirements
IS YOUR ORGANIZATION READY?
Sample Size Requirements for CECL Modeling
BETTER MODELING LOWERS SAMPLE REQUIREMENTS AND RELIANCE ON PROXY DATA
Many bankers are questioning whether they have enough internal loan data for CECL modeling. Ensuring data sufficiency is a critical first step in meeting the CECL requirements; if internal data falls short, banks need to find and obtain relevant third-party data. This section explains in plain English how to calculate statistically sufficient sample sizes to determine whether third-party data is required. More importantly, it shows modeling techniques that reduce the required sample size. Investing in the right modeling approach could ultimately save the time and expense of obtaining third-party data.
CECL DATA REQUIREMENTS: SAMPLE SIZE FOR A SINGLE HOMOGENOUS POOL
Let’s first consider the sample required for a single pool of nearly identical loans. In the case of a uniform pool of loans — with the same FICO, loan-to-value (LTV) ratio, loan age, etc. — there is a straightforward formula to calculate the sample size necessary to estimate the pool’s default rate, shown in Exhibit 1.
Exhibit 1
PD = probability of default as estimated from existing data
3.84 (the square of 1.96) is the multiple associated with the standard confidence level of 95%
%Error Margin = material error threshold in ALLL as a percentage of the pool’s principal balance. For example, if a bank has a $1 billion portfolio and ALLL estimation errors of less than $2.5 million are immaterial, then the %Error Margin is 0.25%
Average Loss Severity is expressed as a percentage of defaulted principal.
As the formula shows, the sample size depends on several variables, some of which must be estimated:
- Materiality Threshold and Confidence Level: Suppose the bank has a $1 billion loan portfolio and determines that, from a financial statement materiality standpoint, the ALLL estimate needs to be reliable to within +/- $2.5 million. Statistically, we would say that we need to be 95% confident that our loss reserve estimate is within an error margin of +/- $2.5 million of the true figure. The wider our materiality thresholds and lower our required confidence levels, the smaller the sample size we need.
- Loss Severity: As average loss severity increases, a greater sample size is needed to achieve the same error margin and confidence level. For example, if average loss severity is 0%, then an estimate of zero losses would be appropriate regardless of default rates. Theoretically, the exercise of estimating default rates does not even need to be performed, and the required sample size is zero. On the opposite end, if average loss severity is 100%, every dollar of defaulted balance translates into a dollar of loss, so modelers can least afford to misestimate default rates. The required sample size is therefore at its greatest.
- Default Rates: A preliminary estimate of default rate, based on the available sample, also affects the required sample size. Holding dollar error margin constant, fewer loans are needed for low-default-rate populations.
Example: Suppose we have originated a pool of low-risk commercial real estate loans. We have historical observations for 500 such loans, of which 495 paid off and five defaulted, so our preliminary default rate estimate is 1%. Of the five defaults, loss severity averaged 25% of original principal balance. We deem ALLL estimate errors within 0.25% of the relevant principal balance to be immaterial. Is our internal sample of 500 loans enough for CECL modeling purposes, or do we need to obtain proxy data?
Simply apply the formula from Exhibit 1:
In this case, our internal sample of 500 loans is more than enough to give us a statistical confidence interval that is narrower than our materiality thresholds. We do not need proxy data to inform our CECL model in this case.
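Exhibit 1 itself is not reproduced here, but its variable definitions imply the standard sample-size formula for a proportion, scaled by loss severity. The short sketch below applies that reconstructed formula to the hypothetical inputs of the example above and is consistent with the conclusion that 500 loans suffice.

```python
# Reconstructed from the Exhibit 1 variable definitions (an assumption, since the
# exhibit image is not reproduced): n = 3.84 * PD * (1 - PD) * Severity^2 / ErrorMargin^2
pd_est = 0.01            # preliminary default rate: 5 defaults out of 500 loans
severity = 0.25          # average loss severity on defaulted principal
error_margin = 0.0025    # immaterial ALLL error as a fraction of principal balance

required_loans = 3.84 * pd_est * (1 - pd_est) * severity ** 2 / error_margin ** 2
print(round(required_loans))   # roughly 380 loans, comfortably below the 500 on hand
```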
CECL DATA REQUIREMENTS: SAMPLE SIZE ACROSS AN ASSET CLASS
If we have an asset class with loans of varying credit risk characteristics, one way to determine the needed sample is just to carve up the portfolio into many buckets of loans with like-risk characteristics, determine the number of loans needed for each bucket on a standalone basis per the formula above, and then sum these amounts. The problem with this approach – assuming our concern is to avoid material ALLL errors at the asset class level – is that it will dramatically overstate the aggregate number of loans required.
A better approach, which still involves segregating the portfolio into risk buckets, is to assign varying margins of error across the buckets in a way that minimizes the aggregate sample required while maintaining a proportional portfolio mix and keeping the aggregate margin of error within the aggregate materiality threshold. A tool like Solver within Microsoft Excel can perform this optimization task with precision. The resulting error margins (as a percentage of each bucket’s default rate estimates) are much wider than they would be on a standalone basis for buckets with low frequencies and slightly narrower for buckets with high default frequencies.
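As a rough illustration of the optimization described above, the sketch below substitutes scipy’s SLSQP solver for Excel’s Solver and uses hypothetical bucket default rates and balances. It sums bucket dollar error margins (the same conservative convention used for the loan-level procedure later in this chapter) and omits the proportional-mix constraint for brevity.

```python
import numpy as np
from scipy.optimize import minimize

Z2, SEVERITY = 1.96 ** 2, 0.25                  # 95% confidence multiple (3.84) and loss severity
PD = np.array([0.01, 0.03, 0.05, 0.07])         # preliminary default-rate estimates by bucket (hypothetical)
BAL = np.array([250e6, 250e6, 250e6, 250e6])    # principal balance per bucket (hypothetical)
AGG_LIMIT = 0.0025 * BAL.sum()                  # aggregate materiality threshold in dollars

def loans_needed(err):
    # err[i] = bucket i's ALLL error margin as a fraction of its balance (Exhibit 1 form)
    return Z2 * PD * (1 - PD) * SEVERITY ** 2 / err ** 2

def agg_dollar_error(err):
    # Sum the bucket dollar error margins; a conservative aggregation assumption.
    return (err * BAL).sum()

result = minimize(
    lambda e: loans_needed(e).sum(),
    x0=np.full(PD.size, 0.0025),                # start from the standalone 0.25% margins
    method="SLSQP",
    bounds=[(1e-6, None)] * PD.size,
    constraints=[{"type": "ineq", "fun": lambda e: AGG_LIMIT - agg_dollar_error(e)}],
)
print(np.ceil(loans_needed(result.x)), int(loans_needed(result.x).sum()))
# Low-default buckets end up with wider error margins relative to their default rates,
# and the aggregate sample is smaller than the sum of standalone bucket requirements.
```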
Even at its most optimized, though, the total number of loans needed to estimate the default rates of multiple like-risk buckets will skyrocket as the number of key credit risk variables increases. A superior approach to bucketing is loan-level modeling, which treats the entire asset class as one sample but estimates loan-specific default rates according to the individual risk characteristics of each loan.
LOAN-LEVEL MODELING
Suppose within a particular asset class, FICO is the only factor that affects default rates, and we segregate loans into four FICO buckets based on credit performance. (Assume for simplicity that each bucket holds an equal number of loans.) The buckets’ default rates range from 1% to 7%. As before, average loss severity is 25% and our materiality threshold is 0.25% of principal balance. Whether we use a bucketing approach or loan-level modeling, we need a sample of about 5,000 loans total across the asset class. (We calculate the sample required for bucketing with Solver as described above and calculate the sample required for loan-level modeling with an iterative approach described below.)
Now suppose we discover that loan age is another key performance driver. We want to incorporate this into our model because an accurate ALLL minimizes earnings volatility and thereby minimizes excessive capital buffers. We create four loan age buckets, leaving us now with 4 × 4 = 16 buckets (again, assuming the buckets hold equal loan counts). With four categories each of two variables, we would need around 9,000 loans for loan-level modeling but 20,000 loans for a bucketing approach, with around 1,300 in each bucket. (These are ballpark estimates that assume the loan-level model has been properly constructed and fits the data reasonably well. Estimates will vary by bank with the default rates and loss severities of the available sample. Also, while this article deals with loan count sufficiency, we have noted previously that the same dataset must also cover a sufficient timespan, whether the bank is using loan-level modeling or bucketing.)
Finally, suppose we include a third variable, perhaps stage in the economic cycle, LTV, Debt Service Coverage Ratio, or something else.
Again assume we segregate loans into four categories based on this third variable. Now we have 4^3 = 64 equal-sized buckets. With loan-level modeling we need around 12,000 loans. With bucketing we need around 100,000 loans, an average of around 1,600 per bucket.
As the graph shows, a bucketing approach forces us to choose between less insight and an astronomical sample size requirement. As we increase the number of variables used to forecast credit losses, the sample needed for loan-level modeling increases slightly, but the sample needed for bucketing explodes. This points to loan-level modeling as the best solution because well-performing CECL models incorporate many variables. (Another benefit of loan-level credit models, one that is of particular interest to investors, is that the granular intelligence they provide can facilitate better loan screening and pricing decisions.)
Exhibit 2: Loan-Level Modeling Yields Greater Insight from Smaller Samples
CECL DATA REQUIREMENTS: SAMPLE SIZE FOR LOAN-LEVEL MODELING
Determining the sample size needed for loan-level modeling is an iterative process based on the standard errors reported in the model output of a statistical software package. After estimating and running a model on the existing sample, convert the error margin of each default rate (1.96 × the standard error of the default rate estimate to generate a 95% confidence interval) into an error margin of dollars lost by multiplying the default rate error margin by loss severity and the relevant principal balance. Next, sum each dollar error margin to determine whether the aggregate dollar error margin is within the materiality threshold, and adjust the sample size up or down as necessary.
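The article does not prescribe a particular model or software package. As one hedged illustration of the iterative check described above, the sketch below fits a simple logistic default model to simulated loan-level data, derives delta-method standard errors for each loan’s fitted default probability, and compares the summed dollar error margins to a materiality threshold. All inputs are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Simulated loan-level sample: FICO, loan age (months), LTV, and a default flag.
rng = np.random.default_rng(0)
n = 5000
X = sm.add_constant(np.column_stack([
    rng.normal(700, 40, n),       # FICO
    rng.uniform(0, 120, n),       # loan age in months
    rng.uniform(0.4, 0.9, n),     # LTV
]))
true_beta = np.array([-2.0, -0.004, 0.005, 2.0])     # assumed "true" coefficients for the simulation
default = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))

fit = sm.Logit(default, X).fit(disp=0)

# Delta-method standard error of each loan's fitted default probability.
p_hat = fit.predict(X)
grad = (p_hat * (1 - p_hat))[:, None] * X            # d(probability)/d(beta) for each loan
cov = np.asarray(fit.cov_params())
se_p = np.sqrt(np.einsum("ij,jk,ik->i", grad, cov, grad))

severity, balance = 0.25, 200_000                    # assumed loss severity and per-loan balance
dollar_error = 1.96 * se_p * severity * balance      # 95% error margin in dollars, per loan
materiality = 0.0025 * balance * n                   # 0.25% of total principal balance

print(f"aggregate error margin ${dollar_error.sum():,.0f} vs threshold ${materiality:,.0f}")
# If the aggregate margin exceeds the threshold, enlarge the sample (or add proxy data) and refit.
```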
Chapter 3
What Data do I need for CECL Modeling?
A DETAILED GUIDE TO CECL DATA COLLECTION
Even with CECL compelling banks to collect more internal loan data, we continue to emphasize profitability as the primary benefit of robust, proprietary, loan-level data. Make no mistake, the data template we outline below is for CECL modeling. CECL compliance, however, is a prerequisite to profitability. Also, while third-party data may suffice for some components of the CECL estimate, especially in the early years of implementation, reliance on third-party data can drag down profitability. Third-party data is often expensive to buy, may be unsatisfactory to an auditor, and can yield less accurate forecasts. Inaccurate forecasts mean volatile loss reserves and excessive capital buffers that dilute shareholder profitability. An accurate forecast built on internal data not only solves these problems but can also be leveraged to optimize loan screening and loan pricing decisions.
Below is a detailed table of data fields to collect. Banks should collect this dataset whether they plan to build credit models themselves or hire a vendor. A good vendor would expect a dataset like the one outlined below.
The table is not exhaustive for every asset class and circumstance, but it covers the basics (and then some) and is plenty serviceable for CECL modeling. A regional bank that had collected and preserved these data fields at a loan level over this past business cycle would be in a league of its own. In our new CECL world, it would hold a data asset worth perhaps more than its loan portfolio.
The following variables are useful in building probability of default, loss severity, and prepayment models – the models that inform credit loss forecasts. Note that variables in italics can be calculated from the collected variables.
Below the table are a few important notes.
Notes:
- Preserve the full time series: All data should be preserved, beginning with the origination data. Each new month of data should add to, not overwrite, prior data. If the loan servicing system cannot accommodate this, other reasonably priced databases are available. A full time series of data is required to establish delinquency roll rates, the impact of macroeconomics on delinquency transitions, and prepayment patterns. The cost of anything less than a full time series of data is lost accuracy. We have previously written that datasets should span at least ten years.
- Preserve original credit characteristics: A CECL model needs to forecast credit losses long into the future, based on credit characteristics available at the time the model is run. It does little good to learn relationships between today’s FICO scores and short-term default probabilities over the next twelve months. For the most part, the model must predict default based on static credit characteristics, with dynamism entering the model through the macroeconomic inputs, which might assume improving, deteriorating, or stable conditions. An exception exists where the credit characteristic itself can be reasonably predicted, as is the case with LTV on the basis of real estate indices.
- Capture updated credit characteristics: When in doubt, capture the data. We noted in the prior bullet point that updated credit characteristics may not always be useful, especially if they are not captured systematically and regularly across the portfolio. But a credit modeler might discover that a better model can be built using “most recent” credit characteristics rather than original credit characteristics in certain cases. Also, updated credit characteristics can be useful for portfolio segmentation.
- Notes on specific variables: The reasons for collecting some of the variables will be apparent. Here are the less self-explanatory CECL data requirements and the reasons for collecting these variables:
- Payment date permits matching loan outcomes with macroeconomic factors and calculating loan age, an important explanatory variable.
- Loan age is useful in establishing default probabilities as most assets exhibit different default probabilities at different stages in their life.
- Interest rate information is useful in confirming scheduled payment and as an explanatory variable in default and prepayment models. Loans exhibit higher default and prepayment probabilities, all else equal, when charged higher interest rates.
- Scheduled payment permits calculating prepayment and underpayment.
- TDRs and modifications inform default probabilities as they are signs of distress.
- Outstanding principal balance at end of period is useful to confirm scheduled payment, to measure loss severity, and possibly as an explanatory variable in prepayment and credit models.
- Why No Risk Ratings? It wouldn’t hurt to include risk ratings in the monthly data, and modelers who are committed to building a risk ratings migration model will need them. We prefer delinquency state transition models, however, because they are objective. Risk ratings are either subjective or else an amalgamation of metrics that could be disaggregated and modeled individually. For most banks, a risk rating is akin to a prediction – it ranks likelihood of future defaults or magnitudes of forecasted losses. Predicting future risk rating is thus like predicting a prediction. It is both more useful and more doable to predict objective outcomes, which is why we prefer to model based on delinquency status.
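To make the preference for delinquency state transition models concrete, here is a minimal, hypothetical sketch of how monthly roll rates estimated from loan-level history could be chained into a lifetime default estimate. The transition probabilities are illustrative only.

```python
import numpy as np

# Hypothetical monthly delinquency-state transition (roll-rate) matrix estimated from
# loan-level history. States: current, 30-59 DPD, 60-89 DPD, default (absorbing).
states = ["current", "dpd30", "dpd60", "default"]
T = np.array([
    [0.97, 0.03, 0.00, 0.00],
    [0.40, 0.45, 0.15, 0.00],
    [0.15, 0.20, 0.45, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

start = np.array([1.0, 0.0, 0.0, 0.0])         # a loan that is current today
remaining_life = 60                            # months of remaining life
dist = start @ np.linalg.matrix_power(T, remaining_life)
print(dict(zip(states, dist.round(4))))        # the "default" entry is the lifetime cumulative default rate
```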
Chapter 4
How to Select Your CECL Methodology
DOABLE, DEFENSIBLE CHOICES AMID THE CLUTTER
CECL advice is hitting financial practitioners from all sides. As an industry friend put it, “Now even my dentist has a CECL solution.”
With many high-level commentaries on CECL methodologies in publication (including RiskSpan’s), we introduce this specific framework to help practitioners eliminate ill-fitting methodologies until one remains per segment. We focus on the commercially available methods implemented in the CECL Module of the RiskSpan (RS) Edge Platform, enabling us to be precise about which methods cover which asset classes, require which data fields, and generate which outputs. Our decision framework covers each asset class under the CECL standard and considers data availability, budgetary constraints, value placed on precision, and audit and regulatory scrutiny.
PERFORMANCE ESTIMATION VS. ALLOWANCE CALCULATIONS
Before evaluating methods, it is clarifying to distinguish performance estimation methods from allowance calculation methods (or simply allowance calculations). Performance estimation methods forecast the credit performance of a financial asset over the remaining life of the instrument, and allowance calculations translate that performance forecast into a single allowance number.
There are only two allowance calculations allowable under CECL: the discounted cash flow (DCF) calculation (ASC 326-20-30-4), and the non-DCF calculation (ASC 326-20-30-5). Under the DCF allowance calculation, allowance equals amortized cost minus the present value of expected cash flows. The expected cash flows (the extent to which they differ from contractual cash flows) must first be driven by some performance estimation method. Under the non-DCF allowance calculation, allowance equals cumulative expected credit losses of amortized cost (roughly equal to future principal losses). These future losses of amortized cost, too, must first be generated by a performance estimation method.
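A minimal numerical illustration of the two allowance calculations, using a hypothetical single loan with assumed expected cash flows and losses produced by some performance estimation method:

```python
# Hypothetical single-loan illustration of the two CECL allowance calculations.
amortized_cost = 100.0
effective_rate = 0.05                    # effective interest rate used for discounting
expected_cash_flows = [5.0, 104.0]       # assumed annual expected cash flows from a performance forecast
expected_principal_losses = [0.0, 1.0]   # assumed amortized cost not expected to be collected

# DCF calculation (ASC 326-20-30-4): amortized cost minus the present value of expected cash flows.
pv = sum(cf / (1 + effective_rate) ** (t + 1) for t, cf in enumerate(expected_cash_flows))
allowance_dcf = amortized_cost - pv

# Non-DCF calculation (ASC 326-20-30-5): undiscounted cumulative expected losses of amortized cost.
allowance_non_dcf = sum(expected_principal_losses)

print(round(allowance_dcf, 2), round(allowance_non_dcf, 2))   # the two figures generally differ (see Chapter 5)
```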
Next, we show how to select performance estimation methods, then allowance calculations.
SELECTING YOUR PERFORMANCE ESTIMATION METHOD
Figure 1 below lays out the performance estimation methods available in RiskSpan’s CECL Module. We group methods into “Practical Methods” and “Premium Methods.”
In general, Practical Methods calculate average credit performance from a user-selected historical performance data set and extrapolate those historical averages – as adjusted by user-defined management adjustments for macroeconomic expectations and other factors – across the future life of the asset. When using a Practical Method, every instrument in the same user-defined segment will have the same allowance ratio.
Premium Methods involve statistical models built on large performance datasets containing instrument-level credit attributes, instrument-level performance outcomes, and contemporaneous macroeconomic data. While vendor-built Premium Methods come pre-built on large industry datasets, they can be tuned to institution-specific performance if the user supplies performance data. Premium Methods take instrument-level attributes and forward-looking macroeconomic scenarios as inputs and generate instrument-level, macro-conditioned results based on statistically valid methods. Management adjustments are possible, but the model results already reflect the input macroeconomic scenario(s).
Premium Methods do not require the customer to supply historical performance data. All methods require the customer to provide basic positional data as of the reporting date (outstanding balance amounts, the asset class of each instrument, etc.).
FIGURE 1 – PERFORMANCE ESTIMATION METHODS IN RISKSPAN’S CECL MODULE
To help customers choose their performance estimation methods, we walk them through the decision tree shown in Figure 3. These steps to select a performance estimation method should be followed for each portfolio segment, one segment at a time. As shown, the first step to shorten the menu of methods is to choose between Practical Methods and Premium Methods. Premium Methods available today in the RS Edge Platform include both methods built by RiskSpan (prefixed “RS”) and methods built by our partner, S&P Global Market Intelligence (“S&P”).
The choice between Premium Methods and Practical Methods is primarily a tradeoff between instrument-level precision and scientific incorporation of macroeconomic scenarios on the Premium side versus lower operational costs on the Practical side. Because Premium Methods produce instrument-specific forecasts, they can be leveraged to accelerate and improve credit screening and pricing decisions in addition to solving CECL. Practical Methods, by contrast, rely on user-defined management adjustments to reflect macroeconomic expectations, and such adjustments may not withstand the intense audit and regulatory scrutiny that larger institutions face.
Suppose that for a particular asset class, an institution wants a Premium Method. For most asset classes, RiskSpan’s CECL Module selectively features one Premium Method, as shown in Figure 1. In cases where the asset class is not covered by a Premium Method in Edge, the next question becomes: does a suitable, affordable vendor model exist? We are familiar with many models in the marketplace, and can advise on the benefits, drawbacks, and pricing of each. Vendor models come with explanatory documentation that institutions can review pre-purchase to determine comfort. Where a viable vendor model exists, we assist institutions by integrating that model as a new Premium Method, accessible within their CECL workflow. Where no viable vendor model exists, institutions must evaluate their internal historical performance data. Does it contain enough instruments, span enough time, and include enough fields to build a valid model? If so, we assist institutions in building custom models and integrating them within their CECL workflows. If not, it’s time to begin or to continue a data collection process that will eventually support modeling, and in the meantime, apply a Practical Method.
To choose among Practical Methods, we first distinguish between debt securities and other asset classes. Debt securities do not require internal historical data because more robust, relevant data is available from industry sources. We offer one Practical Method for each class of debt security, as shown in Figure 1.
For asset classes other than debt securities, the next step is to evaluate internal data. Does it represent enough instruments (segment-level summary data is fine for Practical Methods) and span enough time to drive meaningful results? If not, we suggest applying the Remaining Life Method, a method that has been showcased by regulators and that references Call Report data (which the Edge platform can filter by institution size and location). If adequate internal data exists, eliminate methods that are not asset class-appropriate (see Figure 1) or that require specific data fields the institution lacks. Figure 2 summarizes data requirements for each Practical Method, with a tally of required fields by field type. RiskSpan can provide institutions with detailed data templates for any method upon request. From among the remaining Practical Methods, we recommend institutions apply this hierarchy:
- Vintage Loss Rate: This method makes the most of recent observations and datasets that are shorter in timespan, whereas the Snapshot Loss Rate requires frozen pools to age substantially before counting toward historical performance averages. The Vintage Loss Rate explicitly considers the age of outstanding loans and leases and requires relatively few data fields.
- Snapshot Loss Rate: This method has the drawbacks described above, but for well-aged datasets produces stable results and is a very intuitive and familiar method to financial institution stakeholders.
- Remaining Life: This method ignores the effect of loan seasoning on default rates and requires user assumptions about prepayment rates, but it has been put forward by regulators and is a necessary and defensible option for institutions who lack the data to use the methods above.
FIGURE 2 – DATA REQUIREMENTS FOR PRACTICAL METHODS
FIGURE 3 – METHODOLOGY SELECTION FRAMEWORK
SELECTING YOUR ALLOWANCE CALCULATION
After selecting a performance estimation method for each portfolio segment, we must select our corresponding allowance calculations.
Note that all performance estimation methods in RS Edge generate, among their outputs, undiscounted expected credit losses of amortized cost. Therefore, users can elect the non-DCF allowance calculation for any portfolio segment regardless of the performance estimation method. Figure 5 shows this.
A DCF allowance calculation requires the elements shown in Figure 4. Among the Premium (performance estimation) Methods, RS Resi, RS RMBS, and RS Structured Finance require contractual features as inputs and generate among their outputs the other elements of a DCF allowance calculation. Therefore, users can elect the DCF allowance calculation in combination with any of these methods without providing additional inputs or assumptions. For these methods, the choice between the DCF and non-DCF allowance calculation often comes down to anticipated impact on allowance level.
The remaining Premium Methods to discuss are the S&P C&I method – which covers all corporate entities, financial and non-financial, and applies to both loans and bonds – and the S&P CRE method. These methods do not require all the instruments’ contractual features as inputs (an advantage in terms of reducing the input data requirements). They do project periodic default and LGD rates, but not voluntary prepayments or liquidation lags. Therefore, to use the DCF allowance calculation in combination with the S&P C&I and CRE performance estimation methods, users provide additional contractual features as inputs and voluntary prepayment rate and liquidation lag assumptions. The CECL Module’s cash flow engine then integrates the periodic default and LGD rates produced by the S&P C&I and CRE methods, together with user-supplied contractual features and prepayment and liquidation lag assumptions, to produce expected cash flows. The Module discounts these cash flows according to the CECL requirements and differences the present values from amortized cost to calculate allowance. In considering this DCF allowance calculation with the S&P performance estimation methods, users typically weigh the impact on allowance level against the task of supplying the additional data and assumptions.
To use a DCF allowance calculation in concert with a Practical (performance estimation) Method requires the user to provide contractual features (up to 20 additional data fields), liquidation lags, as well as monthly voluntary prepayment, default, and LGD rates that reconcile to the cumulative expected credit loss rate from the performance estimation method. This makes the allowance calculation a multi-step process. It is therefore usually simpler and less costly overall to use a Premium Method if the institution wants to enable a DCF allowance calculation. The non-DCF allowance calculation is the natural complement to the Practical Methods.
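As a hedged sketch of one piece of that multi-step process, the snippet below solves for a constant monthly default rate that reconciles to a cumulative expected credit loss rate produced by a Practical Method, under simplified assumptions (interest-only amortization, constant monthly prepayment, user-supplied LGD). All figures are hypothetical.

```python
from scipy.optimize import brentq

def cumulative_loss(smm_default, smm_prepay, lgd, term_months):
    """Cumulative principal loss as a fraction of original balance for a loan with
    constant monthly default and prepayment rates (simplified, interest-only)."""
    survival = (1 - smm_default) * (1 - smm_prepay)
    exposure = sum(survival ** m for m in range(term_months))   # balance-weighted months of exposure
    return lgd * smm_default * exposure

# Solve for the monthly default rate that reconciles to a 2.0% lifetime loss rate
# produced by, say, a Vintage Loss Rate calculation (all inputs are hypothetical).
target_loss, lgd, smm_prepay, term = 0.02, 0.25, 0.01, 60
smm_default = brentq(lambda d: cumulative_loss(d, smm_prepay, lgd, term) - target_loss, 1e-8, 0.5)
print(f"reconciled monthly default rate: {smm_default:.4%}")
```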
FIGURE 4 – ELEMENTS OF A DCF ALLOWANCE CALCULATION
FIGURE 5 – ALLOWANCE CALCULATIONS COMPATIBLE WITH EACH PERFORMANCE ESTIMATION METHOD
Once you have selected a performance estimation method and an allowance calculation for each segment, you can begin the next phase: comparing modeled results to expectations and historical performance, and tuning model settings and management inputs accordingly. We are available to discuss CECL methodology further with you; don’t hesitate to get in touch!
Chapter 5
DCF vs. Non-DCF Allowance
MYTH AND REALITY
FASB’s CECL standard allows institutions to calculate their allowance for credit losses as either “the difference between the amortized cost basis and the present value of the expected cash flows” (ASC 326-20-30-4) or “expected credit losses of the amortized cost basis” (ASC 326-20-30-5). The first approach is commonly called the discounted cash flow or “DCF approach” and the second approach the “non-DCF approach.” In the second approach, the allowance equals the undiscounted sum of the amortized cost basis projected not to be collected. For the purposes of this post, we will equate amortized cost with unpaid principal balance.
A popular misconception – even among savvy professionals – is that a DCF-based allowance is always lower than a non-DCF allowance given the same performance forecast. In fact, a DCF allowance is sometimes higher and sometimes lower than a non-DCF allowance, depending upon the remaining life of the instrument, the modeled recovery rate, the effective interest rate (EIR), and the time from default until recovery (liquidation lag). Below we will compare DCF and non-DCF allowances while systematically varying these key differentiators.
Our DCF allowances reflect cash inflows that follow the SIFMA standard formulas. We systematically vary time to maturity, recovery rate, liquidation lag and EIR to show their impact on DCF vs. non-DCF allowances (see Table 1 for definitions of these variables). We hold default rate and voluntary prepayment rate constant at reasonable levels across the forecast horizon. See Table 2 for all loan features and behavioral assumptions held constant throughout this exercise.
For clarity, we reiterate that the DCF allowances we will compare to non-DCF allowances reflect amortized cost minus discounted cash inflows, per ASC 326-20-30-4. A third approach, which is unsound and therefore excluded, is the discounting of accounting losses. This approach will understate expected credit losses by using the interest rate to discount principal losses while ignoring lost interest itself.
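The cash flow engine behind the exhibits below is not reproduced here, but the stylized sketch that follows captures the same mechanics under simplified assumptions: an interest-only loan, constant monthly default and prepayment rates, and recoveries arriving after a fixed liquidation lag. It is illustrative only and does not implement the SIFMA standard formulas.

```python
import numpy as np

def allowances(term_months, recovery_rate, liq_lag_months, eir,
               smm_default=0.005, smm_prepay=0.010, balance=100.0):
    """Compare DCF and non-DCF allowances for a stylized interest-only loan."""
    r = eir / 12.0
    cash = np.zeros(term_months + liq_lag_months + 1)         # cash inflows indexed by month
    lost_principal, bal = 0.0, balance

    for m in range(1, term_months + 1):
        defaults = bal * smm_default
        surviving = bal - defaults
        cash[m] += surviving * r                              # interest on the surviving balance
        prepays = surviving * smm_prepay
        cash[m] += prepays                                    # voluntary principal prepayments
        cash[m + liq_lag_months] += defaults * recovery_rate  # recovery proceeds after the liquidation lag
        lost_principal += defaults * (1 - recovery_rate)      # principal never collected
        bal = surviving - prepays

    cash[term_months] += bal                                  # scheduled principal at maturity

    pv = sum(cf / (1 + r) ** m for m, cf in enumerate(cash))
    return balance - pv, lost_principal                       # (DCF allowance, non-DCF allowance)

# Example: 60-month loan, 75% recovery rate, 12-month liquidation lag, 6% EIR.
dcf, non_dcf = allowances(60, 0.75, 12, 0.06)
print(f"DCF {dcf:.2f} vs non-DCF {non_dcf:.2f} per 100 of amortized cost")
```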
TABLE 1 – KEY DRIVERS OF DCF VS. NON-DCF ALLOWANCE DIFFERENCES (SYSTEMATICALLY VARIED BELOW)
TABLE 2 – LOAN FEATURES AND BEHAVIORAL ASSUMPTIONS HELD CONSTANT
Figure 1 compares DCF versus non-DCF allowances. It is organized into nine tables, covering the landscape of loan characteristics that drive DCF vs. non-DCF allowance differences. The cells of the tables show DCF allowance minus Non-DCF allowance in basis points. Thus, positive values mean that the DCF allowance is greater.
- Tables A, B and C show loans with 100% recovery rates. For such loans, ultimate recovery proceeds match exposure at default. Under the non-DCF approach, as long as recovery proceeds eventually cover principal balance at the time of default, allowance will be zero. Accordingly, the non-DCF allowance is 0 in every cell of tables A, B and C. Longer liquidation lags, however, diminish present value and thus increase DCF allowances. The greater the discount rate (the EIR), the deeper the hit to present value. Thus, the DCF allowance increases as we move from the top-left to the bottom-right of tables A, B and C. Note that even when liquidation lag is 0, 100% recovery still excludes the final month’s interest, and a DCF allowance (which reflects total cash flows) will accordingly reflect a small hit. Tables A, B and C differ in one respect – the life of the loan. Longer lives translate to greater total defaulted dollars, greater amounts exposed to the liquidation lags, and greater DCF allowances.
- Tables G, H and I show loans with 0% recovery rates. While 0% recovery rates may be rare, it is instructive to understand the zero-recovery case to sharpen our intuitions around the comparison between DCF and non-DCF allowances. With zero recovery proceeds, the loans produce only monthly (or periodic) payments until default. Liquidation lag, therefore, is irrelevant. As long as the EIR is positive and there are defaults in payment periods besides the first, the present value of a periodic cash flow stream (using EIR as the discount rate) will exceed cumulative principal collected. Book value minus the present value of the periodic cash flow stream, therefore, will be less than the cumulative principal not collected, and thus DCF allowance will be lower. Appendix A explains why this is the case. As Tables G, H and I show, the advantage (if we may be permitted to characterize a lower allowance as an advantage) of the DCF approach on 0% recovery loans is greater with greater discount rates and greater loan terms.
- Tables D, E and F show a more complex (and more realistic) scenario where the recovery rate is 75% (loss-given-default rate is 25%). Note that each cell in Table D falls in between the corresponding values from Table A and Table G; each cell in Table E falls in between the corresponding values from Table B and Table H; and each cell in Table F falls in between the corresponding values from Table C and Table I. In general, we can see that long liquidation lags will hurt present values, driving DCF allowances above non-DCF allowances. Short (zero) liquidation lags allow the DCF advantage from the periodic cash flow stream (described above in the comments about Tables G, H and I) to prevail, but the size of the effect is much smaller than with 0% recovery rates because allowances in general are much lower. With moderate liquidation lags (12 months), the two approaches are nearly equivalent. Here the difference is made by the loan term, where shorter loans limit the periodic cash flow stream that advantages the DCF allowances, and longer loans magnify the impact of the periodic cash flow stream to the advantage of the DCF approach.
FIGURE 1 – DCF ALLOWANCE RELATIVE TO NON-DCF ALLOWANCE (DIFFERENCE IN BASIS POINTS)
Conclusion
- Longer liquidation lags will increase DCF allowances relative to non-DCF allowances as long as recovery rate is greater than 0%.
- Greater EIRs will magnify the difference (in either direction) between DCF and non-DCF allowances.
- At extremely high recovery rates, DCF allowances will always exceed non-DCF allowances; at extremely low recovery rates, DCF allowances will always be lower than non-DCF allowances. At moderate recovery rates, other factors (loan term and liquidation lag) make the difference as to whether DCF or non-DCF allowance is higher.
- Longer loan terms both a) increase allowance in general, by exposing balances to default over a longer time horizon; and b) magnify the significance of the periodic cash flow stream relative to the liquidation lag, which advantages DCF allowances.
- Where recovery rates are extremely high (and so non-DCF allowances are held low or to zero) the increase to defaults from longer loan terms will drive DCF allowances further above non-DCF allowances.
- Where recovery rates are moderate or low, the increase to loan term will lower DCF allowances relative to non-DCF allowances.
Note that we have not specified the asset class of our hypothetical instrument in this exercise. Asset class by itself does not influence the comparison between DCF and non-DCF allowances. However, asset class (for example, a 30-year mortgage secured by a primary residence, versus a five-year term loan secured by business equipment) does influence the variables (loan term, recovery rate, liquidation lag, and effective interest rate) that drive DCF vs. non-DCF allowance differences. Knowledge of an institution’s asset mix would enable us to determine how DCF vs. non-DCF allowances will compare for that portfolio.
Chapter 6
Stakeholders
WHAT CECL MEANS TO INVESTORS
For many institutions, CECL will mean a one-time reduction in book equity and lower stated earnings during periods of portfolio growth. These reductions occur because CECL implicitly double-counts credit risk from the time of loan origination, as we will meticulously demonstrate. But for investors, will the accounting change alter the value of your shares?
THREE DISTINCT MEASURES OF VALUE
To answer this question well, we need to parse three distinct measures of value:
- Book Value: This is total shareholders’ equity as reported in financial reports like 10-Ks and annual reports prepared in accordance with U.S. GAAP.
- Current Market Value (also known as Market Cap): Current share price multiplied by the number of outstanding shares. This is the market’s collective opinion of the value of an institution. It could be very similar to, or quite different from, book value, and may change from minute to minute.
- Intrinsic Value (also known as Fundamental Value or True Value): The price that a rational investor with perfect knowledge of an institution’s characteristics would be willing to pay for its shares. It is by comparing an estimate of intrinsic value versus current market value that we deem a stock overpriced or underpriced. Investors with a long-term interest in a company should be concerned with its intrinsic or true value.
HOW DOES AN ACCOUNTING CHANGE AFFECT EACH MEASURE OF VALUE
Accounting standards govern financial statements, which investors then interpret. An informed, rational investor will “look through” any accounting quirk that distorts the true economics of an enterprise. Book value, therefore, is the only measure of value that an accounting change directly affects.
An accounting change may indirectly affect the true value of a company if costly regulations kick in as a result of a lower book value or if the operational cost of complying with the new standard is cumbersome. These are some of the risks to fundamental value from CECL, which we discuss later, along with potential mitigants.
KEY FEATURE OF CECL: DOUBLE-COUNTING CREDIT RISK
The single-most important thing for investors to understand about CECL is that it double-counts the credit risk of loans in a way that artificially reduces stated earnings and the book values of assets and equity at the time a loan is originated. It is not the intent of CECL to double-count credit risk, but it has that effect, as noted by the two members of the Financial Accounting Standards Board (FASB) who dissented from the rule. (CECL was adopted by a 5-2 vote.)
Consider this simple example of CECL accounting: A bank makes a loan with an original principal balance of $100. CECL requires the bank to recognize an expense equal to the present value of expected credit lossesi and to record a credit allowance that reduces net assets by this same amount. Suppose we immediately reserve our $100 loan down to a net book value of $99 and book a $1 expense. Why did we even make the loan? Why did we spend $100 on something our accountant says is worth $99? Is lending for suckers?
Intuitively, consider that to make a loan of $100 is to buy a financial asset for a price of $100. If other banks would have made the same loan at the same interest rate (which is to say, they would have paid the same price for the same asset), then our loan’s original principal balance was equal to its fair market value at the time of origination. It is critical to understand that an asset’s fair market value is the price which market participants would pay after considering all of the asset’s risks, including credit risk. Thus, any further allowance for credit risk below the original principal balance is a double-counting of credit risk.
Here’s the underlying math: Suppose the $100 loan is a one-year loan, with a single principal and interest payment due at maturity. If the note rate is 5%, the contractual cash flow is $105 next year. This $105 is the most we can receive; we receive it if no default occurs. What is the present value of the $105 we hope to receive? One way to determine it is to discount the full $105 amount by a discount rate that reflects the risk of nonpayment. We established that 5% is the rate of return that banks are requiring of borrowers presenting similar credit risk, so an easy present value calculation is to discount next year’s contractual $105 cash flow by the 5% contractual interest rate, i.e., $105 / (1 + 5%) = $100. Alternatively, we could reduce the contractual cash flow of $105 by some estimate of credit risk. Say we estimate that if we made many loans like this one, we would collect an average of $104 per loan. Our expected future cash flow, then, is $104. If we take the market value of $100 for this loan as an anchor point, then the market’s required rate of return for expected cash flows must be 4%. ($104 / (1 + 4%) = $100.) It is only sensible that the market requires a lower rate of return on cash flows with greater certainty of collection.
What the CECL standard does is require banks to discount the lower expected cash flows at the higher contractual rate (or to use non-discounting techniques that have the same effect). This would be like discounting $104 at 5% and calculating a fair market value for the asset of $104 / (1 + 5%) ≈ $99. This (CECL’s) method double-counts credit risk by $1. The graph below shows the proper relationship between cash flow forecasts and discount rates when performing present value calculations, and shows how CECL plots off the line.
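The arithmetic of the double-count can be checked in a few lines (figures taken from the example above):

```python
contractual_cf = 105.0   # principal plus interest due in one year at the 5% note rate
expected_cf = 104.0      # contractual cash flow net of expected credit losses

# Consistent pairings: risky cash flow with the risky rate, expected cash flow with the lower rate.
print(round(contractual_cf / 1.05, 2))   # 100.0 -> fair value from the contractual cash flow
print(round(expected_cf / 1.04, 2))      # 100.0 -> fair value from the expected cash flow
# CECL-style pairing: expected cash flow discounted at the contractual rate.
print(round(expected_cf / 1.05, 2))      # 99.05 -> credit risk counted twice, about $1 below fair value
```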
FASB Vice Chairman James Kroeker and Board member Lawrence Smith described the double-counting issue in their dissent to the standards update: “When performing a present value calculation of future cash flows, it is inappropriate to reflect credit risk in both the expected future cash flows and the discount rate because doing so effectively double counts the reflection of credit risk in that present value calculation. If estimates of future cash flows reflect the risk of nonpayment, then the discount rate should be closer to risk-free. If estimates of future cash flows are based on contractual amounts (and thus do not reflect a nonpayment risk), the discount rate should be higher to reflect assumptions about future defaults.” Ultimately, the revised standard “results in financial reporting that does not faithfully reflect the economics of lending activities.”ii
The Accounting Standards Update notes two tangential counterpoints to Kroeker and Smith’s dissent. The first point is that banks would find alternative methods challenging, which may be true but is irrelevant to the question of whether CECL faithfully reflects true economics. The second point is that the valuation principles Kroeker and Smith lay out are for fair value estimates, whereas the accounting standard is not intended to produce fair value estimates. This concedes the only point we are trying to make, which is that the accounting treatment deviates (downwardly, in this case) from the fundamental and market value that an investor should care about.
HOW CECL AFFECTS EACH MEASURE OF VALUE
As noted previously, the direct consequences of CECL will hit book value. Rating agency Fitch estimates that the initial implementation of CECL would shave 25 to 50 bps off the aggregate tangible common equity ratio of US banks if applied in today’s economy. The ongoing impact of CECL will be less dramatic because the annual impact to stated earnings is just the year-over-year change in CECL. Still, a growing portfolio would likely add to its CECL reserve every year.iii
There are many indirect consequences of CECL that may affect market and true value:
- Leverage: The combination of lower book values from CECL with regulations that limit leverage on the basis of book value could force some banks to issue equity or retain earnings to de-leverage their balance sheet. Consider these points:
- There is a strong argument to be made to regulators that the capital requirements that pre-dated CECL, if not adjusted for the more conservative asset calculations of CECL, will have become more conservative de facto than they were meant to be. There is no indication that regulators are considering such an adjustment, however. A joint statement on CECL from the major regulators tells financial institutions to “[plan] for the potential impact of the new accounting standard on capital.”
- Withholding a dividend payment does not automatically reduce a firm’s true value. If the enterprise can put retained earnings to profitable use, the dollar that wasn’t paid out to investors this year can appreciate into a larger payment later.
- The deeper threat to value (across all three measures) comes if regulations force a permanent de-leveraging of the balance sheet. This action would shift the capital mix away from tax-advantaged debt and toward equity, increase the after-tax cost of capital and decrease earnings and cash flow per share, all else equal.v Because banks face the shift to CECL together, however, they may be able to pass greater capital costs on to their borrowers in the form of higher fees or higher interest rates.
- Banks can help themselves in a variety of ways. The more accurate a bank’s loss forecasts prove to be, the more stable its loss reserve will be, and the less likely regulators are to require additional capital buffers. Management can also disclose whether their existing capital buffers are sufficient to absorb the projected impact of CECL without altering capital plans. Conceivably, management could elect to account for its loans under the fair value option to avoid CECL’s double-counting bias, but this would introduce market volatility to stated earnings which could prompt its own capital buffers.
- Investor Perception of Credit Risk: Investors’ perception of the credit risk a bank faces affects its market value. If an increase in credit allowance due to CECL causes investors to worry that a bank faces greater credit risk than they previously understood, the bank’s market value will fall on this reassessment. On the other hand, if investors have independently assessed the credit risk borne by an institution, a mere change in accounting treatment will not affect their view. An institution’s true value comes from the cash flows that a perfectly informed investor would expect. Unless CECL changes the kinds of loans an institution makes or the securities it purchases, its true credit risk has not changed, and nothing the accounting statements say can change that.
- Actual Changes in Credit Risk: Some banks may react to CECL by shifting their portfolio mix toward shorter-duration or less credit-risky investments, in an effort to mitigate CECL’s impact on their book value. If underwriting unique and risky credits was a core competency of these banks, and they shift toward safer assets with which they have no special advantage, this change could hurt their market and fundamental value.
- Volatility: ABA argues that the inherent inaccuracies of forecasts over long time horizons will increase the volatility of the loss reserve under CECL.vi Keefe, Bruyette & Woods (KBW) goes the other way, writing that CECL should reduce the cyclicality of stated earnings.vii KBW’s point can loosely be understood by considering that long-term averages are more stable than short-term averages, and short-term averages drive existing loss reserves; a simple simulation after this list illustrates the point. Certainly, if up-front CECL estimates are accurate, even major swings in charge-offs can be absorbed without a change in the reserve as long as the pattern of charge-offs evolves as expected. While cash flow volatility would hurt fundamental value, the concern with volatility of stated earnings is that it could increase the capital buffers regulators require.
- Transparency: All else equal, investors prefer a company whose risks are more fully and clearly disclosed. KBW reasons that the increased transparency required by CECL will have a favorable impact on financial stock prices.
- Comparability Hindered: CECL allows management to choose from a range of modeling techniques and even to choose the macroeconomic assumptions that influence its loss reserve, so long as the forecast is defensible and used firm-wide. Given this flexibility, two identical portfolios could show different loss reserves based on the conservatism or aggressiveness of management. This situation will make peer comparisons impossible unless disclosures are adequate and investors put in the work to interpret them. Management can help investors understand, for example, whether its loss reserve is larger because its economic forecast is more conservative, as opposed to because its portfolio is riskier.
- Operational Costs: Complying with CECL requires data capacity and modeling resources that could increase operational costs. The American Bankers Association notes that such costs could be “huge.”ix Management can advise stakeholders whether it expects CECL to raise its operational costs materially. If compliance costs are material, they will affect all measures of value to the extent that they cannot be passed on to borrowers. As noted earlier, the fact that all US financial institutions face the shift to CECL together increases the likelihood of their being able to pass costs on to borrowers.
- Better Intelligence: Conceivably, the enhancements to data collection and credit modeling required by CECL could improve banks’ ability to price loans and screen credit risks. These effects would increase all three measures of value.
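As referenced in the Leverage discussion above, the short sketch below works through a stylized after-tax cost of capital calculation. The capital mixes, rates, and tax rate are hypothetical and are used only to show the direction of the effect when de-leveraging replaces tax-advantaged debt with equity.

```python
# Hypothetical illustration of the de-leveraging point in the Leverage bullet above.
# After-tax WACC = E/V * cost_of_equity + D/V * cost_of_debt * (1 - tax_rate).
# Rates and capital mixes below are invented for illustration only.

def after_tax_wacc(equity_share: float, cost_of_equity: float,
                   cost_of_debt: float, tax_rate: float) -> float:
    debt_share = 1.0 - equity_share
    return equity_share * cost_of_equity + debt_share * cost_of_debt * (1 - tax_rate)

cost_of_equity, cost_of_debt, tax_rate = 0.10, 0.04, 0.21

for equity_share in (0.10, 0.12, 0.15):   # forced de-leveraging raises the equity share
    wacc = after_tax_wacc(equity_share, cost_of_equity, cost_of_debt, tax_rate)
    print(f"Equity share {equity_share:.0%}: after-tax cost of capital {wacc:.2%}")

# Because the after-tax cost of debt (3.16% here) is below the cost of equity (10%),
# shifting the funding mix toward equity raises the blended cost of capital.
```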
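As referenced in the Volatility discussion above, the simulation below uses made-up loss rates to compare a reserve driver based on a short trailing average of losses with one based on a long trailing average. The longer window produces the more stable series, which is the intuition behind KBW’s argument.

```python
# Hypothetical simulation of the Volatility point above: reserves anchored to
# long-run average loss rates move less than reserves driven by recent losses.
# The loss-rate dynamics are invented for illustration only.
import random
import statistics

random.seed(42)
# Simulate 30 years of annual loss rates that cycle between benign and stressed periods.
loss_rates = [max(0.0, random.gauss(0.01 if year % 10 < 7 else 0.03, 0.004))
              for year in range(30)]

short_window, long_window = 2, 10
short_avg = [statistics.mean(loss_rates[max(0, t - short_window):t])
             for t in range(short_window, 30)]
long_avg = [statistics.mean(loss_rates[max(0, t - long_window):t])
            for t in range(short_window, 30)]

print(f"Std. dev. of 2-year trailing average loss rate:  {statistics.pstdev(short_avg):.4f}")
print(f"Std. dev. of 10-year trailing average loss rate: {statistics.pstdev(long_avg):.4f}")
# The longer averaging window yields a markedly more stable reserve driver.
```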
CONCLUSION
CECL is likely to reduce the book value of most financial institutions. If regulators limit leverage because of lower book equity, or if the operational costs of CECL are material and cannot be passed on to borrowers, then market values and fundamental values will also sag. If banks react to the standard by pulling back from the kinds of loans that have been their core competency, this, too, will hurt fundamental value. On the positive side, the required investment in credit risk modeling offers banks the opportunity to better screen and price their loans.
Bank management can provide disclosures to analysts and investors to help them understand any changes to the bank’s loan profile, fee and interest income, capital structure and operational costs. Additionally, by optimizing the accuracy of its loss forecasts, management can contain the volatility of its CECL estimate and minimize the likelihood of facing further limitations on leverage.
Chapter 7
Resources
Financial institutions can partner with consulting firms to get ahead of the new CECL standard. A comprehensive CECL solution requires expertise from a wide range of disciplines, including data management, econometric and credit risk modeling, accounting, and model risk governance.
Different financial institutions will need more outside resources in some of these areas than in others. The ideal one-stop CECL partner, therefore, will have the breadth and depth of expertise for your financial institution to entrust it with substantial CECL-related work, but enough modularity in its offering that you pay only for the services you need.
A final consideration is whether a firm can provide the level of customization in CECL modeling that fits your circumstances, from an off-the-shelf model to a custom build.
EXTENSIVE EXPERTISE
While the CECL standard is new, the data management and modeling skills it requires are not, nor is the principle of aligning model assumptions across departments or the need to satisfy the demands of auditors and regulators. Consultants and industry veterans who are experienced at helping financial institutions solve such problems can help prepare you for CECL.
For a change as sweeping as CECL, it is important to partner with a firm whose accounting and regulatory expertise extends to the standard and its implications.
Beyond a strong understanding of the standard and its requirements, CECL demands expertise in the following areas:
DATA MANAGEMENT. Data management will be at the forefront of tackling the transition from an “incurred loss” approach to an “expected lifetime loss” approach. Institutions will need to store extensive data in a controlled environment that will be subject to heightened scrutiny because of its impact on financial reporting.
The process of gathering, consolidating, and organizing data from disparate systems can prove painful for most organizations. Financial institutions that need to build or enhance their database to support CECL estimates should seek a partner with data hosting platforms or data management experts who can deploy to the client site.
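As one illustration of the kind of controlled, loan-level history a CECL estimate typically draws on, the sketch below defines a minimal monthly loan-performance record. The field names and values are hypothetical and far from exhaustive; an actual data model would be tailored to the institution’s products and systems.

```python
# A minimal, hypothetical loan-level performance record of the kind a CECL
# data platform might store each month. Field names are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class LoanPerformanceRecord:
    loan_id: str
    as_of_date: date            # monthly snapshot date
    origination_date: date
    original_balance: float
    current_balance: float
    interest_rate: float
    remaining_term_months: int
    delinquency_status: str     # e.g., "current", "30-59", "60-89", "90+"
    credit_score: int           # borrower score refreshed at the snapshot
    charge_off_amount: float    # zero unless a loss was recognized this month
    recovery_amount: float

# Example snapshot row (all values invented):
record = LoanPerformanceRecord(
    loan_id="L-000123", as_of_date=date(2018, 6, 30),
    origination_date=date(2016, 3, 15), original_balance=250_000.0,
    current_balance=231_400.0, interest_rate=0.045, remaining_term_months=321,
    delinquency_status="current", credit_score=742,
    charge_off_amount=0.0, recovery_amount=0.0,
)
```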
MODELING & ANALYTICS. Models that translate raw data into value-added, decision-useful information can set leading entities apart from their peers both in terms of performance and in the eyes of regulators. CECL will require major upgrades and extensions to existing models or, in many cases, new models. A CECL model must forecast life-of-loan cash flows, considering both the credit characteristics of the loan portfolio and short-term and long-term economic forecasts. A vendor with credit and econometric modeling expertise can provide the necessary modeling help.
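As a purely illustrative sketch of what life-of-loan modeling can look like, the example below computes a lifetime expected loss as the sum of discounted PD × LGD × exposure over each remaining period, with the annual default probability scaled by a hypothetical macroeconomic adjustment during a reasonable-and-supportable forecast window and reverting to a long-run level thereafter. The structure, parameter names, and numbers are assumptions for illustration, not a prescribed CECL methodology.

```python
# Illustrative lifetime expected-loss calculation: sum over remaining periods of
# survival-weighted PD * LGD * exposure, discounted at the loan's rate. All
# inputs are hypothetical; CECL permits a range of methods and this is one sketch.

def lifetime_expected_loss(balance: float, annual_rate: float, remaining_years: int,
                           base_annual_pd: float, lgd: float,
                           macro_pd_multipliers: list[float]) -> float:
    """macro_pd_multipliers covers the 'reasonable and supportable' forecast
    horizon; beyond it the multiplier reverts to 1.0 (the long-run average)."""
    expected_loss = 0.0
    survival = 1.0                       # probability the loan is still performing
    exposure = balance                   # amortization ignored for simplicity
    for year in range(1, remaining_years + 1):
        multiplier = (macro_pd_multipliers[year - 1]
                      if year <= len(macro_pd_multipliers) else 1.0)
        pd_t = min(1.0, base_annual_pd * multiplier)
        discount = (1 + annual_rate) ** -year
        expected_loss += survival * pd_t * lgd * exposure * discount
        survival *= (1 - pd_t)
    return expected_loss

# Example: $200,000 loan, 5 years remaining, 1% baseline annual PD, 40% LGD,
# with defaults expected to run 50% and 20% above baseline in the next two years.
print(f"${lifetime_expected_loss(200_000, 0.05, 5, 0.01, 0.40, [1.5, 1.2]):,.0f}")
```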
CECL also marks an opportunity for firms of all sizes to address shortcomings in their analytic capabilities. Methods for analyzing big data, surveillance capabilities, and decision support visualization tools have improved dramatically and should now be viewed as “must-haves” for organizations interested in differentiating themselves from their competition. The ideal CECL partner will have analytics expertise along these lines.
RISK & CONTROLS ADVISORY. Meeting the CECL standard will involve operational changes. Models, assumptions, and datasets supporting CECL will be subject to a high level of scrutiny due to their impacts on financial reporting. A consulting firm with accounting expertise can aid in the interpretation of the new standard and in understanding requisite process changes.
Additionally, documentation will be a critical component to a company’s transition to the new standard. Being able to support new methodologies with thorough documentation will be central to avoiding regulatory turbulence.
MODULARITY
As noted previously, different financial institutions will need help with different aspects of CECL preparedness. The ideal consulting partner will have a flexible solution and service offering that can be targeted to meet the institution’s need.
In some cases it may be appropriate for an outside consulting firm to lead a specific activity, and in other cases to augment and complement full-time staff.
APPROPRIATE LEVELS OF CUSTOMIZATION
Do you need a credit model custom-built from your internal data, or will an off-the-shelf model do the job? A good CECL partner can help you think through these questions and provide a model with the appropriate level of customization.
As a CECL partner, RiskSpan can assist with data organization, portfolio stratification, model development, and loss reserving. CECL requires coordination among different departments. RiskSpan’s core competencies include finance and accounting, credit risk modeling, and technology and data infrastructure. Our firm can share expertise and align efforts across departments to solve challenges that might arise as a result of CECL requirements.
CECL INVOLVES WELL-STRUCTURED DATA AND MODELING. A FEW NOTES ON RISKSPAN’S EXPERTISE IN THESE AREAS:
- Modeling: Established frameworks and expertise in analyzing loan performance, developing custom models and integrating off-the-shelf solutions.
- Data Management: Domain knowledge in data intake, storage, and optimized extraction for loan performance analytics.
A typical CECL engagement roadmap spans the following activities:
- Project Management & Gap Assessment
- Calibrate and Deploy OTS (Off-The-Shelf) Model
- Assess Data
- Establish Database
- Collect Data Until Sample Spans Business Cycle
- Build Custom Model
- Document and Assess Accounting Process
- Validate Model (Independent Party)
- Monitor and Maintain Model