Patrick Weightman May 17, 2016

Preparing for CECL data requirements

The Financial Accounting Standards Board (FASB) is expected to release the final standard of the current expected credit loss (CECL) model in the first half of 2016. This model changes the guidance for the way banks account for their allowance for loan and lease losses (ALLL) and will include, among other provisions, forward-looking requirements, a longer loss horizon and removal of the “probable loss” threshold.

While bankers still have a number of questions and concerns about the proposed model and the effects of implementation, some industry professionals see it as an opportunity to refine and improve ALLL processes and methodologies to more accurately account for risk with an eye toward the future. These experts recognize that collecting accessible, loan-level data is the key step prior to building scenarios with more robust methodologies.

Range of Methodologies 

The CECL guidance is intentionally not prescriptive, which will allow institutions to use models that work best for their unique portfolio makeup. While there will likely be a range of methodologies used for CECL implementation and some may be better suited than others for certain institution sizes and portfolio types, the focus has been on three methods in particular: migration analysis, vintage analysis and probability of default/loss given default (PD/LGD). These three methodologies range in complexity and each requires unique data points.
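Of the three, PD/LGD is often described with the standard expected-loss relationship EL = PD x LGD x EAD. The sketch below illustrates that arithmetic at the loan level; the loan records, field layout, and rates are hypothetical illustrations, not a FASB-prescribed model, and a production CECL calculation would be considerably more involved.

```python
# Minimal sketch of a loan-level PD/LGD expected-loss calculation.
# All figures below are invented for illustration.

loans = [
    # (loan_id, exposure_at_default, probability_of_default, loss_given_default)
    ("A-1001", 250_000.0, 0.02, 0.45),
    ("A-1002", 100_000.0, 0.05, 0.60),
    ("A-1003", 500_000.0, 0.01, 0.35),
]

def expected_loss(ead: float, pd: float, lgd: float) -> float:
    """Expected loss for one loan: EL = PD x LGD x EAD."""
    return pd * lgd * ead

# Portfolio-level reserve estimate is the sum of loan-level expected losses.
portfolio_el = sum(expected_loss(ead, pd, lgd) for _, ead, pd, lgd in loans)
print(f"Portfolio expected loss: ${portfolio_el:,.2f}")
```

Note that each loan contributes independently, which is why loan-level (rather than pool-level) data makes this method feasible.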

The importance of having adequate data lies in the fact that CECL will require many institutions to adopt new methodologies, or at least make changes to their current methodology. Additionally, having ample access to historical data enables bankers to make more reasonable and supportable forecasts about future losses. In order to decide what changes are best, most institutions will run scenarios parallel to the institution’s current ALLL to see what is most accurate and what the potential impact may be for each method.

Put simply, if an institution does not possess the right data in an accessible format, it cannot run these scenarios or parallel calculations. With adequate data, banks have the flexibility to test multiple methodologies prior to CECL implementation.
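To make the scenario-testing idea concrete, a vintage analysis groups loans by origination year and compares cumulative loss rates across cohorts. The sketch below uses invented balances and charge-off figures purely for illustration; it is not a complete vintage model, which would also track losses by loan age.

```python
# Hedged sketch of a vintage analysis: aggregate originations and
# charge-offs by origination year ("vintage"), then compute each
# cohort's cumulative loss rate. All figures are invented.

from collections import defaultdict

# (origination_year, original_balance, cumulative_charge_offs_to_date)
loans = [
    (2012, 200_000.0, 4_000.0),
    (2012, 150_000.0, 1_500.0),
    (2013, 300_000.0, 3_000.0),
    (2013, 100_000.0, 0.0),
]

originated = defaultdict(float)
charged_off = defaultdict(float)
for year, balance, losses in loans:
    originated[year] += balance
    charged_off[year] += losses

for year in sorted(originated):
    rate = charged_off[year] / originated[year]
    print(f"{year} vintage: {rate:.2%} cumulative loss rate")
```

Running scenarios like this alongside the current ALLL calculation is only possible when origination dates and loss histories are archived at the loan level.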

In a recent Sageworks webinar, only 36% of 452 respondents from banks and credit unions indicated that they feel their data archives are sufficient for running CECL scenarios. Another 34% of respondents are actively working to fix data inadequacies, 22% don’t feel their data archives are sufficient, and 8% need help checking and addressing data issues. In other words, 64% of the participants are not yet ready to model scenarios for CECL.

To begin preparing for CECL, bankers should understand the data points required to run certain types of calculations. By making a shortlist of data needed for select calculations, they can check if they are missing any key data points, such as historical loss, migration analysis and vintage analysis. Other data considerations may need to be factored in, depending on the institution, distinct calculation and data available. For qualitative data, ample documentation will need to be on-hand to satisfy any questions posed by auditors or regulators.
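A shortlist check of this kind can be as simple as comparing each loan record against the fields a chosen calculation requires. In the sketch below, the required field names are hypothetical examples; the fields an institution actually needs depend on its methodology and systems.

```python
# Illustrative data-completeness check before scenario building.
# The required field names here are hypothetical examples only.

REQUIRED_FIELDS = {"loan_id", "origination_date", "original_balance",
                   "current_balance", "risk_rating", "charge_off_amount"}

def missing_fields(record: dict) -> set:
    """Return required fields that are absent or empty in a loan record."""
    return {f for f in REQUIRED_FIELDS
            if f not in record or record[f] in (None, "")}

# Example record missing two of the illustrative required fields.
record = {"loan_id": "A-1001", "origination_date": "2014-06-01",
          "original_balance": 250_000, "current_balance": 210_000}
print(sorted(missing_fields(record)))
```

Surveying the portfolio this way highlights gaps early, while there is still time to begin capturing the missing history.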

On top of archiving the right data, an institution also has to consider the quality of the information. Here are six key considerations when evaluating institution data quality:

  • Transparency. It is important to understand how and where an institution’s data is stored. Having a clear process will help minimize confusion when data is needed to perform analysis or calculations. If a third-party vendor is used, ensure accountability.
  • Granularity. CECL may require more loan-level data, depending on the changes to an institution’s processes and calculations. Ensure that the database can handle the increased volume as well as the ability to account for the full life of loans.
  • Accessibility. Ensure data is readily available in usable formats. Storing info across disparate systems or in unusable formats can cause efficiency challenges in the future.
  • Holistic data. Ensure that data is stored for the whole portfolio and not just for losses.
  • Frequency. Data should be updated frequently enough to accommodate institution needs (e.g., process daily, archive monthly).
  • Security. If the database is integrated with other solutions, it is important to ensure that data transferred between a core system and a vendor is secure. Data should be backed up frequently and have redundancy to minimize risk. 


While CECL implementation may seem like an obstacle to tackle in the distant future, it is imperative to identify and address any data inadequacies now so that proper attention can be paid to scenario building after the guidance is released. The good news is that it isn’t too late to begin collecting loan-level data – as much and as far back as possible.

Mr. Weightman is financial institutions marketing manager for Raleigh, N.C.-based Sageworks. He can be reached by email at patrick.weightman@sageworks.com

BAI Banking Strategies
