
The bank data ABCs of M&As: How to merge information between legacy systems

Dec 1, 2017 / Consumer Banking / Technology

As 2017 nears its end, there’s no sign mergers and acquisitions will slow down. The underlying reasons are many: bankers now navigate a new regulatory environment with more confidence; bank stocks continue to rise; the economic outlook looks positive; and the Trump administration intends to reduce regulations and corporate taxes. All this suggests M&A activity will reach new heights.

As a result, more financial institutions face a unique challenge: how to manage the huge influx of data that comes when two institutions combine. Often that means business as usual, with most banks settling on one of three options:

  • do nothing and leave the data in a legacy system,
  • dump the data to an external drive, or
  • let the core provider handle the data conversion, with the financial institution mapping all the legacy data into the new system to ensure compatibility.

All of this can create unexpected cost, disruption and risk. Fortunately, many bank executives now know proven alternatives can optimize this challenging-but-crucial process—and position banks for short- and long-term success. All it takes is a more strategic, data-driven approach.

Tradition, but at what cost?

Traditionally, legacy vendors will complete a full data conversion—even when unnecessary. That costs banks on average anywhere from $40,000 to $250,000, and the conversion itself can disrupt the bank’s operating environment, resulting in further costs and negative customer impact. Banks that simply leave data in the legacy system will also shoulder many avoidable costs: maintaining the legacy servers (licensing costs, upgrades, support to address technical issues, etc.) usually averages between $10,000 and $100,000 per year. The legacy provider may also charge annual storage and research fees on top of that, also in the $10,000-$100,000 range.

Risk also represents a major downside of traditional M&A data management. As it can’t be audited or updated, unsecured data on old systems or external media can potentially fail, disappear, suffer damage or fall into the wrong hands. Neither indexed nor converted, this data can cost banks even more money because it’s hard to locate.

Customers can take a hit, too. While customers usually access just the last 12-18 months of information, banks must provide historical data for 7-10 years past its useful life. When customers need to access it, banks might need much more time to locate it—if they can at all. Incomplete or incorrectly converted data can also hinder employees’ efforts to address customer needs.

In the worst cases, data breaches or mismanagement could lead to regulatory and audit issues or even lawsuits. That creates even more expense on the customer service side and damages the bank’s reputation.

While banks may find it easier and more comfortable in the short term to stick with familiar approaches, they also incur much more expense and risk. So where should banks begin?

Strategy strata

Banks in the M&A process have several data management options. To select the best one, banks first need to clean and organize existing and incoming data from the acquired institution. This avoids introducing suspect or unknown issues into the current system.

First, segment the data by main type—for example, checks, statements, loan documentation or financial reports—a step you can complete quickly and easily. You can break segments down even further by date (e.g., check statements from the last 12-18 months versus those prior).

Once data has been segmented on paper or in the data project plan, give an overall rating to each type. The rating system can be unique to each bank, but it should take the following into account:

  • What type of data is it?
  • Who needs access?
  • How critical is the data?
  • Do we need it for processes/functions in the bank itself, or simply for regulatory purposes?
  • How sensitive is the data?
  • Do we need to keep it?

After scoring, banks should group data into three categories:

  • data they need immediate access to (signature cards or the last 12 months of check images),
  • data they must keep but access infrequently (closed loans, internal or system reports, check images older than 12 months, etc.), and
  • data that’s nice to have but not required (cold reports, miscellaneous imaged documents and reports).
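
For teams that want to make the scoring step concrete, the rating and grouping above can be captured in a simple spreadsheet or script. The minimal Python sketch below is one hypothetical way to do it: each data segment is scored against the questions above and bucketed into one of the three categories. The field names, scales and thresholds are illustrative assumptions only; each bank should set its own rating scale with input from compliance and the business units.

```python
# Hypothetical sketch: score legacy data segments and bucket them into the
# three categories described above. Scales and thresholds are illustrative
# only; they are not a prescribed industry standard.

from dataclasses import dataclass


@dataclass
class DataSegment:
    name: str               # e.g. "check images, last 12 months"
    criticality: int        # 1 (low) to 5 (high): how critical is the data?
    access_frequency: int   # 1 (rarely accessed) to 5 (accessed daily)
    regulatory_only: bool   # kept purely for regulatory/retention purposes?
    sensitivity: int        # 1 (public) to 5 (highly sensitive); drives handling controls


def categorize(segment: DataSegment) -> str:
    """Assign a segment to one of the three access categories."""
    score = segment.criticality + segment.access_frequency
    if score >= 8 and not segment.regulatory_only:
        return "immediate access"          # candidate for conversion into the core
    if segment.regulatory_only or score >= 4:
        return "keep, infrequent access"   # candidate for a low-cost archive
    return "nice to have"                  # minimal-cost storage or purge review


segments = [
    DataSegment("signature cards", 5, 5, False, 5),
    DataSegment("check images older than 12 months", 3, 2, True, 4),
    DataSegment("cold reports", 1, 1, False, 2),
]

for s in segments:
    print(f"{s.name}: {categorize(s)}")
```

Whatever form the scoring takes, the point is to make the access decision for each segment explicit and documentable before any conversion work begins.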

If you work with a conversion or migration vendor, they can easily categorize data once you and your business units define and document the segments. These categories will determine how the data is handled; bank data compliance officers may already have this information.

Your bank’s best blueprint

With data segmented and grouped, banks can make more informed choices about the best, most cost-effective data management blueprint during a merger or acquisition.

If a bank needs immediate access to all legacy data, it’s wise to undergo a full conversion and map it into production. Banks can mitigate costs and ensure minimal disruption by using a third-party specialist with industry knowledge and experience working with many different systems—including those the bank migrates to and from. A specialist will also have the bank’s best interests in mind, since its business goals are more objective.

Most likely, banks will need instant access to just a small part of their overall data archive and infrequent access to the rest. If so, employ a hybrid approach: convert the portion of data needed for immediate access into the production system and migrate the rest to a less expensive archive. This ensures your bank only pays to convert what you actually need within the core system. Quickly migrating the other data minimizes costs and disruptions.
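
As a rough illustration of the hybrid approach, the short sketch below (reusing the hypothetical category labels from the earlier example) maps each category to a disposition: convert into the production core, migrate to a low-cost archive, or hold for minimal-cost storage and purge review. The destinations are placeholder assumptions; the actual routing should come out of the plan agreed with your conversion vendor and compliance team.

```python
# Hypothetical sketch of the hybrid approach: route each category from the
# scoring step to a disposition. Destinations are illustrative placeholders.

DISPOSITION = {
    "immediate access": "convert into the production core system",
    "keep, infrequent access": "migrate to a low-cost archive",
    "nice to have": "minimal-cost storage or purge review",
}


def migration_plan(categorized_segments: dict[str, str]) -> None:
    """Print a simple conversion/migration plan, one line per segment."""
    for segment_name, category in categorized_segments.items():
        print(f"{segment_name} -> {DISPOSITION[category]}")


migration_plan({
    "signature cards": "immediate access",
    "check images older than 12 months": "keep, infrequent access",
    "cold reports": "nice to have",
})
```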

From intimidation to optimization

Data management needs that result from M&As can be intimidating. But it’s crucial to address those needs head on and optimize the process. Doing so not only reduces upfront and long-term data storage costs; it also heads off retrieval frustrations, potential disruptions and future consequences. Smart data management during the M&A process positions banks for across-the-board success. Surely, the numbers will back this up, even as you back up the numbers.


Kris Bishop is the founder and president of Integrated Legacy Solutions (ILS).