Getting Big Data Right for Smaller Banks
Over the last year, we have spent time visiting small and mid-size banks across the U.S. to understand their current approach to managing their retail banking operations and their marketing approach to cross-sell, up-sell and retention. We found that while the larger banks have made significant headway in using big data and predictive analytics to improve sales and operating performance, the smaller banks have a long way to go.
Generally speaking, banks with fewer than 20 branches remain in the Excel world, and Excel can only take you so far. Those with between 20 and 100 branches have experimented with some sort of business intelligence (BI) solution, but without any deployment of predictive analytics. Their use of BI solutions has been spotty, as they have a limited number of employees who know how to use the technology, and adding resources or competencies is prohibitively expensive.
Banks with more than 100 branches have made reasonable headway in BI and predictive analytics, but the benefits are confined to a small user group; deployment is not bankwide, as the cost is too high. As a result, not everyone is seeing the same data.
Big banks address the problem by throwing big bucks at it – no surprise there. Most multinational banks have almost every possible BI and analytics tool deployed at the enterprise level or within particular units. Vendors love these banks and most BI and analytics service providers, big or small, would rank such large banks among their top customers. To add to this, big banks hire armies of stat and math experts (only doctorates or masters may apply!). A large UK-based global bank, for example, has a team of approximately 250 such experts providing stat and math analytics for credit cards alone.
Essentially, big banks get the job done by buying a multitude of tools, hiring contractors to build the required BI and analytics, and then setting up large internal teams for stat and math analytics. This generally, though not always, gets the job done. One large bank we know of spends about $20 million annually on a regional data warehouse and analytics program. Three years into the program, the regional CEO still could not get a consistent view of product profit across the few countries that made up his region.
Spending the right amount of money is important, but even more important is getting the right intellectual property. This may come with hiring the right contractors or consultants, but it is generally the biggest challenge to a successful BI and analytics program.
The Small Bank Challenge
Based on our extensive interaction with smaller banks, what is immediately clear is that these institutions are in danger of simply aping larger banks in how they deploy BI and analytics within their organizations. They should note, however, that the same approach with a smaller spend will deliver smaller results. And smaller banks cannot get away with smaller spends: if it costs $10,000 to develop a customer dashboard, then that is what it will cost whether you have a hundred thousand customers or millions.
The crux of the matter, then, is: can BI and analytics be done differently? Smaller and poorer countries leapfrogged mainframes and legacy computers and directly deployed modern-day servers and desktops. Can small and mid-size banks similarly leapfrog their existing limitations in BI and analytics and compete with big banks in terms of results rather than spend? Clearly, incremental changes and minor cost cutting are not going to help. The entire approach to developing and deploying BI has to change. Only then can small and mid-size banks match the capabilities of the larger banks.
To identify such a different approach, let’s dissect BI and analytics capability into its constituent parts. First, there is the underlying software. Excel clearly does not qualify as an enterprise BI tool, but there are tools that provide enterprise capability and “best of need” as against “best of breed” capability. Such software forms the first building block of our new and different approach but represents only a small part of the overall expense.
A larger chunk of the spend goes to high-cost contractors or internal staff who develop and deploy BI and analytics within the bank, including the extremely high-end stat and math analysts who build advanced predictive models. The biggest cost, however, is the "time to intelligence." Acquiring a report or dashboard today can yield business gains or savings far greater than getting the same report or dashboard a few months later. Does a new approach exist that substantially reduces this effort and time? Does every bank need to build the same or similar reports? Does every bank need armies of statisticians to tinker with their data in similar ways?
With most other systems, from core banking to cards or from originations to collections, banks have been using one of a small set of commodity software solutions. If the front end or the data generation end of a bank is run by commodity software, then why must the back end or data analysis end be built uniquely and separately by each bank?
Packaged BI and analytics solutions constitute the new approach. These solutions are built by expert teams and can be deployed within weeks rather than months, since they include pre-built enterprise functionality. And the cost benefit increases further if these solutions can run on low-cost commodity software.
Time may be a great leveller but so are BI and analytics solutions in the banking world.
Mr. Anand is the managing director of New York City-based Cedar Management Consulting International LLC, a global consulting, advisory and analytics firm, and can be reached at [email protected]. Mr. Shidhaye is the director of product development at Cypress Analytics, Cedar’s business intelligence and analytics unit, and can be reached at [email protected].