Allocating IT Costs: Can Simpler Be Better?

Occam’s Razor: Assumptions introduced to explain a thing must not be multiplied beyond necessity. (Sometimes construed as, “The right answer is usually the simplest answer.”)

It is a growing headache for the conscientious chief information officer (CIO) – how to fairly allocate Information Technology (IT) costs across the lines of business. Today, when the business lines are struggling to compensate for lost fees and new compliance burdens, they are closely scrutinizing every expense, including the bill from IT. A lot is riding on that bill: it affects the businesses’ profits and how many IT initiatives they can afford.

If the businesses rarely like what they see on their IT bill, it is not for lack of IT effort. Many IT groups sink serious resources into good-faith efforts to create a fair and accurate cost-allocation model. At large banks, teams of skilled cost accountants survey business heads on their views, create assumptions, carefully vet them with the businesses, and then build sophisticated models accordingly.

They might base assumptions on how many central processing units (CPUs) a business line uses, or on its number of applications, products, or beneficiaries, its square footage, or the time spent on its behalf – or combinations thereof. There is no shortage of creativity and thoughtfulness that goes into these calculations – and no shortage of personnel and computer expense.

But, as a chief financial officer friend says, “One man’s allocation is another man’s allegation.” No matter how it gets done, when the allocations are made, nobody is happy. Why is that?

Change. No matter how much precision the cost accountants build into their models, they can only be accurate for the point in time at which the assumptions were formed. Then prices go up. A big project wraps up in one business, a major conversion kicks off in another and a new regulation takes effect for a third. Or an acquisition happens. IT resources shift, but the allocation model remains unchanged until the next scheduled adjustment, whether months or a year out. What was fair one day is now patently unfair. Waiting even three months leaves a long gap during which the model steadily decays in accuracy.

Overhead. Overhead costs, not attributed to any specific line of business, are a special headache. These costs can be significant; top IT executives might work across the whole IT organization, but probably not equally across the businesses. In any given year, there is going to be one line of business, or two, or three, but not all, that dominates their focus. The lines of business that are not getting the attention generally know it, and top executives are not going to “punch a clock.” So no matter how many negotiations are held to arrive at the overhead allocations, there remains a perception of inequity, and often the reality.

Extrapolations. No matter how many variables the cost accountants capture, or how much precision they strive for, they can’t account for every hour and every expenditure every day. Invariably, they have to extrapolate. They have to decide, when they populate the model, which business line will pay for each IT resource. If there is an IT team devoted to a project for wealth management, they can allocate the team to wealth management for the year. Or they can estimate the project’s timeframe and allocate the team to wealth management for that period only. But then where to allocate the team for the rest of the year? Such dilemmas lead to judgment calls – each one reasonable but each one another step away from absolute precision. Multiply judgment calls like that by hundreds or thousands of resources and trust in the numbers diminishes.

Complexity. No matter how solicitously the cost accountants involve the business people in creating the model, or how carefully they document their assumptions, the results are still dense and complex. It is a stretch to expect that business heads will find the time and insight to penetrate the costing complexity and fully appreciate what they are seeing. That translates to an apparent lack of transparency, another source of mistrust.

Effort/Cost. Real-time cost allocation would solve many of those issues but exacerbate another one – the high cost of precision. It would mean keeping the costing groups employed almost full-time in vetting and verifying how the IT resources are being used. When does the expense of perfecting the model outweigh the benefit? These skills don’t come cheap, and the efforts absorb a great deal of time from the people they must interface with. The costs of these efforts are hard to quantify, but it is safe to say that diminishing returns set in at some point.

So if today’s models are unsatisfactory despite dedicated effort and sizeable expense, is there a better way?

Consider a simplified hybrid approach. Allocate directly to each line of business all costs directly attributable to them. Then, allocate the remaining costs (those not directly attributable to a line of business) across all lines of business based on headcount. That is, if mortgage lending’s 100 people are a tenth of the total headcount that IT supports, mortgage lending pays a tenth of IT’s costs not already allocated directly to specific lines of business.
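The mechanics of the hybrid approach can be sketched in a few lines of code. This is an illustrative model only – the line-of-business names and dollar figures below are hypothetical, not actual bank data.

```python
# Sketch of the simplified hybrid allocation: directly attributable
# costs go straight to each line of business (LOB); the remaining
# shared pool is split across all LOBs by headcount share.

def allocate_it_costs(total_it_cost, direct_costs, headcounts):
    """Return each LOB's IT bill: its direct costs plus its
    headcount-proportional share of the remaining shared pool."""
    shared_pool = total_it_cost - sum(direct_costs.values())
    total_heads = sum(headcounts.values())
    return {
        lob: direct_costs.get(lob, 0) + shared_pool * heads / total_heads
        for lob, heads in headcounts.items()
    }

# Hypothetical example: $10M total IT cost, of which $4M is
# directly attributable to specific lines of business.
direct = {"mortgage": 1_500_000, "wealth": 2_500_000}
heads = {"mortgage": 100, "wealth": 400, "retail": 500}  # 1,000 total

bills = allocate_it_costs(10_000_000, direct, heads)
# Mortgage's 100 people are a tenth of total headcount, so it pays
# its $1.5M direct costs plus a tenth of the $6M shared pool: $2.1M.
```

Note that the allocations always sum back to the total IT cost, which is part of what makes the method easy to verify.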

Granted, it lacks the precision of the (unattainable) perfect model, but it has compensating virtues.

It is transparent. The basis for the allocation to each line of business is out there, visible not just to IT and the receiving line of business but to the other businesses.

It is easily measured. No small army of cost accountants has to patrol the halls, counting resources, making assumptions, verifying them, and repeating the process every reporting cycle.

It is easily verified. No tough negotiations are needed, consuming the time of executives trying to protect their margins and get the best deal for their businesses. Everybody trusts it; everybody believes it.

It is controllable. Barring a big change in its headcount, each line of business can be comfortable that its allocation will be consistent with its headcount. If IT costs go up, the allocation goes up; if IT costs fall, so does the allocation.

It is comparable. If a business’s headcount stayed flat from one period to the next but its IT costs suddenly rose in the second period, the questions for IT are fairly straightforward.

It is fast. No more long delays waiting for the data to come in, the allocations to be made, and the reports to be distributed before the period can be assessed.

Transitioning to this new methodology can pose some challenges. It is crucial to take the time to cleanse the current cost data and to identify and vet the direct costs you intend to allocate to individual lines of business. It is also wise to get buy-in for what is a cultural change, not just an accounting change, and to understand compensation-plan impacts, test real-time modeling feeds, and time the implementation carefully.

But just imagine taking back all the time, energy, and expense of today’s process and devoting it instead to strategic initiatives that have clear payoffs. Don’t let perfect be the enemy of good. Consider adopting a believable, measurable, controllable, real-time, simple methodology that competently serves the greater purpose.

Mr. Donnelli is a principal with ABeam Consulting USA. He can be reached at [email protected].