Business intelligence (BI) is worth little if it is not applied systematically to improve performance. This is the message from BearingPoint’s Australian CEO, Andy Robertson. He told a Wellington audience last week that BI is morphing into a “corporate performance management [tool]”.
Performance management is “the process of managing strategy through an integrated system of improvement methodologies, with the aid of technology,” says Robertson.
Robertson was being hosted by SAS, so a company message about SAS’s software was also apparent. SAS has been moving away from pure statistical analysis to more sophisticated business performance management, both in data mining and in OLAP multidimensional analysis. Such analysis addresses questions like: which of our functions performed well, or badly, in which geographic region, in which month, and for what reasons?
Robertson, however, concentrated on higher-level issues in his talk. These included planning a strategy and business targets, with identification of key performance indicators. He explained how, after this, operational plans and budgets are drawn up; performance against the indicators is tracked (throughout a designated period) and business reviews are then conducted. The resulting information is then fed back, as amendments, into the strategy.
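The cycle Robertson describes — set targets and indicators, track performance over a period, review, then feed the results back into the strategy — can be sketched in code. This is a minimal illustration with invented KPI names and an invented adjustment rule, not any vendor's method:

```python
# A minimal sketch of the plan -> track -> review -> feed-back cycle.
# All names and figures here are illustrative, not from any real product.

def review_period(targets, actuals):
    """Compare tracked indicators against their targets for one period."""
    return {kpi: actuals.get(kpi, 0.0) - target
            for kpi, target in targets.items()}

def feed_back(targets, variances, adjustment=0.5):
    """Amend next period's targets using this period's variances."""
    return {kpi: target + adjustment * variances[kpi]
            for kpi, target in targets.items()}

# One pass through the cycle.
targets = {"revenue_m": 10.0, "staff_turnover_pct": 12.0}
actuals = {"revenue_m": 11.0, "staff_turnover_pct": 15.0}

variances = review_period(targets, actuals)
next_targets = feed_back(targets, variances)
```

The point of the sketch is the last line: review output is not filed away but becomes an amendment to the next period's plan.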
Current BI software is often under-used or used in the wrong way, says Robertson. The chief financial officer is confined to a bean-counting role, just looking at the numbers, rather than acting as a strategist who sees them in context, and so can fix problems as they arise and devise new strategies.
The software is also often lacking, he says. “Typical programs don’t address data integration or data quality.” Multiple point-to-point interfaces typically develop, with each department correcting the data errors that pertain to its own function. This produces multiple, disparate and even contradictory data-sets.
The first step in designing a corporate performance management system is to funnel all the data into one common gateway. It is then cross-checked and cleaned up. “[This way] data profiling and cleansing is done consistently and [just] once,” says Robertson. The departments then draw on this central repository. Data quality forums should discuss the treatment of data and also set “quality bars” below which input will not be accepted, he says.
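The single-gateway idea — profile, cleanse and quality-check each record once, centrally, and reject anything below the quality bar — might look like the following sketch. The field names and threshold are hypothetical, not BearingPoint's or SAS's design:

```python
# Sketch of a common data gateway: every record is cleansed and scored
# once, centrally; records scoring below the "quality bar" are rejected.

QUALITY_BAR = 0.8  # illustrative threshold, as set by a data-quality forum

def quality_score(record):
    """Fraction of required fields that are present and non-empty."""
    required = ("customer_id", "region", "amount")
    filled = sum(1 for f in required if record.get(f) not in (None, ""))
    return filled / len(required)

def gateway(records):
    """Cleanse incoming records and admit or reject each one."""
    repository, rejected = [], []
    for rec in records:
        cleaned = {k: v.strip() if isinstance(v, str) else v
                   for k, v in rec.items()}
        if quality_score(cleaned) >= QUALITY_BAR:
            repository.append(cleaned)
        else:
            rejected.append(cleaned)
    return repository, rejected

repo, bad = gateway([
    {"customer_id": "C1", "region": " north ", "amount": 120.0},
    {"customer_id": "C2", "region": "", "amount": None},  # fails the bar
])
```

Departments then draw only on `repo`, so profiling and cleansing happen consistently and just once, as Robertson suggests.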
“Reliable-enough” data, when viewed in an integrated way, often suggests what-if experiments. For instance, these could involve changing a price or an ingredient to see what effect it has on returns, or exploring the results that can be obtained by increasing selling time with the customer.
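A what-if experiment of the kind Robertson mentions — vary a price and watch the effect on returns — might be sketched as below. The linear demand model and all the numbers are invented purely for illustration:

```python
# What-if sketch: vary price, observe the projected return.
# The demand model and every figure here are illustrative only.

def projected_return(price, unit_cost=4.0, base_demand=1000, sensitivity=50):
    """Return = margin x units sold, with demand falling as price rises."""
    units = max(0, base_demand - sensitivity * price)
    return (price - unit_cost) * units

# Try several candidate prices and pick the best performer.
scenarios = {price: projected_return(price) for price in (6.0, 8.0, 10.0, 12.0)}
best_price = max(scenarios, key=scenarios.get)
```

With clean, integrated data in place, such experiments are cheap to run and easy to trust; with contradictory departmental data-sets, the same exercise is meaningless.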
SAS’s Adam Stott asked why, when corporate performance management was defined in the early 1990s, the same issues still keep coming up.
Stott identified one issue as over-reliance on Excel. Spreadsheets get passed around, columns and rows added, and formulae changed, until no one has a clear idea of how the spreadsheet works, and whether there are any errors in it.
Centralised quality control is, once again, what is needed here, he said.
ERP can also prove an expensive, unproductive investment, he says. “All it does is gather information.”
It’s usually good as a transactional system, but it is not an analytical tool, said Stott. He suggested that more attention be paid to non-financial information, such as measures of customer satisfaction, speed of response to queries and staff turnover.
If it takes too long to consolidate performance figures, they will be useless, he said. “How relevant is a nine-month-old budget?” Quicker processing enables the CFO to spend more time on the planning aspects of the business.
Another danger sign is when the CFO cannot have confidence in the figures. If two people are asked to prepare a report on the same subject and different figures result, it’s a sign of trouble.
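That warning sign can be checked mechanically: reports drawn from the same repository should reconcile. A hypothetical sketch, with invented report names and figures:

```python
# Sketch: reconcile two independently prepared reports on the same subject.
# A mismatch beyond tolerance flags the figures as untrustworthy.

def reconcile(report_a, report_b, tolerance=0.01):
    """Return the keys on which the two reports disagree."""
    keys = set(report_a) | set(report_b)
    return {k for k in keys
            if abs(report_a.get(k, 0.0) - report_b.get(k, 0.0)) > tolerance}

finance_view = {"q1_revenue_m": 10.2, "q2_revenue_m": 11.5}
sales_view = {"q1_revenue_m": 10.2, "q2_revenue_m": 12.1}

discrepancies = reconcile(finance_view, sales_view)  # {'q2_revenue_m'}
```

An empty result does not prove the figures are right, but a non-empty one is exactly the trouble sign Stott describes.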
Keep the simple user interfaces, like Excel, as the front-end, suggested Stott. But quality-controlled data and more sophisticated analysis should underlie those simple charts and graphs.