Data Mining in the Meltdown: The Last, Best Hope?

Demand for performance data is skyrocketing within organizations. Arthur Kordon, leader of the data mining and modeling group at the Dow Chemical Co., says his team has never been as inundated with requests as it has been during the economic crisis. “Executives are coming to us as sources of last hope to get answers,” Kordon says.

It’s not surprising: in times of economic uncertainty, corporate decision makers need every scrap of information they can get. The importance of data and analytics has also been underscored by the banking crisis. At their heart, banking’s problems were not about greed, excessive risk-taking, and lax regulation, but about data quality, says Thomas Redman, president of Navesink Consulting Group and a keynote speaker at “Predictive Analytics in Perilous Times,” a CFO conference held in San Francisco earlier this week.

“The data was wrong, or it simply wasn’t available. Bad data and the lack of transparency made it easier for the greed to go unchecked,” says Redman, a former Bell Labs quality specialist. In some cases, data was also mishandled, misused, or not used at all.

It behooves CFOs and other senior executives to head off such problems with a more proactive and aggressive strategy for capturing high-quality data and leveraging it in decision making, according to Redman. The consequences of not doing so: acting on data and analyses that should not be trusted, or failing to act on data and analyses that should be. Both come with high costs.

So where should CFOs aim their guns? While past data errors can be costly, management’s energy is better spent on preventing future errors, Redman says – an approach he estimates can cut the cost of poor data quality by one-half to two-thirds. “When you find root causes, you eliminate thousands of future errors at a stretch.”

From that starting point, there are many things CFOs can do to improve data accuracy. Two key tasks: establishing controls at all organizational levels to halt simple errors and formalizing management’s accountability for data quality.
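
As a minimal sketch of what such a control might look like in practice – the record fields, rules, and thresholds below are hypothetical illustrations, not drawn from the article – a front-line check can reject obviously bad records at the point of entry rather than cleaning them up downstream:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CustomerRecord:
    account_id: str
    statement_date: date
    balance: float

def validate_record(record: CustomerRecord) -> list[str]:
    """Return the control-rule violations for a record; an empty list means it passes."""
    violations = []
    if not record.account_id.strip():
        violations.append("missing account id")
    if record.statement_date > date.today():
        violations.append("statement dated in the future")
    if record.balance < 0:
        violations.append("negative balance flagged for manual review")
    return violations

# A record with several simple errors is stopped before it propagates.
suspect = CustomerRecord(account_id="", statement_date=date(2030, 1, 1), balance=-250.0)
for violation in validate_record(suspect):
    print(f"control failed: {violation}")
```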

The responsibility for data lies with the business, not the information-technology department, Redman stresses. That requires managers to measure data quality at the source – and in terms meaningful to business units. For a bank, a quality measure could be as simple as the percentage of customer statements that contain an error. For the U.S. Defense Logistics Agency, which supplies the Defense Department, it could be the percentage of weapons systems that ship on time to battlefield troops in Afghanistan.
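
The bank measure Redman describes is straightforward to compute once front-line controls tag each statement’s errors. A brief sketch, assuming a simple record layout whose field names are illustrative rather than from the article:

```python
def statement_error_rate(statements: list[dict]) -> float:
    """Percentage of customer statements containing at least one error."""
    if not statements:
        return 0.0
    flawed = sum(1 for s in statements if s["errors"])
    return 100.0 * flawed / len(statements)

# Illustrative records: each statement carries the errors found by upstream checks.
statements = [
    {"account": "A-101", "errors": []},
    {"account": "A-102", "errors": ["wrong mailing address"]},
    {"account": "A-103", "errors": []},
]
print(f"{statement_error_rate(statements):.1f}% of statements contain an error")
```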

Top-flight organizations also typically invest time to extract high-quality data from suppliers. “You must work back into your supply chain to make sure you know the quality of the ‘raw materials’ that go into things like financial products,” Redman says.

Striving for continuous improvement and setting aggressive error-rate targets are also habits of leading companies, as is leadership from a broad-based senior management team. What’s more, managers will support cleanup efforts if they see them as crucial to overall corporate strategy.
