
Drowning in Data

A flood of corporate data, intensified by Sarbanes-Oxley compliance, threatens to overwhelm business managers.

Eventually that will lead many companies down the path of automation. As Watson notes, merely backing up everything won’t cut it: “You need auto-indexing, and you need rules and parameters for the indexing.”

A ray of hope for technology vendors? Possibly. Some finance managers say, yes, they’ll likely be more inclined to purchase compliance software in 2004—once they’re through with all their documenting, that is. Says Naughton: “I don’t want to do this every year.”

John Goff is a senior editor at CFO.

White Goods for Data?

When Section 409 of the Sarbanes-Oxley Act of 2002 kicks in for real—in January 2004—many companies will have to report material events to the Securities and Exchange Commission within 48 hours. Meeting the short deadline could prove to be a bear, particularly for businesses that plan to examine financial data to determine if an event qualifies as material.

The snag: data analysis is still anything but real time. Data warehouses, which nowadays house terabytes of information, are rarely updated on a daily basis. What’s more, the traditional architecture for warehouses—a patchwork of various drives, servers, and software—is best suited for backward-looking, slow-cooking sorts of analysis. The large amount of data movement in a typical warehouse limits the slice of information that can be accessed in a single search (usually about 1 percent of the available data). To get a fuller view, users must engage in repeated queries. Says one CEO: “You go back and forth, back and forth.”

Appliances to the Rescue

Managers who need to perform ad hoc queries—and need the results now—are generally out of luck. Notes Dan Vesset, a research manager at technology consultancy IDC: “Speed of decision-making, whether we’re talking about real time or near-real time, is still only a goal for most organizations.”

This may be changing, however. A new device, called a data appliance, could radically alter the time it takes to analyze data. Built from the ground up as a dedicated storage, retrieval, and analytics system, a data appliance is an all-in-one machine. Since server, storage, and software are integrated at the lowest level, there’s less movement of data. The result? A 10-to-50-times improvement in performance for products from data-appliance maker Netezza Corp., claims Jit Saxena, CEO of the Framingham, Massachusetts-based company.

Netezza sells five data-appliance models, ranging in price from under $1 million to $2.5 million. The basic unit, a rack, can store up to 4.5 terabytes of data. To increase capacity, customers simply buy additional racks. As for the vendor’s performance claims, consider Wakefield, Massachusetts-based Epsilon, which hosts data for financial-services companies and others and recently installed a Netezza data appliance. Mike Coakley, Epsilon vice president of marketing technology, recalls the benchmarking the company performed on the device before making a purchase. “We tested load times, queries, summarizations,” he says. “The results were astronomical—borderline ridiculous.”

Coakley claims the data appliance has cut load times at Epsilon from 11 hours to 3. Complex SAS queries on an Oracle database, he notes, used to take 2 hours; now they take 15 minutes. Says Coakley: “This is a real shift.” —J.G.
