Keep IT Simple, Stupid

As new software and IT services increase in complexity, the need for a unifying strategy is more important than ever.

Ask Philip Scorgie what the most significant IT improvement for his business has been in the past few years, and the answer won’t be newfangled wireless technologies or obscure EDI solutions. It will be simply that, for the first time, everyone at his company is reading from the same script. Scorgie is the chief information officer for Deacons, an Asian law firm employing about 650 people. Over the past four years, Scorgie has transformed how Deacons’s large Hong Kong-based practice shares information and manages its technology resources — by standardizing its IT infrastructure.

From a chaotic setup in which crucial legal information was scattered among incompatible databases and every PC was hand-built to different specifications, Scorgie shifted the entire practice onto a unified computing environment on near-identical machines from a single vendor. The result is a system that today is both easier to manage and vastly more stable; with everything configured the same way, crashes are easier to avoid. Upgrades are simpler because Scorgie no longer has to order from different vendors, schedule and download software patches from multiple sites, or process dozens of bills. And lawyers can at last easily access information they need from their desktops.

Soon the transformation will be complete. After 12 months of work, Microsoft SQL Server will replace a legacy practice management system at the end of 2002. Scorgie says the previous Unix-based system was arguably more reliable, but the lower maintenance cost and better compatibility of the new system far outweigh any disadvantages. “The bottom line is that integrated products always offer higher ultimate reliability because there’s less complexity to manage,” says Scorgie.

Raising the Flag

Those four words — less complexity to manage — could be a rallying cry for CFOs under pressure to get more bang for their IT buck at a time when corporate computing resources are being stretched in ever more directions. Integrating data, applications, and business processes seamlessly and cheaply has long been a core concern for IT managers, as well as for CFOs who rightly question why they should sign off on the umpteenth proprietary system that may require thousands of developer hours to implement.

But as globalization and the growth of E-business continue to increase the IT diversity most companies must manage, the need for some sort of unifying framework is more important than ever. And harder to achieve. “There’s a tremendous challenge for CIOs to reduce costs and boost productivity,” says Danny Tam, general manager for Sun Microsystems in Hong Kong. “That boils down to people looking for consolidation.”

Nigel Lee, Asia Pacific director of consulting services for US-based IT services giant EDS, has noticed increased interest in standardization from IT managers who binged on technology during the Internet stampede. Now, in the tougher economic climate, those managers are under pressure from CFOs to rationalize the diversity of applications built up during the boom cycle.

For CFOs weighing integration strategies, the decision to standardize broadly comes down to a choice between two opposing schools of thought. On the one hand, there’s the “best-of-breed” approach, in which you select the systems best suited to individual tasks and then figure out how to get them to talk to each other. Standardizing, however, means jettisoning this approach in favor of end-to-end, “total solutions” that skirt the integration issue through unified components from a strictly limited number of vendors. Both approaches have their advantages, and which is right will depend on a company’s particular circumstances, the complexity of its needs, and the extent to which it is wedded to legacy infrastructure.
