CFO IT welcomes your letters. Send them to: The Editor, CFO IT, 253 Summer St., Boston, MA 02210.
E-mail us at Scott@cfo.com. You can also contact a specific author by clicking on his or her byline at the beginning of any article.
Please include your full name, title, company name, address, and telephone number. Letters are subject to editing for clarity and length.
Your article “Spreadsheet Hell?” (Summer) overlooked the most important issue: flexibility. The specialized financial-software vendors can yap all they want about how easy it is to change things in their systems, but that ease comes at the cost of doing everything within the framework they have provided. Spreadsheets make it nearly impossible to make major alterations to a large, complex model, but if you’re starting from scratch or building a prototype, you have unlimited freedom to set things up the way you want. In using dedicated systems, I am constantly running into walls where the designers simply didn’t build in the tools needed to do the job. A quick dump to a spreadsheet, and I’m off and running.
It is impossible for software vendors to anticipate all the analytical needs of the companies they serve. That’s why there will always be a role for spreadsheets. It is, in fact, the same role they have always held and the one for which they have always been most appropriate. If spreadsheets appear to have extended into areas where they should never have been used, that only reflects the degree to which dedicated packages have lagged behind user needs. You will see that same equilibrium shift back and forth in the future, and dedicated software vendors would be wise to watch it. But to talk about spreadsheets being obsolete is pure drivel. It misses the point.
I read your article concerning spreadsheets and think it is truly a dark day when CFOs are reluctant to implement time-saving technology because of concerns over Sarbanes-Oxley. If anything, CFOs need to press forward at breakneck speed to untangle the mess that many of us have created by relying so heavily on inadequate technology — primarily spreadsheets, which have so many inherent problems in maintaining data integrity. The spreadsheet “crutch” has evolved with the proliferation of companywide “technologies du jour,” which have created a tangle of disparate systems that makes the CFO’s life a nightmare as we try to make sense of it all and tie it back together for financial reporting.
While most of your articles showcase large corporations, I feel especially fortunate to be implementing time-saving technology in a relatively small company environment. As a CFO, I have developed technology that has driven financial-reporting efficiency in an industry that has been highly regulated for years (community banking), and in which producing timely and accurate reports was of paramount importance long before Sarbanes-Oxley.
Peter W. Minford
Executive Vice President and CFO
First Bank of Idaho
Sun Valley Bancorp
Managing Member, Data Informatics LLC
Revenue Is What Matters
Regarding Nick Carr’s continuing rant on the questionable value-add of IT (“The Tract of the Matter,” Summer): if he were really cold-eyed and clearheaded, he would actually calculate the amount of revenue to allocate to IT, along with all the other corporate resources, and then make the argument about how much value IT actually adds. Alas, because he has no way of allocating revenue to IT, he cannot help us resolve this important question. In fact, almost none of the ROI-on-IT approaches your magazine routinely covers tackles the issue of estimating, by some objective and verifiable method, the amount of revenue that should be allocated to IT. To do so, they would have to estimate the amount of revenue to allocate to all corporate resources.
Wouldn’t it be nice if we could once and for all resolve the issue of how much value each corporate resource adds by estimating how much revenue to allocate to it in proportion to its contribution — again using some objective and verifiable allocation method? The accounting profession has abandoned us in this quest, since it doesn’t believe it is even possible to allocate revenue inside corporate boundaries. My suggestion is that we look outside the accounting profession for now to find our answers, and then invite the accountants to the party and ask them to help resolve any issues with the revenue-allocation method. Until this issue is resolved, we will continue to mud-wrestle in this subjective IT-value quagmire.
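The mechanics of the allocation the letter envisions are trivial; the hard, unsolved part is producing the contribution weights objectively. A minimal sketch, with the weights purely invented for illustration:

```python
# Hypothetical sketch: allocate total revenue to corporate resources in
# proportion to assumed contribution weights. Deriving the weights
# objectively is the open problem the letter describes.
def allocate_revenue(total_revenue, contribution_weights):
    """Split total_revenue across resources proportionally to weights."""
    total_weight = sum(contribution_weights.values())
    return {resource: total_revenue * w / total_weight
            for resource, w in contribution_weights.items()}

# Illustrative numbers only -- not an objective method.
weights = {"IT": 0.15, "sales": 0.40, "operations": 0.30, "finance": 0.15}
shares = allocate_revenue(1_000_000, weights)
print(shares["IT"])  # 150000.0 -- IT's credited share under these weights
```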
Naval Postgraduate School
Déjà Vu All Over Again
Congratulations to American enterprise. According to your cover story (“The Meter System,” Summer), it has rediscovered the remote service bureau and time-sharing computers. Think Infonet at Computer Sciences Corp. in the ’60s.
Andrew H. Olson
TEAM International Group
Utility Computing Is a Numbers Game
Norm Alster’s excellent cover story sets the right tone for a review of utility computing with examples from the venerable history of the IT hype machine. In these volatile economic times, converting IT costs from fixed to variable sounds like a winner, but the reality is more complex.
Decades ago, in the premicroprocessor mainframe days, utility computing (in the sense of time-sharing on a mainframe) was the only game available to many companies because of system costs and the limited number of people who could operate or program such systems. Today the motivation for utility computing (in the sense of running applications remotely on someone else’s computer system) has more to do with excess computer capacity (especially in higher-end servers) and the differential margins on servers: larger servers command higher margins than PC-based systems, but few companies need the computing capacities of the higher-end machines or are willing to pay for them. Hence, large service-oriented vendors (IBM, HP) are promoting the concept of pay-as-you-go computing both to prevent margin erosion and to capture market share.
Grid and utility computing are not identical. Grid computing involves distributing the load for a computational problem to multiple independent computer systems on a network (internally, externally, or both); the effect is to pool the resources of many machines to solve a problem (strength in numbers and all that). Grid computing was inspired by the needs of various national laboratories, such as Argonne and others: they wanted (and want) to attack “grand challenge” problems (e.g., protein folding) without each lab having to buy its own mammoth systems. In that sense, grid computing is both a funding opportunity and an interesting technical challenge. In the business world, grid computing is a response to the realization that all those PCs idling on desktops (especially at night) are an underutilized asset that with decent system software can reduce IT upgrade costs.
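The grid pattern described above — partitioning one problem into independent chunks and farming them out to many machines — can be sketched in miniature. A process pool on one machine stands in for the networked nodes of a real grid; the problem (counting primes) and all parameters are illustrative:

```python
# Minimal sketch of the grid-computing pattern: split a computational
# problem into independent chunks and distribute them to workers.
# Real grids spread chunks across machines on a network; a local
# process pool illustrates the same divide-and-pool structure.
from multiprocessing import Pool

def count_primes(interval):
    """Worker task: count primes in [lo, hi) -- an independent chunk."""
    lo, hi = interval

    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Partition the problem; each chunk could run on a different node.
    chunks = [(i, i + 25_000) for i in range(2, 100_002, 25_000)]
    with Pool(4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # combined result from all workers
```

The business-grid case in the letter — harvesting idle desktop cycles at night — is the same pattern with the pool replaced by a scheduler that ships chunks to underutilized PCs.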
One key problem for utility or grid computing is the cost of moving large amounts of data over a wide-area network. If large volumes of data must be moved over a WAN with fast response times, then the price of even minimally adequate network capacity will be high, perhaps exceeding the cost of the computer system itself over a couple of years. Vendors of utility computing that rely on high-volume data traffic over a WAN must factor in these costs, including how variation in the load can affect pricing. Unfortunately, competition in WAN service is not as fierce as that among computer-system vendors.
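The claim that WAN charges can exceed the cost of the computer system over a couple of years is easy to check with back-of-envelope arithmetic. All figures below are invented assumptions, not quotes:

```python
# Back-of-envelope comparison: cumulative WAN data-movement cost vs.
# buying a server outright. Every number here is an illustrative
# assumption, not a real price.
def wan_cost_over_period(gb_per_month, price_per_gb, months):
    """Total cost of shipping data over a WAN for `months` months."""
    return gb_per_month * price_per_gb * months

server_cost = 50_000       # assumed purchase price of a mid-range server
monthly_volume_gb = 2_000  # assumed data shipped to the remote data center
wan_price_per_gb = 1.25    # assumed per-GB transfer price

two_year_wan = wan_cost_over_period(monthly_volume_gb, wan_price_per_gb, 24)
print(two_year_wan)               # 60000.0
print(two_year_wan > server_cost)  # True: WAN alone outruns the server
```

Under these (hypothetical) assumptions the network bill alone exceeds the hardware in two years, which is the letter's point: the transfer costs must be in the vendor's model, not left as a footnote.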
Interest in utility computing is motivated by the costs of the seemingly constant need to adapt one’s IT infrastructure and applications to changing business processes. Yet outsourcing one’s applications to a vendor’s remote data center usually means sharing standard applications on its servers. In most cases, the vendor cannot or will not customize the application to meet individual customers’ business-process needs. This lack of adaptability is an ironic limitation of utility computing, since nearly every package needs to be customized for the sake of organizational efficiency and competitive advantage.
Before opting for utility computing, consider the trend in computer technology. Over the past two decades, the price/performance of computer systems and components has improved dramatically (especially for PCs), leaving the world awash in computing capacity for most business applications. The shrinking physical size of systems has reduced space and HVAC costs as well. Application software is increasingly powerful, systems are easier to manage, and there are many more knowledgeable IT people than there were two decades ago. CFOs need to factor the price/performance curve into any evaluation of the possible cost savings from utility computing, asking whether their computational needs are rising faster than the curve is improving. If so, then utility options should be carefully examined, especially the service-level agreement. Often outright ownership is the more cost-effective route.
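The closing test — is demand outgrowing the price/performance curve? — can be made concrete with a toy model. The growth rates below are invented for illustration:

```python
# Toy model of the letter's decision rule: if computing demand grows
# faster than price/performance improves, the effective annual cost of
# meeting demand with owned hardware rises, and utility options deserve
# a closer look. Growth rates are illustrative assumptions.
def effective_cost_trend(demand_growth, perf_improvement, years):
    """Relative yearly cost of meeting demand by buying hardware.

    demand_growth:    yearly growth in required compute (0.40 = 40%)
    perf_improvement: yearly price/performance gain     (0.30 = 30%)
    Returns one cost factor per year, normalized to year 0 = 1.0.
    """
    ratio = (1 + demand_growth) / (1 + perf_improvement)
    return [ratio ** y for y in range(years)]

# Demand growing 40%/yr against 30%/yr price/performance gains:
trend = effective_cost_trend(0.40, 0.30, 5)
rising = all(b > a for a, b in zip(trend, trend[1:]))
print(rising)  # True: costs climb, so examine utility options (and SLAs)
```

When the inequality flips — say 20% demand growth against the same 30% improvement — the trend falls instead, supporting the letter's conclusion that outright ownership is often the more cost-effective route.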