A number of surveys, including our own, have revealed the unsettling fact that many companies have bought more technology than they need. To grasp the magnitude of this waste, look no further than the powerful computer on your desk. What does it do most of the day? It waits…and waits…and waits. Even in those rare moments when you actually give it something to do—spell-check a document or recalculate a spreadsheet—you’re hardly pushing your PC to its limits. Now multiply that by the millions of computers around the world, many of them, like yours, sitting around doing mostly nothing.
Now imagine that all the unused processing power in these woefully underworked PCs could somehow be harnessed and shared over the Internet to run really big, taxing applications. That’s the idea behind a concept in IT known as grid computing.
Grid computing began as a way to build massively parallel processors from clusters of smaller computers, which was seen as a more-economical way to tackle compute-intensive problems than the handcrafted supercomputers used by researchers, academia, and a few large corporations. One well-known, if somewhat unconventional, grid-computing project is SETI@home, which started in 1996. SETI stands for Search for Extraterrestrial Intelligence, and the project relies on hundreds of thousands of PC users worldwide who donate a portion of their machines’ computing capacity to help search for signs of life in outer space.
Early grid-computing systems were custom-built to handle specific, compute-intensive tasks. But recently the development of a technical specification—called the Open Grid Services Architecture—has allowed software vendors to create business applications that can harness grid resources. Many of these applications are designed for specific vertical industries, including oil exploration, financial services, and semiconductor research.
A separate but related commercial application of grid computing involves distributing business applications across a company’s internal network. Essentially, a central software program scans the network for unused computing capacity. When it finds an idle machine, the software makes a copy of the application and the user data, then sends the copies to that machine to run.
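The scan-and-dispatch cycle described above can be sketched in a few lines. This is an illustrative simulation only: the machine names, the 20 percent CPU-load threshold for "idle," and the dispatch step are all assumptions for the example, not features of any real product.

```python
# Hypothetical sketch of a central program that scans a network for
# idle machines and sends each one a copy of an application plus data.

def find_idle_machines(machines, cpu_threshold=0.2):
    """Return the names of machines whose reported CPU load is below the threshold."""
    return [name for name, load in machines.items() if load < cpu_threshold]

def dispatch(application, user_data, machines):
    """Assign a copy of the application and user data to each idle machine (simulated)."""
    assignments = {}
    for name in find_idle_machines(machines):
        # In a real grid, this step would transfer binaries and data over the network.
        assignments[name] = (application, dict(user_data))
    return assignments

# Example: three desktops report their current CPU utilization (0.0 to 1.0).
network = {"desk-01": 0.05, "desk-02": 0.85, "desk-03": 0.10}
jobs = dispatch("risk-model", {"portfolio": "Q3"}, network)
print(sorted(jobs))  # the two idle machines receive work
```

Here only the two lightly loaded desktops receive copies; the busy one is skipped until the next scan.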
These new applications stretch the definition of grid computing and cause some confusion, to the point where “you could say grid computing is in the process of being reinvented,” says Forrester Research analyst Galen Schreck.
How It Works
Grid computing is akin to peer-to-peer networking, the same technology at the heart of the music-sharing controversy. A network of computers behaves as if it were a giant, parallel-processing supercomputer, allowing a single software application to dynamically harness the unused processing power of all computers on the network. Special software “looks” around the network to see which computers are free, sends each of them a piece of the job at hand, then collects the results of these various computations and assembles them into a coherent whole. This software is needed to create the grid, and the application itself must be tuned to run on it. And when grid computing uses the Internet as its network backbone, additional security software is needed to block unauthorized users.
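The split-distribute-assemble pattern just described can be sketched with a minimal scatter/gather example. Here local threads stand in for networked PCs, and the job is simply summing a list of numbers; the function names and the four-worker count are illustrative assumptions, not part of any grid product.

```python
# Minimal scatter/gather sketch of the grid pattern: split a job into
# pieces, hand each piece to a free "computer," then assemble the
# partial results into a coherent whole.

from concurrent.futures import ThreadPoolExecutor

def work_piece(chunk):
    """The piece of the job each worker runs: sum its slice of the data."""
    return sum(chunk)

def run_on_grid(data, n_workers=4):
    # Split the job into roughly one piece per worker.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # "Send" each piece to a worker; threads stand in for remote PCs.
    with ThreadPoolExecutor(max_workers=n_workers) as grid:
        partials = list(grid.map(work_piece, chunks))
    # Assemble the partial results into the final answer.
    return sum(partials)

print(run_on_grid(list(range(1, 101))))  # prints 5050
```

The same shape scales up: a real grid replaces the thread pool with scheduling software that tracks which machines are free and moves the pieces over the network.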