
Computing Power on Tap

A look at the most ambitious attempt yet to combine millions of computers around the world seamlessly: to make processing power available on demand anywhere, rather like electrical power.

In another guise, the virtual organisation could be an industrial consortium developing, say, a passenger jet. The Grid would allow the consortium to run simulations of various combinations of components from different manufacturers, while keeping the proprietary know-how associated with each component concealed from other consortium members. In both cases, several unrelated types of calculation, using independent data sets, and running on a range of computers in different organisations that may not fully trust one another, have to be threaded together to achieve a coherent answer.
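For the technically minded, the pattern can be boiled down to a toy sketch in Python (ours, with invented names and numbers, not a piece of real Grid software): each member runs its simulation behind its own walls, and only the results are threaded together.

```python
# A toy sketch, not real Grid middleware: every name and number below is
# invented for illustration. Each consortium member exposes only a
# simulation interface; its proprietary design data never leave its site.

from dataclasses import dataclass, field

@dataclass
class Member:
    """One manufacturer in the consortium."""
    name: str
    _design: dict = field(default_factory=dict, repr=False)  # proprietary, never shared

    def simulate_component(self, operating_point: float) -> float:
        # Runs on the member's own computers; only the result leaves the site.
        drag = self._design.get("drag_coefficient", 0.02)
        return drag * operating_point ** 2

def run_joint_simulation(members: list[Member], operating_point: float) -> float:
    """The 'virtual organisation': thread the partial results together
    without ever seeing the designs that produced them."""
    return sum(m.simulate_component(operating_point) for m in members)

wing_maker = Member("WingCo", {"drag_coefficient": 0.021})
engine_maker = Member("EngineCorp", {"drag_coefficient": 0.035})
print(run_joint_simulation([wing_maker, engine_maker], operating_point=250.0))
# prints 3500.0 (0.021 * 250**2 + 0.035 * 250**2)
```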

It is precisely because such undertakings would be a nightmare to negotiate over the Internet that the Grid has come into being. Unlike neatly defined science problems, which amaze more by their petabyte scale than their inherent complexity, virtual organisations are noteworthy because of their constantly shifting landscape of data and computer resources, and the authentication on which they rely. This is where the Grid would have its greatest value. It is such problems that could lead to new business models — just as the Internet created the conditions for e-commerce. Unfortunately, this is where the Grid’s developers have barely scratched the surface.

Timing Is All

The idea of distributed computing is as old as electronic computing. Multics, a time-sharing operating system for mainframes devised more than 35 years ago and finally retired last year (it is a distant ancestor, by way of Unix, of today's Linux), had many of the Grid's goals in its original mission statement (operation analogous to power services, system configurations that could be changed without having to reorganise the software, and so on). Other Grid precursors abound. The point is that the enthusiasm for the Grid today is not the result of some fresh and sudden insight into computer architecture. It is simply that the hardware has now improved enough to make an old idea work on a global scale.

Two hardware developments are bringing the Grid within reach. One is the rapid increase in network speeds. Today, a normal modem connection in the home has the same data-carrying capacity as the backbone of the Internet had in 1986. The latest version of the Internet protocol (IPv6), which governs how data are sent from one computer to another, will make it easier to standardise video conferencing and the remote operation of other computers, both core components of a Grid service.

Hardware providers are already anticipating the needs of the Grid in their planning. One example is Géant, a European network with transmission speeds measured in gigabits per second that is being built by a company called Dante in Cambridge, England. Such data networks will be the equivalent of the high-voltage power cables that criss-cross countries.

The second factor that is helping to make the Grid possible is the growth in power of individual microprocessors. This continues to follow Moore’s law, with processors doubling in power every 18 months. Today, even the humblest PC has enough spare processing power and storage capacity to handle the extra software baggage needed to run Grid applications locally. This is crucial, because no matter how clever the Grid may be, large amounts of computer code will still have to be transported to and from individual processors so that they have the tools to deal with any unpredictable task.
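The arithmetic behind that doubling is worth spelling out. A minimal sketch in Python (the 18-month figure comes from the paragraph above; the time horizons are merely illustrative):

```python
# The arithmetic of "doubling every 18 months", the only figure taken from
# the text above; the time horizons are arbitrary examples.

DOUBLING_PERIOD_MONTHS = 18

def power_multiple(years: float) -> float:
    """How many times more powerful a processor becomes after `years`."""
    doublings = years * 12 / DOUBLING_PERIOD_MONTHS
    return 2 ** doublings

for years in (1.5, 3, 5, 10):
    print(f"after {years:>4} years: about {power_multiple(years):.0f}x")
# after  1.5 years: about 2x
# after    3 years: about 4x
# after    5 years: about 10x
# after   10 years: about 102x
```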

When the Grid really takes off, it will render obsolete much of the computing world as it is today. Supercomputers will be the first to feel the pressure — much as networked PCs consigned mainframe computers to the basement. Ignore for a moment the pronouncements by industry leaders and government steering committees. The best thing about the Grid is that it is unstoppable: it is just too good an idea to remain dormant now that most of the enabling technology is in place.

Like the Internet, the Linux operating system and countless other open-source endeavours before it, the key breakthrough that will make the Grid an everyday tool will doubtless come not from some committee in Brussels, Geneva or Washington, but from some renegade programmer in Helsinki or Honolulu. When? It could be any time during the next decade — perhaps even next week.

Copyright © 2001 The Economist Newspaper and The Economist Group. All rights reserved.
