Profit from Peer-to-Peer

Despite Napster's travails, some fledgling firms are out to sell the idea of peer-to-peer computing to large enterprises. They promise to use the computing architecture to empower workers, unleash their creativity and solve communication problems.

Another area where interaction software’s ability to deliver up-to-date information counts is searching the Internet. Typical search engines rely on indexes whose contents are at best 24 hours old, and even then they dredge up only a fraction of the information available on the Internet. OpenCola, and Infrasearch (a San Mateo firm recently acquired by Sun Microsystems), are developing the next generation of search engines. These will use peer-to-peer techniques to deliver more timely and comprehensive information for media groups and other large content owners.
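The gist of peer-to-peer search is that a query is passed from machine to machine and answered from whatever each machine holds at that moment, rather than from a central index compiled hours earlier. The toy flooding search below is one way to picture it; the Peer class, its fields and the hop limit are illustrative assumptions, not OpenCola’s or Infrasearch’s actual design.

```python
# Minimal sketch of peer-to-peer search by query flooding.
# Illustrative only: the Peer class, its fields and the hop limit (ttl)
# are assumptions, not either firm's real protocol.

class Peer:
    def __init__(self, name, documents):
        self.name = name
        self.documents = documents      # live, local content: always current
        self.neighbours = []            # directly connected peers

    def search(self, query, ttl=3, seen=None):
        """Return matches from this peer and, recursively, from peers up to ttl hops away."""
        seen = set() if seen is None else seen
        if self.name in seen or ttl < 0:
            return []
        seen.add(self.name)
        hits = [d for d in self.documents if query.lower() in d.lower()]
        for peer in self.neighbours:
            hits.extend(peer.search(query, ttl - 1, seen))
        return hits


# Tiny three-node network: a query reaches content the moment it exists,
# with no central index to go stale.
alice = Peer("alice", ["P2P market report, March 2001"])
bob = Peer("bob", ["Gnutella protocol notes"])
carol = Peer("carol", ["P2P search engines compared"])
alice.neighbours = [bob]
bob.neighbours = [alice, carol]
carol.neighbours = [bob]

print(alice.search("p2p"))   # finds alice's and carol's documents via bob
```

Real systems must also cope with peers joining and leaving, rank the results and stop queries from spreading indefinitely, which is where the hard engineering lies.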

Hardest Sell

Meanwhile, a number of young firms, including MojoNation of Mountain View, California, are working to create resource-utilisation programs that harness P2P’s ability to store files, distribute content and share the processing power of other machines. The goal is partly to cut spending on hardware such as storage and servers, but also to help manage traffic on the network. Of all the potential services offered by P2P, this could be the hardest sell. There may be too many problems associated with security and complexity — not to mention the plummeting cost of storage and servers — to make such peer-to-peer services practical. Such services also appear to be going up against the likes of Akamai, a firm whose caching technology helps speed the performance of websites. As the sceptics point out, offering savings on things that are only marginal costs anyway is hardly a viable business model.

Finally, there are the distributed computing services that deliver supercomputing power to companies that occasionally need massive number-crunching capacity but are unwilling to pay millions of dollars for it. Essentially, the technology being developed by firms such as United Devices of Austin, Texas; Entropia of San Diego, California; and Applied Meta of Cambridge, Massachusetts, breaks down large computations into small parcels that can be distributed among computers tethered to a network. Each PC works on its parcel in parallel and returns the result to a central computer, which assembles the parts into a whole.
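In miniature, the pattern looks something like the sketch below, which uses a local pool of worker processes as a stand-in for idle PCs scattered across a network; the workload (summing squares over a large range) and the function names are illustrative assumptions, not the actual technology of United Devices, Entropia or Applied Meta.

```python
# Minimal sketch of the scatter-compute-gather pattern described above.
# The workload and the use of a local process pool to stand in for
# networked PCs are illustrative assumptions.
from multiprocessing import Pool

def compute_parcel(bounds):
    """The work each 'PC' does on its small parcel of the problem."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

def split_into_parcels(n, parcels):
    """Break the big computation over [0, n) into roughly equal parcels."""
    step = n // parcels
    return [(i * step, n if i == parcels - 1 else (i + 1) * step)
            for i in range(parcels)]

if __name__ == "__main__":
    parcels = split_into_parcels(10_000_000, parcels=8)
    with Pool() as pool:                      # stand-in for idle PCs on a network
        partial_results = pool.map(compute_parcel, parcels)
    total = sum(partial_results)              # central computer reassembles the whole
    print(total)
```

The pattern is the same at any scale: split the job into independent parcels, compute them wherever spare cycles happen to be, and reassemble the partial results centrally.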

The process can be used, for instance, to farm out individual frames of digital animation to different PCs for simultaneous rendering, and then to recombine the rendered frames into a fluid sequence. Add up the thousands or even millions of computers that can be roped in to do such calculations, and the result is, in effect, a parallel supercomputer capable of many teraflops (trillions of floating-point operations per second) at a fraction of the cost of a dedicated machine such as IBM’s chess-playing champion, Deep Blue, or its forthcoming protein-folding colossus, Blue Gene.
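A back-of-the-envelope sum shows how the numbers stack up; the assumed figure of 100m floating-point operations per second per PC is a round number chosen for illustration, not one supplied by the firms.

```latex
% Illustrative arithmetic; the per-PC throughput is an assumption.
\[
  10^{6}\ \text{PCs} \;\times\; 10^{8}\ \text{flops per PC}
  \;=\; 10^{14}\ \text{flops}
  \;=\; 100\ \text{teraflops}
\]
```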

The supercomputing start-ups reckon that there are fortunes to be made from selling this kind of service to companies, for applications ranging from genetic research to Monte Carlo simulations in finance. Most of the firms in question are targeting problems that demand brute-force number-crunching, such as digital rendering, engineering design, pharmaceutical research and financial modelling.
