Getting value from grid computing

By David Braue
Wednesday, 05 November, 2003


Growing interest in grid computing will drive its rapid advancement as the technology moves from research darling to commercial reality, IBM's senior grid computing strategist, Rob Vrablik, predicted during a recent visit to Sydney.

As a concept, grid computing -- which links dozens, hundreds or thousands of computers to tackle the same computing task -- is well suited to the needs of researchers, who often need massive computing power to run complex simulations or models for short periods of time. Rather than buying and maintaining millions of dollars' worth of high-end computing equipment, access to a grid lets companies rent the number-crunching capabilities they need.

That's made it popular amongst a growing number of commercial customers, whom IBM sees as a potentially significant source of revenue. The company is currently running more than a dozen commercial trials with companies such as Charles Schwab. One application, for example, used grid technologies to churn through data on more than 1 million US households in order to match their spending habits against any of a thousand data models. The grid-based infrastructure let the application call upon a broader range of resources, completing the analysis far faster than if it had been limited to a single specific cluster.

"These subscribers can now get multiple iterations of these runs in the same time frame, and can fine-tune and change the parameters to better analyse the customers," says Vrablik. "We literally look around to find the unused capacity in the enterprise or campus; it's quite amazing when we paint this picture back to customers and say 'did you realise you had all this on-demand compute power?'"

Pharmaceutical giant Novartis got just such a surprise when it used Grid MetaProcessor -- a commercial iteration of the well-known United Devices (www.ud.com) smallpox and cancer cure grid software -- to harness the unused processing power of 2700 desktop PCs at its Basel, Switzerland headquarters. The result? Novartis got access to 5 teraflops (trillion floating point operations per second) of extra computing power that's now being applied to complex drug discovery problems.

Standards struggle

Advancing grid computing into the commercial world will do far more than create a new revenue source for IBM, however. Since grids link many different types of systems, the industry has struggled to provide a standard way for applications to communicate seamlessly with all of those systems.

The answer lies in the creation of an abstraction layer that lets end-user applications call the grid as an entity, then translates that call into instructions for each individual type of system. Vrablik calls this concept a "meta operating system" because it is a collection of abstract calls that can be used to drive specific systems. After a number of early attempts, the industry has coalesced around the Globus Toolkit, an open standards effort backed by the Global Grid Forum that recently reached a version 3 release.
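The idea of such a "meta operating system" can be illustrated with a minimal sketch: an abstraction layer accepts one generic job request, then translates it into the instructions each backend system type understands. All the names below (`GridBackend`, `submit_job` and so on) are hypothetical, invented for illustration; this is not the Globus Toolkit API.

```python
# Hypothetical sketch of a grid abstraction layer ("meta operating
# system"): one generic job call is translated into backend-specific
# instructions, and work goes to whichever system has spare capacity.

class GridBackend:
    """One type of system participating in the grid."""
    def __init__(self, name: str, free_cpus: int):
        self.name = name
        self.free_cpus = free_cpus

    def translate(self, job: dict) -> str:
        """Render the generic job in this system's own job language."""
        raise NotImplementedError

class LinuxClusterBackend(GridBackend):
    def translate(self, job: dict) -> str:
        # e.g. a batch-scheduler submission for a Linux cluster
        return f"qsub -l nodes={job['cpus']} {job['task']}"

class UnixServerBackend(GridBackend):
    def translate(self, job: dict) -> str:
        # e.g. a direct run command on a Unix server
        return f"run --cpus {job['cpus']} {job['task']}"

def submit_job(job: dict, backends: list[GridBackend]) -> tuple[str, str]:
    """The abstraction layer: the application calls the grid as a
    single entity; the layer finds unused capacity and emits the
    right instruction for that specific system."""
    for backend in backends:
        if backend.free_cpus >= job["cpus"]:
            return backend.name, backend.translate(job)
    raise RuntimeError("no spare capacity on the grid")

backends = [
    LinuxClusterBackend("xSeries-cluster", free_cpus=8),
    UnixServerBackend("pSeries-01", free_cpus=32),
]
# A 16-CPU job skips the busy cluster and lands on the Unix server.
name, command = submit_job({"task": "model.run", "cpus": 16}, backends)
```

The application never needs to know which system ran its job; that indirection is what lets heterogeneous machines appear as one pool of on-demand compute power.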

Providing a common target for applications will hasten adoption of grid technologies, in turn increasing investment in common-usage grids and helping technology leaders like IBM, Oracle, and Sun build grid awareness into their core products. Grid computing could eventually be built into everything from esoteric analytical applications to high-volume web servers, which could call upon grid resources to better handle unexpected peak loads.

IBM recently put its money where its mouth is with the announcement that it has been working with China's Ministry of Education on the China Education and Research Grid (CERG), which links 49 Linux-based IBM eServer xSeries systems and six Unix-based IBM pSeries servers across six universities. The system has already been used by China's Bioinformatics Grid System to help analyse the SARS virus.

When its first phase is completed in 2005, CERG will link nearly 100 universities across the country, providing six teraflops of computing capacity; this will grow to some 15 teraflops, putting the system among the world's top five supercomputers.

The backing of commercial giants will make such grids both common and accessible. As commercial-grade grid technology trickles back into the research community, the result could be a viable grid computing infrastructure that ultimately benefits small businesses struggling to gain access to the computing power they need. In Australia, GrangeNET has already linked up many universities and could be a potent driver for broader grid adoption.

"The requirements of Australian life science lend themselves very well to an uptake in grid technologies," says Vrablik. "There's a diverse community that is scattered geographically, and the community needs to access data and resources across multiple disciplines and locations. When these businesses can get access to commodity applications on a network fuelled by a grid, it will enable pay-by-the-drink computing."
