Grid promises infinite power

The scientists who brought you first the Internet and then the Web are now cooking up the "grid". This new technology for connecting up the world's computers will be millions of times more powerful than anything available today and far easier to use.

"Imagine being able to sit at your desk and access all the information in the whole world to answer your question, in the format you need. Not just a list of links to relevant websites or even selected online articles, but every piece of data ever collected on the subject, in a relevant, user-friendly way."

That vision of the next Internet generation comes from Britain's Particle Physics and Astronomy Research Council (PPARC), which is working with scientists from Cern, the European centre for particle physics research in Geneva, to take the first step towards a global grid.

The impetus comes from scientists who need to share previously unimaginable amounts of data between laboratories. The same need led to the original Internet, set up by the US Advanced Research Projects Agency in 1969, and to the World Wide Web, invented at Cern in 1990.

Cern's latest atom smasher, the $1.8 billion (£1.87 billion) Large Hadron Collider (LHC), due to come on-stream in 2005, will present the greatest computing challenge in history. If the LHC is to live up to its promise, delivering a new understanding of the universe, thousands of billions of bits of computer data will have to be analysed collectively by 5,000 scientists in 150 universities.

A new approach is needed and, after evaluating various computing architectures, Cern has settled on the grid.

American scientists, led by Ian Foster of the University of Chicago and Carl Kesselman of the University of Southern California, have been developing this concept over the past three years, and the US government is providing about $100 million a year to set up several experimental grids.

The grid is a high-speed network that links supercomputers, databases, specialised processors and personal machines. It differs from today's Internet mainly through its "middleware": programs that make collaborative computing much easier and more reliable.

"Just as the electrical grid enables you to plug in a device without worrying about power generation, the computing grid lets you extract information from anywhere in the world without knowing where it is," says Mr Jim Sadlier, who is co-ordinating PPARC's grid programme.

"When you formulate a question, software behind the wall will extract the information even if you have no idea where it resides. The tortuous process of searching on the Internet will disappear."

The British government's Office of Science and Technology hopes to obtain about £100 million in this year's comprehensive spending review to build Britain's first grid. Mr John Taylor, director-general of research councils, envisages three main e-science applications besides the Cern collaboration:

Astronomers will create a virtual observatory combining data from ground-based observatories, spacecraft and database archives;

Biologists will handle the data explosion following the completion of the human genome project;

Environmental scientists will model complex systems such as climate change.

A standardised grid, created for science, would quickly be adopted by industry, says Mr Chris Jones, head of technology transfer at Cern. He sees a close analogy today with the situation at the end of the 1980s, when the US National Science Foundation laid down a standard Internet protocol and then Tim Berners-Lee at Cern invented the Web, with hyperlinks between documents.

The Web took off commercially in 1993 when Mr Marc Andreessen, later a co-founder of Netscape, wrote a program that enabled untrained users to navigate by clicking on words or icons.

"Public funding seeded the creation of the Internet as we see it today and I think the time has come to spend more public money preparing for the next 10 years," says Mr Jones.

Mr Bert Dekkers, head of IBM's European Internet laboratory in the Netherlands, says industry is already taking an interest in grid architecture. One oil company, for example, is considering it as a way to distribute the analysis of seismic and geological data around the world.

"The distinction between scientific and commercial computing is starting to blur, particularly for applications that involve remote visualisation," he says.

A huge rise in the capacity of the communications infrastructure will be required to bring the benefits of the grid to the consumer market, but this is achievable: Mr Dekkers says mobile Net connections will be 300 times faster in 2003 than they are today.

A forerunner of the sort of distributed computing that is difficult to achieve today, but would become commonplace with the grid, is SETI@home. This enlists more than one million personal computers worldwide to process radio-telescope data, in the hope of picking out signals from an extraterrestrial civilisation.

The scientists at the University of California, Berkeley, who are masterminding the search for extraterrestrial intelligence, had trouble devising a program that would use the Net to divide up the data, send it out to volunteers and then get it back. With the grid, anyone with a problem that captures the world's imagination could command virtually infinite computer power.
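The pattern the Berkeley team had to build by hand divides into three steps: split the data into self-contained work units, hand them out, and gather the results. Here is a minimal sketch of that pattern in Python, with the chunk size, the queue and the toy `analyse` test all invented for illustration; it is not SETI@home's actual code, which runs as a screensaver client searching for narrow-band signals.

```python
# Hypothetical sketch of SETI@home-style work distribution: split the
# data, send units out to volunteers, gather the results. None of this
# is the project's real code; it only illustrates the pattern.
from queue import Queue

CHUNK = 1024  # bytes of radio-telescope data per work unit (illustrative)

def split_into_work_units(data: bytes) -> Queue:
    """Divide the raw data into independent, self-contained chunks."""
    q = Queue()
    for i in range(0, len(data), CHUNK):
        q.put((i // CHUNK, data[i:i + CHUNK]))
    return q

def analyse(unit: bytes) -> bool:
    """Volunteer-side step: does this chunk hold a candidate signal?
    A placeholder for the real signal-processing, which hunts for
    narrow-band spikes in the noise."""
    return sum(unit) % 97 == 0  # toy test, not real science

def run_campaign(data: bytes) -> dict:
    """Server side: hand out units, collect results as they come back."""
    pending = split_into_work_units(data)
    results = {}
    while not pending.empty():
        unit_id, unit = pending.get()      # in reality, sent over the Net
        results[unit_id] = analyse(unit)   # ...and computed by a volunteer
    return results

hits = run_campaign(bytes(range(256)) * 40)
print(sum(hits.values()), "candidate work units out of", len(hits))
```

Because each work unit is independent, the volunteers need never talk to one another, which is what lets a million loosely connected home machines behave like one very large computer.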