Superpowerful

COMPUTER TECHNOLOGY: The information technology revolution is old news but one area is developing at breakneck speed, with Ireland playing a leading role in its development: meet the super-computer

BEFORE THE personal computer there was the mini-computer, a costly fridge-sized machine used by businesses for tasks that seem laughably simple now. Before the mini-computer there was the mainframe, a room-sized device used for . . . you get the picture. In fact, mainframes are still in use today, mostly for tedious high-volume batch-processing jobs, but there was, and is, another kind of often gigantic computer that is a lot more interesting: the super-computer.

Super-computing or, more broadly, high-performance computing (HPC) is about as far removed from the world of e-mail, web browsing and Facebook as you can get. In a nutshell, super-computers are simply very fast computers, but there’s a lot more to the industry than that: the goal of super-computing – pure calculation speed – means these machines are used for computations in everything from climate research to oil and gas exploration, as well as military defence and “econophysics”, the place where banking and physics meet head-on.

The first super-computer was designed by Seymour Cray, later to found the famed Cray Research, in the 1960s while he was working at Control Data Corporation. Cray’s machine was eventually overtaken by low-end computers that caught up, but by then he had gone on to develop new, faster machines. This, in a nutshell, is the story of super-computing. Developments at the bleeding edge allow the rest of us to catch up with yesterday’s performance – but by the time we do, the professionals are already living in tomorrow’s world.

The latest development is to shift as much processing as possible from general-purpose CPU (central processing unit) cores to GPU (graphics processing unit) cores, which are built to run many simple calculations in parallel. The result is a dramatic increase in computational speed for suitable applications.
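To make the idea concrete, here is a minimal sketch in Python (not taken from ICHEC or SGI) of how the same numerical workload can be offloaded from CPU to GPU. It assumes the optional CuPy library, a GPU-enabled counterpart of NumPy; if CuPy is not installed, the identical code simply runs on the CPU.

import numpy as np

try:
    import cupy as xp          # GPU path: arrays live in graphics memory
    backend = "GPU"
except ImportError:            # assumption: fall back to NumPy on the CPU
    xp = np
    backend = "CPU"

# The same data-parallel workload - a large matrix multiplication -
# runs on whichever processor is available, with no other code changes.
a = xp.random.random((2000, 2000))
b = xp.random.random((2000, 2000))
c = a @ b                      # thousands of GPU cores share this work in parallel

print("ran on", backend, "- checksum:", float(c.sum()))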

Ireland is home to one of Europe’s best-equipped super-computing centres: the Irish Centre for High-End Computing (ICHEC). Headed up by physicist Prof James Slevin, ICHEC was founded in 2005 and is jointly funded by Science Foundation Ireland (SFI) and the Higher Education Authority (HEA). Super-computers are frighteningly expensive, making funding vital. ICHEC, home to some of Ireland’s most powerful computers – machines that can compete on a global scale – effectively brokers its capacity to clients.

Slevin says there was some resistance to centralising super-computing capacity as every research institute wanted its own machines. “I pointed out that the last thing anybody needed was to have to look after their own system. Quite apart from the cost, they’re complex and require constant, dedicated attention,” he says.

Slevin says there are also other advantages to the ICHEC model. The largest GPU cluster in Europe is a machine designed by Silicon Graphics International (SGI) and owned by the French oil company Total. “The oil and gas people are very specialised; they have a core set of software – they can throw all their resources into two or three sets of codes, whereas we might have 30 or 40 to run,” says Slevin. To keep up, ICHEC has to employ a dedicated team of super-computer software developers, which results in a concentration of specialist skills in Ireland disproportionate to the country’s population size and international significance.

One of ICHEC’s clients is Met Éireann, which uses super-computing for both daily weather forecasting and long-term climate modelling. Observations are collected from across the northern hemisphere using satellites and aircraft, and all of this data is passed to ICHEC, where it is used to produce a three-dimensional picture of what the atmosphere looks like at a particular hour. “We end up with a 3D grid and at the intersecting points can see temperature, humidity, wind speed and wind direction,” says Ray McGrath, head of research at Met Éireann.
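As an illustration only – this is not Met Éireann’s actual code, and the grid resolution and readings are assumed – the 3D structure McGrath describes can be pictured as a set of arrays indexed by latitude, longitude and height:

import numpy as np

n_lat, n_lon, n_level = 180, 360, 40            # assumed grid resolution
fields = ["temperature", "humidity", "wind_speed", "wind_direction"]

# One array per physical quantity, indexed by (latitude, longitude, level).
atmosphere = {name: np.zeros((n_lat, n_lon, n_level)) for name in fields}

# In practice, satellite and aircraft observations are interpolated onto
# the grid; here we simply write one hypothetical reading at one point.
atmosphere["temperature"][53, 9, 0] = 11.5      # degrees Celsius near the surface
atmosphere["wind_speed"][53, 9, 0] = 7.2        # metres per second

print(atmosphere["temperature"].shape)          # (180, 360, 40)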

Then there is the growing issue of predicting climate change. “We have a project called C4I, Community Climate Change for Ireland. In the case of climate modelling you’re looking at running not for two days but for a hundred years,” says McGrath.

Computer modelling does not have the best reputation. The recent climate scandal involved the alleged manipulation of the datasets used to produce computer models, and in the press computer modelling has often been presented as scientific fact. This is not necessarily the case – some describe it as merely a tool used by scientists, engineers and researchers. Slevin, however, disagrees. “You have experimentation, theory and modelling – computational modelling is now taken as a third pillar of science,” he says. “There have been huge advances in material science thanks to computational modelling – nanotechnology, DNA, this stuff is not unimportant.”

Despite the arguments about their findings, computer models represent a significant leap forward for research. The key developments are speed and the ability to model systems as a whole.

During the 1970s, climate modelling was primarily pen- and paper-based. By 1980, computers were helping climatologists model selective environmental layers, such as the inner core of the ocean and polar caps. By 1990, these selective layers had complete models. Today, it takes one day of computing to model one day of Earth’s climate. Exascale – the development of systems that can handle a million trillion calculations per second – will allow climatologists to model one year of climate in one day. This will allow scientists to run real policy “what ifs” such as: “What happens if we double coal production?”; “What happens if car emissions are cut in half?” and so on.
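The arithmetic behind those figures, as a rough, illustrative calculation (the 365-fold figure is an assumption drawn from the one-day-per-day comparison above):

# "A million trillion" calculations per second is 10**6 * 10**12 = 10**18
# operations per second, i.e. exascale.
exascale_ops_per_second = 10**6 * 10**12
print(f"{exascale_ops_per_second:.0e} ops/s")   # 1e+18 ops/s

# If one day of computing currently yields one simulated day of climate,
# simulating a whole year in a single day needs roughly 365 times the
# sustained throughput.
days_per_year = 365
print(f"speed-up needed: about {days_per_year}x")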

One surprising area for super-computing is the “cloud”, the mass transfer of data storage to remote locations, thereby lowering server and energy costs.

The Irish government has earmarked this area for development as part of its “smart economy” plans. In July, Minister for Communications Eamon Ryan noted that EMC Ireland’s data centre in Cork already supported 1,700 jobs. “Choosing to prioritise the development of these centres across Ireland will create 10,000 high-value jobs over the next five to 10 years,” he said.

Remote data storage is not a new idea and can be done with typical low-end server equipment, but the move to centralise processing on geographically distant computers, not dissimilar to ICHEC’s function, requires a qualitative leap in processing power. “There’s two things going on simultaneously,” says Jason Coari, senior marketing manager with super-computer hardware manufacturer SGI. “The services that the likes of Amazon are offering to businesses is one, and the second thing that is going on is the internal cloud – an oil and gas company like Total [in France] are trying to provide HPC services over the cloud: high performance over a network.”

During the 1980s and 1990s SGI was known for its groundbreaking high-performance graphical workstations, but the company declined in the last decade due to an assault from the low end: Windows, Macintosh and Linux desktop computers. It emerged from bankruptcy protection a slimmer operation, merged with competitor Rackable Systems, and is now once again enjoying success by concentrating on the extreme high end of performance-oriented computing. For now this is primarily aimed at scientific research in the academy. Despite the focus on the cloud, SGI recently launched a desktop super-computer. “In marketplace adoption of cloud and HPC we’re on the leading edge, but things could change as people evaluate what needs to be on the desktop and what is better centralised,” says Coari.

Other fields requiring HPC include bioinformatics, analysis of fluid dynamics in physics and, arguably, the two most interesting of all: military defence and “econophysics”. No one in these sectors was keen to talk about their work, but that doesn’t mean no information is available. Famously, one failed super-computer manufacturer of the 1980s and 1990s, Thinking Machines Corporation, was kept alive solely by contracts from US intelligence agencies. The US military, meanwhile, has just removed from its servers a document that announced plans to buy 2,200 Sony Playstation 3 games consoles to increase the capacity of its already existing Playstation-based super-computer. The Justification Review Document, which can still be found in cached form on the internet, says “the new PS3s will be placed in a cluster environment with an existing cluster of 336 PS3s by connecting each of the units”.

As bizarre as this sounds, it is true. Terra Soft Solutions, a firm in the US, has been working with Sony to develop Playstation 3-based super-computers since before the machine was available to the public. The games machine’s CPU, called Cell, is a cost-effective and powerful multicore system.

Whatever the American military is doing with this machine – and it’s certainly not playing games – the Justification Review Document notes the power of the Cell CPU while suggesting a potential future move to GPU technology.

One US military cluster built around the same Cell processor (it may be related – clear information is not available) has the codename Roadrunner and is located at Los Alamos National Laboratory, the site of the development of the atom bomb. It broke a computation record in 2008 by achieving 1.026 quadrillion calculations per second. This machine is used for the study of nuclear weapons. Playstation 3 owners may be less than thrilled to know that, because the machine’s retail price is lower than its manufacturing cost and Sony’s profits are made on software sales alone, they are in effect subsidising the US military’s super-computer purchases every time they buy a game.

The banking sector, which has an equally “interesting” reputation, is also a major super-computer user. One use is “econophysics”: the modelling and analysis of financial systems – trading, banking and the stock, commodity and other markets – in a way that rejects economic theory in favour of physics-derived statistical computation. The discipline claims to have identified the statistical behaviour of the underlying networks in the financial system.

Super-computing’s darker side is impossible to ignore. Not only is it the reason so little is known about the sector outside of the computer industry, it also reflects the fact that, from the Hollerith census machines that prefigured IBM, to the simulated paranoia of the Cold War, computing has always been driven by something other than the personal productivity and communication we associate it with.

According to James Woudhuysen, professor of forecasting and innovation at De Montfort University, we have a complicated relationship with technology. “We have a love-fear relationship with technology – mainly fear. The research and development recession in the west clearly demonstrates this,” he says.

Unpleasant-sounding applications have underpinned the entire development of computing, from the integrated circuit onward. As Dennis Hayes demonstrated in his seminal book Behind the Silicon Curtain, despite Silicon Valley’s propaganda about risk-taking venture capitalists funding new technologies, the entire information technology industry owes its development to Pentagon funding.

“American IT boosters have long seen the movement of electrons as confirmation of the power of markets, just as people now see inter-firm competition in terms of Darwin, ecosystems of innovation and so on. IT boosters know little of politics, or of economics. They forget that IT is a social question, and are no good at social theory anyway,” says Woudhuysen. Super-computing is no different, but at least in this area Ireland is not contributing to the R&D recession.