Y2K: crisis averted or con?
Action stations: back in 1999, the United Nations' International Civil Aviation Organisation feared the worst as it monitored aviation systems globally.
Just 10 years ago, we started to breathe easily as a potential software disaster was averted. But was it all a cynically lucrative exercise, asks GORDON SMITH
IT’S A funny thing, perspective. Just over 10 years ago, media coverage about the Year 2000 date-change problem speculated on the possible repercussions of widespread computer failure.
In the build-up to the changeover from 1999 to 2000, dire warnings of digital disaster and financial meltdown were a frequent occurrence. If nothing was done to correct the problem, global catastrophe could result, we were told.
It potentially affected anything controlled by computers, from traffic lights to satellites and from ATMs to airports. A 1993 Computerworld article widely credited with bringing the problem to the world’s attention used a single word for its headline: Doomsday.
Its author, industry consultant Peter de Jager, described the looming event as “more devastating than a car crash”.
The problem centred on two digits – or more exactly, the lack of them. Limitations of computer memory had led software programmers to devise a workaround, representing years as two-digit abbreviations, so 1970 became simply 70. Shortening 2000 to 00 would cause many computers and embedded systems to think it was 1900.
Subtracting the year of someone’s birth from 00 would give a minus number that could affect anything from social welfare payments to medical records. Bank account interest calculations were also at risk.
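The arithmetic failure described above can be sketched in a few lines. This is an illustrative example, not code from any of the systems mentioned in the article; the function name and values are hypothetical.

```python
def age_two_digit(birth_year_yy: int, current_year_yy: int) -> int:
    """Age computed the pre-Y2K way, with years stored as two digits."""
    return current_year_yy - birth_year_yy

# In 1999, someone born in 1970 looks fine: 99 - 70 = 29.
print(age_two_digit(70, 99))  # 29

# After the rollover, "00" reads as 1900, so the same calculation
# yields a negative age: 0 - 70 = -70.
print(age_two_digit(70, 0))   # -70
```

Any downstream logic that assumed a non-negative age, such as a welfare entitlement check or an interest calculation, would then misfire, which is why the same small flaw surfaced across so many unrelated systems.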
“In every 10 lines of code, there’s some reference to time, and time triggers things,” says Frank Murray, chief executive of software services firm Piercom, which was one of the most prominent Irish companies involved in tackling the problem.
“We knew it was quite endemic as an issue. The biggest problem was, people didn’t know where those triggers might be,” he says.
No one had ever thought to correct the date-change problem beforehand because few programmers reckoned their systems would still be the beating heart of banks and utilities decades hence.
“The problem was not one of negligence, it was one of endurance,” says Robert X Cringely, whose book Accidental Empires chronicles the rise of the computer industry. “There was no expectation that those extra digits would be needed. They just assumed that succeeding generations of hardware and software would come along and the stuff they were writing in 1953 would no longer be needed in 1999. To fault them on that would be ludicrous, except it turns out that old code never dies, unless you send out a death squad to kill it.”
It also turns out those software “death squads” came with a hefty price tag. De Jager now describes the 1993 article as understated. “I predicted it would cost billions of dollars; I never imagined it would cost close to $300 billion worldwide to fix,” he says.
Everyone was prepared for the worst but, by early January 2000, it became apparent that the world had not in fact come to an end.
Much of the reportage then shifted noticeably, striking a tone of annoyance at being duped rather than relief that nothing much had happened. Ten years on, Y2K is a punchline and the joke, it seems, was on us.
Interestingly, de Jager’s original article forecast the industry would suffer a credibility crisis. It did, but not exactly as he predicted. Programmers weren’t criticised for having messed up badly on the date problem. The snag was arguably that the problem was solved too well.
Y2K has become shorthand for global panics that turn out to be unfounded.
“The public perception perpetrated by the media that this was a hoax has done a great disservice to the industry,” says de Jager now.
“Organisations did not spend $300 billion worldwide because someone ‘said’ there was a problem. Nobody is that gullible. They spent $300 billion because they tested their systems with ‘00’ dates – and the systems stopped working,” he insists.
“People said afterwards ‘what was the fuss about?’ and thought it was consultants pulling the wool over their eyes but I think Y2K is a good example of risk mitigation,” adds Gerry Joyce, a risk management expert with LinkResQ. “The problem was identified early enough and further problems didn’t happen because steps were taken to prevent them.”
Felix O’Regan, head of public affairs with the Irish Bankers’ Federation (IBF), worked through the night on December 31st, 1999, receiving status reports from all of the banks operating in Ireland. “It wasn’t a stressful night,” he recalls. “In certain respects it was harder to keep my eyes open because there hadn’t been any problems.”
That wasn’t just good luck but the culmination of extensive preparations, he adds. An IBF working group spent all of 1999 looking at how banks’ IT systems would be able to interoperate with one another. Individual banks had begun their own in-house Y2K programmes much earlier.
A Government steering group also met regularly to ensure the date change didn’t trigger further complications in the wider economy. “That was so retailers wouldn’t find themselves in trouble if card-processing systems malfunctioned, and that consumers wouldn’t be affected if the ATM system didn’t work,” says O’Regan.
There is some dispute over exactly how much was spent to fix the problem. RTÉ news at the time referred to a figure as high as $600 billion worldwide. At the time, the Small Firms Association estimated the cost to Irish small business at about £400 million. A perceived lack of value for money may partly account for some of the post-Y2K negative backlash.
Cringely offers another explanation. “I believe a terrific amount of Y2K fraud took place. There was a lot of money that was spent and it wasn’t visible. The question is whether the right work was done and my guess is probably about half that money was just wasted,” he says.
On balance, he believes it was a price worth paying. “We’re lucky if we only had to pay for it twice. In big IT projects, where the system is reworked, the failure rate is over 50 per cent. By failure I mean you get nothing for your money. Either we got lucky or we overbought enough or it wasn’t as hard as we thought but I still tend to think we had to get it done.”
Frank Murray sees a parallel with the dot.com boom which was in full swing at the same time. He believes fundamentally sound technology was subverted by people who saw dollar signs. “A lot of companies who came to the problem in an opportunistic revenue-generating way made some money,” he says.
Y2K was no big payday for Piercom. “Commercially it could have been better for the company,” admits Murray. “We didn’t come to it saying ‘we’re going to make lots of dosh here’. We were a technology company that saw a problem that needed solving.” Piercom did, however, win plaudits for its work, including a Smithsonian award for the toolset it developed to fix the bug.
There were also wider benefits. Many processes developed to solve Y2K were quickly put to good use on the euro currency changeover. “I’ve no doubt that’s why that went so successfully,” says O’Regan.
Meanwhile, Cringely credits Y2K with giving people a sense that the world was more networked and connected than it really was, preparing us for a time when technology really does follow us anywhere.
He also believes we may never again see such a concentrated spend on IT as at the turn of the century.
“I wouldn’t argue that it will never happen again, but I don’t think we have standing in front of us an opportunity or motivation for it happening soon.”