Imagine buying an electric toaster and shortly afterwards learning it has a serious design flaw. The toaster works all right at the moment, but in six months' time the flaw may cause it to fail and possibly become dangerous.
The design flaw is the result of an elementary error on the part of the manufacturer. The manufacturer has been aware of this flaw for a long time but made little effort to correct it.
You return the toaster to the shop where you bought it and ask that it be either replaced with a flawless toaster or returned to the manufacturer to have the flaw repaired. You are told that the manufacturer will neither replace nor repair such toasters, that the repair is your responsibility, and that, at this stage, you are irresponsibly tardy in dealing with the matter. You go away feeling guilty and hire an expert at your own expense to fix the toaster.
"No," I can hear you protest, "I wouldn't do that - I would demand satisfaction from the manufacturer." Well, maybe you would, but not if the Year 2000 (Y2K) computer problem is a good example to go by.
The Y2K problem is also referred to as the millennium bug or the millennium date problem. Many computer and control systems have a Y2K flaw that must be corrected before January 1st next, or havoc will ensue.
The correction is costing a fortune, both in money and in scarce computer programmers' time. The burden of this expense and trouble has fallen on the shoulders of the purchasers of this flawed technology and not on the information technology (IT) sector that produced the faulty systems.
We are now in the midst of a constant din of publicity, activity, and hype about the Y2K problem, but one element is conspicuously absent from all of the clamour - I am unaware of any significant criticism or anger directed by the users of this flawed technology against the perpetrators of the mistake.
WHAT is the basis for the Y2K problem? Early computer programmers were under pressure to save memory when writing software. One of the techniques used to economise was to record the year as two digits. For example, June 10th, 1980 was stored as 10/06/80. Using two digits for the year works fine so long as you remain in the same century. However, if you tell your computer that 99 means the year 1999, then when it moves on to the year 2000 and looks at the figure 00, it thinks it is in the year 1900.
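To make the mechanics concrete, here is a minimal sketch in Python (a modern stand-in for the languages actually involved) of how software turns a stored two-digit year back into a full year, with the century 1900 baked in as early programs assumed:

```python
# A minimal sketch of the two-digit year convention described above.
# The hard-coded century of 1900 is the hidden assumption behind the
# Y2K problem: it holds only while the real date stays in the 1900s.

def year_from_two_digits(yy: int) -> int:
    """Interpret a stored two-digit year the way early software did."""
    return 1900 + yy

print(year_from_two_digits(99))  # 1999 - correct
print(year_from_two_digits(0))   # 1900 - the computer thinks 00 means 1900
```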
Programmes continued to be written and chips designed throughout the 1970s and 1980s using the two-digit year convention. Programmers predicted that all these systems would be entirely replaced by 2000. They were wrong. As a result, very many computer systems currently in operation identify calendar year dates only through the last two digits of the year. Without conversion, these systems would not recognise 00 as 2000.
Hence the current frantic activity to reprogramme systems in order to correct this problem. Without this conversion, programmes that depend for their operation on the recognition of the correct date would crash, or at least fail to function correctly, on January 1st, 2000.
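One widely used repair technique is "windowing": rather than expanding every stored date to four digits, the program reinterprets two-digit years relative to a pivot, so that a 100-year window straddles the century boundary. The sketch below is illustrative rather than any particular vendor's fix, and the pivot value of 50 is an arbitrary choice:

```python
# A sketch of "windowing", one common Y2K remediation technique:
# two-digit years below a chosen pivot are read as 20xx, the rest as 19xx.
# The pivot of 50 is an illustrative choice, giving a 1950-2049 window.

PIVOT = 50

def windowed_year(yy: int) -> int:
    """Map a stored two-digit year onto the window 1950-2049."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(windowed_year(99))  # 1999
print(windowed_year(0))   # 2000 - now read correctly
```

Windowing is cheaper than rewriting every record, but it does not remove the cliff; it merely moves it to the pivot year.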
Visualise a bank computer that calculates the interest on your deposit account. The machine subtracts the date at the start of the period from the date at the end of the period and multiplies the answer by the interest rate and by the sum on deposit. Say you have £100,000 on deposit at 5 per cent per annum interest. The bank will credit your account with £5,000 if the period in question is one year. However, if the end of the period falls in 2000 the computer will (if it has not been corrected) subtract 99 from 00 and get an answer of minus 99 years.
WHO knows what the computer would make of this? It might well decide that you are owed £5,000 a year for minus 99 years and that therefore you owe the bank £495,000. On the other hand, it might ignore the minus sign and deposit £495,000 interest in your account.
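To see the arithmetic plainly, here is a toy reconstruction in Python of the calculation just described; the naive subtraction of two-digit years is an assumption standing in for whatever a real, unconverted banking program would do:

```python
# A toy reconstruction of the bank example above, using the figures
# from the text. The naive two-digit subtraction stands in for real,
# unconverted banking software.

principal = 100_000   # £100,000 on deposit
rate = 0.05           # 5 per cent per annum

# Corrected system: one year elapses from 1999 to 2000.
print(principal * rate * (2000 - 1999))   # 5000.0, i.e. £5,000 interest

# Unconverted system: only the last two digits of each year survive.
print(principal * rate * (0 - 99))        # -495000.0, i.e. minus £495,000
```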
Nobody really knows how an unconverted computer programme would behave. A possible disastrous scenario is that irretrievable data would be overwritten with nonsensical figures.
Almost all business software is date-dependent. The management of mortgages, insurance premiums, invoices and orders all depend on the correct computer manipulation of dates.
Beyond the business sector, all kinds of areas could be affected by the Y2K problem: public utilities, maintenance and provision of supplies, control of power stations, defence systems, weather forecasting, shipping and many other essential services, as well as many household appliances, motor vehicles and aircraft (the latter three mainly because of embedded chips).
Intensive worldwide efforts have been going on over the last two or three years to fix the Y2K problem. The cost of a global fix is estimated at between $300 billion and $600 billion, excluding consequential disruption losses; as much as half of the repair bill may be ascribed to exhaustive system testing. This is a massive cost imposed by a design flaw whose consequences were entirely foreseeable.
What sort of industry is the IT sector that it can make such a simple and massive blunder so early in its career?
What kind of relationship have we got with this sector that we apparently don't feel the need to scold it for its damaging carelessness and it doesn't feel an obligation to even adopt a sheepish attitude in our presence? Have we elevated IT into a god that must be worshipped come what may?
History shows that such arbitrary gods expend most of their energy looking after themselves and have little left to look after their subjects.
William Reville is a senior lecturer in biochemistry and director of microscopy at UCC.