THREE years from today you may be in despair. The computers you rely on to do your job have thrown a collective fit: generating rubbish results, refusing to accept input you know is valid, crashing constantly. Similar problems plague your suppliers and customers. This morning's post brought a batch of forms starting "this account is now 99 years overdue and unless we receive immediate payment...". Your own billing system won't issue invoices, since it thinks all pre-Christmas deliveries were made 99 years into the future.
You'd forget work for a week and head off for a holiday if you thought it was safe. But that charter flight tragedy in the Canaries has put a lot of people off, and there have been near misses as air traffic control systems "forgot" about planes in their airspace, or collapsed altogether.
You might even retire, but your pension fund thinks you're 35, not 65, and isn't offering much. That aside, the value of the fund probably took a hammering in the stock market meltdown on January 1st and 2nd, when automatic trading systems decided every stock they held was underperforming and dumped them on the market to "sell at any price".
Welcome to the computer industry's millennium hangover. After 30 years of good times, during which computers moved into every area of our lives, they have shown up the fallibility of their human creators.
MAYBE it won't be like that at all. Maybe the efforts already under way will succeed and January 1st 2000 will pass as peacefully as most of the recent dates on which we were told lurking viruses would paralyse computers worldwide. Maybe...
More general fear and uncertainty about the passage into a new millennium underlie a lot of the hype about the Year 2000 problem. Most "pre-millennium tension" articles published around New Year made at least passing reference to the particular problems for computers.
Many people - including computer professionals - are concerned about our reliance on computers for all sorts of tasks, from flying an aircraft to judging how many seconds a chicken needs in the microwave. They are also concerned about the blind faith put in computerised systems, and the impossibility of justifying that faith by proving that a given system will work properly in all circumstances.
Modern programs are too large to be grasped whole by any individual. With millions of lines of code, they are too complex ever to be certified utterly free of bugs or errors. So there is every reason to be nervous of something that exposes basic flaws in systems we rely on.
AS every schoolchild knows by now, the origin of the problem is relatively simple. In the early days of computing, memory, processing power and storage were scarce and expensive. One way to conserve them was to be as succinct as possible with any information being held. It made sense to store only two digits for the year: "70" instead of "1970".
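To see how little room a date then took up, here is a minimal sketch in C - hypothetical, not drawn from any real system - of the kind of packed record that two-digit years made possible:

    /* A hypothetical packed date record: with the century dropped,
       a date fits in three bytes instead of four or more. */
    #include <stdio.h>

    struct short_date {
        unsigned char day;    /* 1-31 */
        unsigned char month;  /* 1-12 */
        unsigned char year;   /* last two digits only: 70 means 1970 */
    };

    int main(void)
    {
        struct short_date d = { 25, 12, 70 };   /* 25 December 1970 */
        printf("%02d/%02d/%02d\n", d.day, d.month, d.year);
        printf("record size: %d bytes\n", (int) sizeof d);
        return 0;
    }

Multiplied across millions of records held on expensive disks and tapes, those two saved digits per date added up.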
Usually, this worked quite well. The problems only arise when the system is asked to move from 1999 ("99", to its way of thinking) to 2000 (which to the system would be "00", or 1900). What happens when the system thinks most of its files are 99 years old? How does it calculate interest past 1999?
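Another minimal sketch - again illustrative rather than taken from any production code - shows how the simplest date arithmetic goes wrong at the boundary:

    #include <stdio.h>

    /* Naive elapsed-years calculation on two-digit years, as many
       legacy programs did it: fine while both dates fall within the
       same century. */
    static int years_elapsed(int from_yy, int to_yy)
    {
        return to_yy - from_yy;
    }

    int main(void)
    {
        /* A delivery dated "99" (1999), billed in "00" (meant to be
           2000 but read as 1900): the gap comes out as -99, so the
           delivery appears to lie 99 years in the future. */
        printf("%d\n", years_elapsed(99, 0));   /* prints -99 */

        /* A program that takes the magnitude of that result, or puts
           it into an unsigned field, sees 99 years instead; hence
           letters claiming an account is "99 years overdue". */
        return 0;
    }

Either way, a gap of a few weeks straddling New Year's Eve 1999 is misread as a gap of 99 years.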
Early systems were not expected to last until the turn of the century. But even when the early computers were scrapped, the programs they ran were often moved to new hardware. Heavily customised software represents a large investment, and companies will go on using it for as long as possible. Without modification, that will not be much longer.
Estimates of the cost of fixing the problem range up to $600 billion - or half of the world's information technology budget.