While many were ready to party "like it was 1999," others predicted catastrophe at the end of the year, all stemming from a small assumption made decades earlier when computers were first being programmed.
Considering how much of our everyday lives were run by computers by the end of 1999, the new year was expected to bring serious computer repercussions. Some doomsayers warned that the Y2K bug was going to end civilization.
Other people worried more specifically about banks, traffic lights, the power grid, and airports -- all of which were run by computers. Even microwaves and televisions were predicted to be affected by the Y2K bug. As programmers scrambled to update software before the deadline, many in the public prepared themselves by stockpiling extra cash and food supplies.
What Is the Y2K Problem?
The cause of the Y2K problem was pretty simple. Until 2000, computer programmers had the habit of using two-digit placeholders for the year portion of the date in their software. For example, the expiration date for a typical insurance policy or credit card was stored in a computer file in MM/DD/YY format (e.g., 08/31/99). Programmers did this for a variety of reasons, including:
- That's how everyone wrote dates in everyday life. When you write a check by hand using the "slash" format for the date, you write the year with two digits.
- It takes less space to store 2 digits instead of 4 (not a big deal now because hard disks are so cheap, but it was once a big deal on older machines).
- Standards agencies did not recommend a 4-digit date format until recently.
- No one expected a lot of this software to have such a long lifetime. People writing software in 1970 had no reason to believe the software would still be in use 30 years later.
The two-digit year format created a problem for most programs when "00" was entered for the year. The software had no way to know whether "00" meant 1900 or 2000. Because the code most programmers wrote made no explicit provision for the century, the "19" prefix was effectively baked in, so most programs defaulted to 1900. That wouldn't have been a problem except that programs perform lots of calculations on dates. For example, to calculate how old you are, a program takes today's date and subtracts your birthdate from it. That subtraction works fine on two-digit years until today's date and your birthdate fall in different centuries. If the program thinks that today's date is 1/1/00 and your birthday is 1/1/65, it may calculate that you are -65 years old rather than 35. As a result, date calculations produced erroneous output, and software crashed or returned wrong results.
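Here is a minimal sketch of that failure, in Python for illustration (the affected systems were mostly written in older languages such as COBOL, and the function name here is hypothetical):

```python
def age_from_two_digit_years(today_yy, birth_yy):
    # The century is implicitly "19" -- exactly the assumption
    # baked into pre-Y2K code.
    return (1900 + today_yy) - (1900 + birth_yy)

# Works while both dates fall in the same century:
print(age_from_two_digit_years(99, 65))  # 34

# On January 1, 2000 the clock rolls over to "00":
print(age_from_two_digit_years(0, 65))   # -65, not the expected 35
```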
The solution, obviously, was to fix the programs so that they handled dates properly. There were two standard approaches:
- Recode the software so that it understands that years like 00, 01, 02, etc. really mean 2000, 2001, 2002, etc. (see the sketch after this list).
- "Truly fix the problem" by using 4-digit placeholders for years and recoding all the software to deal with 4-digit dates. [Interesting thought question - why use 4 digits for the year? Why not use 5, or even 6? Because most people assume that no one will be using this software 8,000 years from now, and that seems like a reasonable assumption. Now you can see how we got ourselves into the Y2K problem...]
Either of these fixes was easy at the conceptual level - you go into the code, find every date calculation, and change each one to handle dates properly. The trouble was that there were millions of places in software that had to be fixed, and each fix had to be made by hand and then tested. For example, an insurance company might have 20 or 30 million lines of code performing its insurance calculations. Inside that code there might be 100,000 or 200,000 date calculations. Depending on how the code was written, programmers might have had to go in by hand and modify every point in the program that used a date, and then test each change. The testing was the hard part in most cases - it could take a lot of time.
If you figure it took one day to make and test each change, there were 100,000 changes to make, and a person worked 200 days a year, then it would have taken 500 people a full year to make all the changes. Most companies didn't have 500 idle programmers sitting around for a year, so they had to go hire those people, which made this a pretty expensive problem. A programmer costs something like $150,000 per year (once you include salary, benefits, office space, equipment, management, training, and so on), so you can see that fixing all of the date calculations in a large program could cost a company tens of millions of dollars. But they had to do it anyway. It was a boom time for programmers with database and legacy-systems experience.
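Spelling that estimate out with the article's illustrative figures:

```python
changes = 100_000              # date calculations to fix
workdays_per_year = 200        # one change made and tested per day
cost_per_programmer = 150_000  # fully loaded annual cost, in dollars

person_years = changes / workdays_per_year
total_cost = person_years * cost_per_programmer

print(person_years)           # 500.0 person-years of work
print(f"${total_cost:,.0f}")  # $75,000,000 -- "tens of millions"
```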
So, when the date changed to January 1, 2000, very little actually happened. With so much preparation and updated code in place before the rollover, catastrophe was averted and only a few relatively minor millennium-bug problems occurred.