Y2K and ME
Opinions on the Y2K, or Year Two Thousand, issue span both ends of the spectrum: ignorance, apathy, and denial at one end, doomsday predictions at the other. Which one you believe depends upon the information you receive and your personality.
The average person who does not use a computer has little idea what is going on, beyond bits and pieces heard about a Millennium Bug and the fact that Y2K has something to do with the year 2000.
While the average person may not know the technical aspects of it, it is the average person who will be most affected by it. Many people still do not have personal computers in their homes, but they have computers in their cars. They bank at the ATM, which is controlled by a computer. Checking, savings and retirement accounts are managed by computers, as is every bill they pay. Computers manage the inventories of the stores where they shop, and they run government agencies from local to national. The average person is affected in almost every aspect of his or her life by a computer. They have a lot at stake in what happens on Jan 1, 2000.
A brief summary of the issue is that programmers in the 70's and 80's stored the year as only two digits, leaving no room for the date to roll over to 2000. This means that for business computers that use those programs, COBOL being the primary language, the year 2000 will look like the year 1900. All the files on a computer are dated, so when the main computer clock reads 1900, the system may lock up or crash because every file on the machine appears to have been created after the current date. A file created in 1999 looks as though it will not exist for another 99 years.
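To make the two-digit problem concrete, here is a minimal sketch (in Python, not the actual COBOL these systems ran) of the kind of arithmetic that goes wrong at the rollover. The function name and values are illustrative, not taken from any real system.

```python
def years_elapsed(start_yy, end_yy):
    """Naive age calculation using only the last two digits of the year,
    as many legacy programs did to save storage."""
    return end_yy - start_yy

# A file stamped in 1999 ("99") checked against the year 2000 ("00"):
age = years_elapsed(99, 0)
print(age)  # -99: the file appears to be dated 99 years in the future
```

With four-digit years the same subtraction (2000 - 1999) would give the correct answer of 1; with two digits, anything dated "99" suddenly looks newer than "today."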
Why did they do that? There are several reasons. When these programs were written, storage was limited and expensive, and computers were nowhere near as powerful, so programmers took shortcuts wherever they could to save space and speed up their systems. Did they not foresee the problem? I believe yes and no. They understood that in 2000 they would have a problem, but I believe they thought that by this time software and hardware would have changed so rapidly that their systems would be either completely rewritten or obsolete. That belief also meant they felt there was no need to create thorough documentation.
They were right in a few areas. Software and hardware have advanced so rapidly that a top-of-the-line personal computer is old six months after you buy it and an antique in about three years. In many ways, the mainframe-style computers these programs were designed for are obsolete.
What they did not allow for was that many third-world countries and big companies would be unable, or unwilling, to spend the money to change over from non-2000-compliant software and hardware to compliant versions. Some folks are still looking for the silver bullet: a quick fix that will just go through the code and make all the changes fast and cheaply. I do not foresee such a solution.
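One partial remedy that was widely discussed at the time was "date windowing": rather than rewriting every record to four digits, a program interprets two-digit years relative to a pivot. The sketch below (Python, with a pivot year of 30 chosen purely for illustration) shows both why it is tempting and why it is no silver bullet.

```python
PIVOT = 30  # illustrative pivot: 00-29 read as 2000-2029, 30-99 as 1930-1999

def expand_year(yy):
    """Expand a two-digit year into four digits using a fixed pivot window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_year(99))  # 1999
print(expand_year(5))   # 2005
print(expand_year(25))  # 2025 -- but a person born in 1925 is misdated
```

The catch is visible in the last line: every program touching the data must agree on the same pivot, and any date falling outside the chosen 100-year window is still ambiguous, so the fix buys time rather than solving the problem.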
Why is it so hard? Well, remember that the programmers did not document their work, so no one can go into a program and know for sure how a specific subroutine was made to function. While everything should work to a standard, there are times when the book doesn't work, and innovation and plain luck are what it takes to get things running. If you do not share your innovation through documentation, the next programmer who sits down will make a change that should work, and it crashes the machine. He then gets innovative and may find what the first programmer did, or he may do something else, without completely overwriting the first programmer's work. So when a third programmer sits down, he is looking at code he thinks is by the book, tries something that crashes, and away we go again.