QuickBASIC/QBASIC newsletter

 
Article of the Month
 
The Millennium Bug
 
                    The reason the Millennium (Y2K) Bug exists can be summed up in one word:
            Procrastination.  This clock-and-date error, present in many computers, came about
            because programmers put off fixing their date code for far too long.  But why did
            programmers make their clocks with only two digits for the year in the first place?  To
            explain that, one must first understand some of the history of clocks in computers...

                    The Y2K Bug wasn't the first serious clock error in computers; the first one
            actually occurred when the clocks rolled over to 1970 on some mainframes.  Those clocks
            only had enough space for ten years: Jan 1st, 1960 - Dec 31st, 1969.  The reason the
            clocks kept only one digit for the year was that computers at the time had a very
            limited amount of memory (some had only a few KB back in the vacuum-tube days, heh), so
            using a single nibble (one hex digit) for the year saved a great deal of space: the
            other nibble could hold part of the hour, minute, or second, leaving the rest of memory
            free for calculations.
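                    To picture what that packing looks like, here is a small QBASIC sketch (the
            variable names and field layout are made up purely for illustration): a single year
            digit goes in the high nibble of one byte, and another small field shares the low
            nibble.

                ' Hypothetical layout: year digit (0-9, meaning 1960-1969) in the
                ' high nibble of a byte; some other 0-15 value in the low nibble.
                yearDigit% = 9                          ' 9 stands for 1969
                other% = 5                              ' any small value, 0-15
                packed% = yearDigit% * 16 + other%      ' both fields fit in one byte
                PRINT "Year:"; 1960 + (packed% \ 16)    ' unpack the high nibble -> 1969
                ' A single digit can only count 0-9, so the year after 1969
                ' wraps back around to 1960 -- the 1970 rollover described above.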
                    After the microchip was invented in the late 1960s, RAM became available for use
            in these mainframes.  The computers were still left running continuously, though, so
            there was still no need for permanent storage of the time.

                    It wasn't until 1981, when IBM released the first home computer with a built-in
            clock, that BIOS and CMOS chips really started gaining use in the computer industry for
            storing the time.  The BIOS chip used in the computers of the '80s was released on April
            4th, 1980 (you can tell, because that is the date the clock resets to when the bug is
            present).
                    Memory was still a problem in the computers of this era (mainly because of its
            expense), so the BIOS chips kept only two digits for the year (one byte).  Programs of
            the time were therefore written to read just those two digits for use in databases,
            calendars, spreadsheets, and so on (most of them written in COBOL).
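                    For the curious, here is a rough QBASIC sketch of how a PC program can pull that
            one-byte year straight out of the CMOS clock.  It assumes the usual AT-style real-time
            clock (index port &H70, data port &H71, year in register 9, stored as a BCD value);
            exact details can vary from machine to machine.

                ' Read the two-digit year byte from the CMOS real-time clock.
                OUT &H70, 9                              ' select the year register
                raw% = INP(&H71)                         ' raw BCD byte: tens digit in high nibble
                yr% = (raw% \ 16) * 10 + (raw% AND 15)   ' convert BCD to a plain 0-99 number
                PRINT "Two-digit year:"; yr%             ' 99 for 1999, 0 for 2000
                ' Nothing in this byte says whether 0 means 1900 or 2000 --
                ' that decision is left entirely to the program reading it.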
                    As time went on, the original programs were simply upgraded, and the old
            date-reading code was left buried inside them.  In the early 1990s the BIOS chips were
            updated to hold four digits (one word), but the programs still contained only the code
            for reading the last two digits.  Then COBOL died away as everyone moved to C/C++,
            Pascal, FORTRAN, and Ada.  And guess what... the old COBOL code still remained, able to
            read only two digits.
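                    The practical effect is easy to demonstrate.  In this little QBASIC sketch (the
            values are made up), a program that keeps only two-digit years decides that the year
            2000 comes before 1965 and computes a nonsense age:

                birthYear% = 65                    ' meant to be 1965
                currentYear% = 0                   ' meant to be 2000, stored as "00"
                age% = currentYear% - birthYear%   ' 0 - 65 = -65 instead of 35
                PRINT "Computed age:"; age%
                IF currentYear% < birthYear% THEN
                    PRINT "2000 sorts BEFORE 1965 -- the heart of the Y2K Bug"
                END IF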

                    Now everyone is in a mad rush to correct the Y2K Bug, which requires either
            rewriting the programs from scratch or rehiring the old programmers to scan thousands
            of lines of COBOL code for the places where the date-reading lies.  It is estimated
            that correcting the Y2K Bug will cost $300 billion worldwide.  See what procrastination
            does?
                    Will the Y2K disaster be averted?  Only time will tell...
 

--Danny Gump

 
