The Millennium Bug

Excerpt from Time Magazine's "The History and the Hype"

Two digits. That's all. Just two lousy digits. 1957, they should have written, not 57. 1970 rather than 70. Most important, 01-01-2000 would have been infinitely preferable to 01-01-00. Though most of the dire predictions connected with that date (the Year 2000 computer bug's moment of truth) are unlikely to come true, a little computer-generated chaos would provide a fitting conclusion to a 40-year story of human frailties: greed, shortsightedness and a tendency to rush into new technologies before thinking them through.

How did this happen? Who is responsible for the bug we call Y2K? Conventional wisdom goes something like this: back in the 1950s, when computers were the size of office cubicles and the most advanced data-storage system came on strips of punched cardboard, several scientists, including a Navy officer named Grace Murray Hopper, begat a standard programming language called COBOL (Common Business-Oriented Language). To save precious space on the 80-column punch cards, COBOL programmers used just six digits to render the day’s date: two for the day, two for the month, two for the year. It was the middle of the century, and nobody cared much about what would happen at the next click of the cosmic odometer. But today the world runs on computers, and older machines run on jury-rigged versions of COBOL that may well crash or go senile when they hit a double-zero date. So the finger of blame for the approaching crisis should point at Hopper and her COBOL cohorts, right?
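As a sketch of what that convention looked like in practice, a COBOL data division of the era might have declared a date along these lines (the record and field names here are hypothetical, not taken from any real system):

    01  TRANSACTION-DATE.
        05  TX-DAY    PIC 99.
        05  TX-MONTH  PIC 99.
        05  TX-YEAR   PIC 99.  *> six digits in all; 1969 and 2069 both come out as 69

Six card columns per date instead of eight, multiplied across millions of punch cards, was a real saving at the time.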

Wrong. Nothing, especially in the world of computing, is ever that simple. "It was the fault of everybody, just everybody," says Robert Bemer, the onetime IBM whiz kid who wrote much of COBOL. "If Grace Hopper and I were at fault, it was for making the language so easy that anybody could get in on the act." And anybody did, including a group of Mormons in the late '50s who wanted to enlist the newfangled machines in their massive genealogy project, clearly the kind of work that calls for thinking outside the 20th-century box. Bemer obliged by inventing the picture clause, which allowed for a four-digit year. From this point on, more than 40 years ahead of schedule, the technology was available for every computer in the world to become Y2K compliant.
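In the same sketch form as above (field names still hypothetical), the picture clause reduced the fix to widening a single field:

    01  TRANSACTION-DATE.
        05  TX-DAY    PIC 99.
        05  TX-MONTH  PIC 99.
        05  TX-YEAR   PIC 9(4).  *> four digits: 00 becomes an unambiguous 2000

Two extra columns per record was the entire price of compliance.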

Programmers ignored Bemer's fix. And so did his bosses at IBM, who unwittingly shipped the Y2K bug in their System/360 computers, an industry standard every bit as powerful in the '60s as Windows is today. By the end of the decade, Big Blue had effectively set the two-digit date in stone. Every machine, every manual, every maintenance guy would tell you the year was 69, not 1969.
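To see why a double-zero date makes a machine "go senile," here is a minimal, hypothetical program in free-format COBOL (names invented for illustration; GnuCOBOL's cobc -free should accept it) that computes an age across the rollover:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. Y2K-DEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01  OPEN-YEAR     PIC 99  VALUE 98.  *> account opened in 1998, stored as 98
    01  CURRENT-YEAR  PIC 99  VALUE 00.  *> the year 2000, stored as 00
    01  ACCOUNT-AGE   PIC S99.
    01  AGE-EDITED    PIC +99.           *> sign-edited field so the minus prints
    PROCEDURE DIVISION.
        COMPUTE ACCOUNT-AGE = CURRENT-YEAR - OPEN-YEAR.
        MOVE ACCOUNT-AGE TO AGE-EDITED.
        DISPLAY "ACCOUNT AGE: " AGE-EDITED.  *> prints -98 instead of 2
        STOP RUN.

Nothing crashes outright; the machine simply concludes that an account opened in 1998 is minus 98 years old, and every calculation built on that answer inherits the nonsense.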
