Y2K

The year 2000 problem (also known as Y2K, the Millennium Bug, and a number of other names) refers to the predicted repercussions of a design flaw in much of the world's mainframe computer software: a flaw that became a public concern during the 1990s because it threatened to cause havoc in data centers around the world at the changeover from 1999 to 2000.

Causes

The year 1900.

In an era when computer memory was measured by the byte or "word" and came in the form of things like drumsWikipedia and ferrite coresWikipedia, programmers of bygone days (the '50s through the '70s, mostly) generally represented dates as 6-byte text strings (usually in a format such as yymmdd or ddmmyy, depending on local standards) to save memory.[note 1] The problem lay in the two-digit year: nothing beyond 99 years, 12 months, and 31 days could be represented, so the rollover from '99 to '00 would presumably produce unpredictable results.
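
To make the failure mode concrete, here is a minimal sketch in C of what two-digit year arithmetic does at the century boundary (the field and function names are made up for illustration, and real systems typically stored the digits as text rather than integers):

    #include <stdio.h>

    /* A hypothetical customer record that stores only two digits of the year. */
    struct record {
        int birth_yy;   /* e.g. 65 for 1965 */
    };

    static int age_in(int current_yy, const struct record *r)
    {
        /* Works fine as long as both years sit in the same century... */
        return current_yy - r->birth_yy;
    }

    int main(void)
    {
        struct record customer = { 65 };                    /* born 1965        */
        printf("Age in '99: %d\n", age_in(99, &customer));  /* 34: correct      */
        printf("Age in '00: %d\n", age_in(0,  &customer));  /* -65: nonsense    */
        return 0;
    }

Any sort, comparison, or interest calculation built on the same two-digit assumption fails in the same way once "00" sorts before "99".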

The problem had been anticipated as early as the mid-1980s, but at the time, when increases in computing power seemed inevitable and software maintenance was expected to be routine, software implementers and technicians didn't consider it a significant issue, believing, when they considered it at all, that the programs in question would be replaced long before the millennium rollover. However, as these old programs continued in daily use with few or no updates, a sense began to grow among computer consultants in the mid-'90s that there could be a real problem.

Rational reactions

There was considerable controversy over the magnitude of the potential problem.[1] Within the computer industry, though few people advocated outright ignoring it, most seemed to feel that it would be mostly an annoyance, and that the real concern lay not so much in data centers as in date-sensitive embedded devices such as industrial monitoring and control systems, ATMs, and security systems. Banks were certainly alarmed at the notion, as a failure in their interest calculation systems could cause them to lose money. Within the IT world, however, there were numerous minor issues: many operating systems (Microsoft Windows in particular) used 2-digit years internally, and though the problem itself was conceptually simple, fixing it required many man-hours of picking through code and archived data, an issue more of tedium than ingenuity.

By 1997-98, the COBOLWikipedia programmers of the world, hitherto hiding their identities out of shame, were raking in substantial amounts of money while trying to fix the problem; similar efforts were being made with embedded systems, which by their nature are much harder to repair.

Irrational reactions

In 1999 I was having lunch with a congressman from Oklahoma named Ernest Istook. We had just had a debate in Texas about his proposal to bring government-sponsored prayer back to public schools. After finishing the main course, he said, "Barry, the conservative Republican caucus had a meeting the other day and we've solved the problem of the Y2K bug." I can be a straight man, so I said, "Gee, so what did you guys decide?" He answered, "Well, when the computers can't recognize the year 2000, they flip back to 1900, and we like it better that way." There is a lot of sad truth in that joke.[2]

Inevitably, the media and the public took an interest in the potential harm that the issue could cause to industries, national security, and the world economy. Christian fundamentalists, seeing an opportunity for evangelism, blew the expected results wildly out of proportion and tried to shoehorn Y2K into their end times theology, with Reconstructionist writer Gary North spearheading a movement towards large-scale survivalism. This spawned a cottage industry of low-tech appliances (many bought from companies that had once made most of their business supplying the Amish and groups like them) and survival-themed books and campaigns, the idea being that either the Second Coming would follow or the fundies of the world would be the only people with the resources to rebuild, paving the way for a theocracy. (After Y2K came and went, many of the same people would recycle themselves after 9/11 as "counterterrorism experts". Different bottle, same snake oil.)

Y2K became the subject of much fictional literature, mostly of a pulpy nature, often science fiction or religious but sometimes other genres (including pr0n), as well as one particularly bad NBC TV movie.[3][4] Y2K temporarily became a byword for any major but avoidable technical failure.

What actually happened

The last relic of the extinct so-called "human race", which was destroyed due to a mysterious software bug.

A few mistakes

While the Y2K scare can be credited with prompting many companies to make significant data infrastructure upgrades and generally clean house on a lot of legacy software and data, the work was nowhere near complete as New Year's Eve 1999-2000 approached, with many party plans but quite a few anxieties (and a lot of techies collecting considerable overtime for working New Year's Eve). Midnight arrived at the International Date Line, with New Zealand among the first to be affected by the changeover... and not much happened. There were a few glitches and minor crashes here and there, the scariest involving alarm systems in Japanese nuclear reactors,[5] and a spy satellite went on the fritz.[6] Some clocks and calendar displays, including the clock at the United States Naval Observatory (which keeps the official date and time for the US), also got whacked out, displaying the year as either 1900 or 19100.
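
The "19100" displays came from a well-known pattern, sketched below in C: the standard library reports the year as an offset from 1900 (the tm_year field of struct tm), and careless code glued a literal "19" in front of it instead of adding 1900. This illustrates the general blunder, not the Naval Observatory's actual code, which nobody here has seen:

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        /* tm_year counts years *since 1900*, it is not a two-digit year.
         * In the year 2000 it held 100, so this printed "19100". */
        printf("Broken:  19%d\n", t->tm_year);

        /* The fix is arithmetic, not string concatenation. */
        printf("Correct: %d\n", t->tm_year + 1900);
        return 0;
    }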

The largest mistake affected the results of prenatal screening for Down syndrome in the UK. Investigators found that the Y2K glitch caused computers to miscalculate the ages of 154 pregnant women, who consequently received inaccurate estimates of the chance of their child having Down syndrome. As a result, two women who had wrongly been labeled "low risk" later had abortions, and four babies with Down syndrome were born to mothers who had been given the same "low risk" label.[7]

But not many mistakes

Y2K compliant!

But for the most part the rollover went smoothly, even in areas and industries where there had been little to no preparation. The cascading supply-chain failures and embedded control system disasters that had been predicted never materialized, and some systems (most notably Unix-like and Mac OS systems, but also many embedded systems that didn't really care about the date to begin with) had no problems at all. The main actual victims were ill-maintained vertical market apps for PCs, the sort of obscure but essential business software that has one writer and very few users. A few more issues flared up on March 1, 2000, and at the beginning of 2001, because 2000 was a leap year (century years normally aren't, but years divisible by 400 are) and many programs hadn't accounted for the exception, but these were even smaller problems than what had transpired the previous year.
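
The leap year wrinkle is easy to demonstrate; here is a short C sketch (the function names are ours) contrasting the half-remembered rule with the full Gregorian one:

    #include <stdio.h>
    #include <stdbool.h>

    /* The half-remembered rule: "century years are not leap years". Wrong for 2000. */
    static bool leap_naive(int year)
    {
        if (year % 100 == 0) return false;
        return year % 4 == 0;
    }

    /* The full Gregorian rule: divisible by 4, except centuries,
     * except centuries divisible by 400. */
    static bool leap_correct(int year)
    {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    int main(void)
    {
        printf("2000 leap (naive):   %d\n", leap_naive(2000));   /* 0: wrong      */
        printf("2000 leap (correct): %d\n", leap_correct(2000)); /* 1: it was one */
        return 0;
    }

Programs using the naive rule skipped 29 February 2000 or ran their day-of-year counts off by one, which is presumably why the stragglers surfaced on 1 March 2000 and again around the turn of 2001.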

Y2K, expected by some to be a global disaster as cataclysmic as World War II or the Spanish flu epidemic, instead went into the history books as an over-hyped fizzle. A few expired credit cards and the odd broken satellite do not make a disaster, and Y2K passed rapidly out of pop culture. The only memory of it now can be found in a few software packages, mostly programming libraries written in the 1990s and still actively maintained, that still carry Y2K compliance statements, most likely because no one could be bothered to excise something like that from the documentation. The economy remained more or less healthy until April 2000, when the money train ran out for the dotcom boom... so, in a sense, Y2K (as in, the year 2000 itself) did wreck the computer industry after all.

Incredibly, a few more problems came up at the beginning of 2020. Some of the "fixes" made back in the Y2K era had used lazy date windowing (interpreting two-digit years below some cutoff as 20xx and the rest as 19xx), and a handful of programs whose window ended at 2020 started reading the new year as 1920 and made a bit of a mess of things. However, most of these were simple annoyances and had no serious impact.[8]
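
For the curious, here is roughly what that windowing shortcut looks like, sketched in C; the cutoff of 20 is illustrative, as real systems picked whatever pivot suited them at the time:

    #include <stdio.h>

    /* The "windowing" shortcut: instead of widening the field,
     * pick a pivot and guess the century. */
    static int expand_year(int yy)
    {
        return (yy < 20) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        printf("99 -> %d\n", expand_year(99)); /* 1999, fine              */
        printf("05 -> %d\n", expand_year(5));  /* 2005, fine              */
        printf("20 -> %d\n", expand_year(20)); /* 1920: oops, it's 2020   */
        return 0;
    }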

9/9/99

As a sort of lead-up to Y2K, there were some minor concerns about what would happen on 9 September 1999, mostly centered on shitty programming;[9] these turned out to be negligible. In one instance, sales callers over the previous decades had been instructed to enter "9/9/99" into a "next call" date field for any clients they were not going to contact again (as opposed to having an "inactive client" field)... thus scheduling them to call back former customers on 9 September 1999. Total anarchy!
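
The underlying anti-pattern, a "magic" value stuffed into a real date field, looks something like this entirely hypothetical C sketch (the field names and data are invented):

    #include <stdio.h>
    #include <string.h>

    /* "9/9/99" in a real date field is supposed to mean "never call again"...
     * until 9 September 1999 actually arrives. */
    struct client {
        const char *name;
        const char *next_call;  /* "d/m/yy" text */
    };

    int main(void)
    {
        struct client list[] = {
            { "Acme Ltd", "4/3/99" },
            { "Ex-customer who asked to be left alone", "9/9/99" },
        };
        const char *today = "9/9/99";

        for (size_t i = 0; i < sizeof list / sizeof list[0]; i++)
            if (strcmp(list[i].next_call, today) == 0)
                printf("Call %s today!\n", list[i].name);  /* total anarchy */
        return 0;
    }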

Aftermath

"Nuclear systems will be Y2K compliant"? We fucking hope so!

Was it all for naught? No one really knows how much of the painlessness of the transition can be attributed to preparation and how much simply reflects the problem not being as big as expected. What is clear is that Y2K preparations considerably improved overall disaster readiness, giving many companies and governments substantially greater operational flexibility and recoverability in the wake of events such as 9/11 and the 2003 Northeast US Blackout.

Furthermore, we do not have any information about what would have happened if nothing had been done. We do know that (a) a lot of work was done and that (b) not a lot went wrong. What we cannot conclusively say is that (b) was the result of (a).

What is generally agreed is that the demagogues who tried to turn it into a mass settling of accounts with God and humanity turned out to be wrong, and that everyone got a little hot under the collar about a problem that proved not to be as big a deal as some expected.

Y2K38

But just when you thought it was safe to go back in the water... In 2038, Unix and Unix-like (e.g. Linux) systems that still keep time in a signed 32-bit time_t (a count of seconds since 1 January 1970) will experience a similar date rollover when that counter overflows on 19 January 2038; this has been dubbed, unimaginatively enough, the Year 2038 problemWikipedia. The PC industry has been shifting gradually to 64-bit hardware and software,[note 2] but there will probably still be a lot of legacy stuff sitting around, and plenty of embedded systems (in automobiles, industrial equipment, and the like) still use 32-bit time. This of course means the world will end and we're all DOOMED! But in reality, just as with Y2K, it will entail a lot of boring work upgrading and replacing stuff. At least the work is starting now, a couple of decades ahead of time, instead of in the last few years before the deadline.
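
For the record, here is a short C sketch of where the 2038 cliff sits; it simply asks the standard library what date the largest signed 32-bit second count corresponds to:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        /* On systems where time_t is a signed 32-bit integer, the count of
         * seconds since 1970-01-01 00:00 UTC tops out at INT32_MAX... */
        time_t t = (time_t)INT32_MAX;   /* 2,147,483,647 seconds */
        printf("Last representable moment: %s", asctime(gmtime(&t)));
        /* ...which is 2038-01-19 03:14:07 UTC. One second later a 32-bit
         * counter wraps negative, landing the date back in December 1901. */
        return 0;
    }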

People still preparing for Y2K

Until May 17, 2017, the United States still required its agencies to report on their preparedness for Y2K. President Donald Trump wisely dropped the requirement that day.[10] You have to wonder what exactly they were submitting for 17 years.

A case study of a Y2K book

The Millennium Bug: How to Survive the Coming Chaos by Michael S. HyattWikipedia[11] was a fairly typical layman's guide to the bug. Hyatt's writing leaned heavily on fear-mongering and argument from authority; at one point, he cited fluency in Pascal and "three different dialects of BASIC" as part of his qualifications, very trivial resume items likely to draw gales of laughter from more sober-minded geeks. He offered three possible scenarios: brownout (major inconvenience), blackout (life-threatening economic failure), and meltdown (complete societal collapse), and considered some combination of the latter two most likely. (The scenario of "not much of anything" was never even mentioned.) The survival measures Hyatt advocated were apparently calculated to appeal to right-wing libertarian and survivalist values, including moving to a small town and stocking up on guns and food.[12]

After the Y2K bug proved to be mostly a bust, Hyatt would go on to write self-help business books. One hopes the author of Living Forward: A Proven Plan to Stop Drifting and Get the Life You Want did eventually see his wish come true.

Notes

  1. The "right" way to do this is to represent the date as a numerical offset from a specified zero date (the epoch, in techspeak). Unix's time_t does this, counting seconds from 1970-01-01 00:00 UTC, the Unix Epoch. A signed 32-bit time_t runs out in the year 2038, by which point most Unix-like systems are expected to have long since converted to 64-bit or larger integers (as almost all new computer systems, aside from some embedded products, have done in recent years; anything that wants to address 4 or more gigabytes of memory effectively needs a 64-bit system anyway). We'll see about that, but it still won't be anything like what people feared for Y2K. In any case, date strings are old and busted.
  2. With a 64-bit time_t, we won't have to worry about a seconds-counter rollover until the year 292,277,026,596 (though a 32-bit "years since 1900" field, like C's tm_year, still taps out in the year 2,147,485,547). Take it easy.

References

  1. New York Times archive of the Year 2000 Problem
  2. Special Book Excerpt: A Little List. Ten Reasons Not To Trust The Religious Right
  3. IMDb: Y2K (1999)
  4. The first hint of trouble in the Y2K movie was when an airliner crossed the International Date Line on December 31st, and promptly fell from the sky. In a case of truth being stranger than fiction, years after Y2K this actually almost happened, except with the F-22 fighter jet, because the authors of the avionics software screwed up. Guess $66 billion isn't enough to get good programmers.
  5. Y2K bug fails to bite, BBC News
  6. US satellites safe after Y2K glitch, BBC News
  7. Wainwright, Martin. "NHS faces huge damages bill after millennium bug error." The Guardian, 13 September 2001.
  8. Stokel-Walker, Chris. "A lazy fix 20 years ago means the Y2K bug is taking down computers now." New Scientist, January 2020.
  9. Sept. 9, 1999: 9/9/99 No Big Deal for Computers, Wired
  10. "Trump Orders Government to Stop Work on Y2K Bug, 17 Years Later", Bloomberg.
  11. Hyatt, Michael S. The Millennium Bug: How to Survive the Coming Chaos. Washington DC: Regnery Publishing, 1998.
  12. Grossman, Anna Jane. "Fondly Remembering the Y2K Panic." Motherboard, 31 December 2010 (recovered 15 January 2017).