
Computer bugs in the year 2000


Spencer Bolles

Jan 18, 1985, 11:43:17 PM

I have a friend that raised an interesting question that I immediately
tried to prove wrong. He is a programmer and has this notion that when we
reach the year 2000, computers will not accept the new date. Will the
computers assume that it is 1900, or will it even cause a problem? I
violently opposed this because it seemed so meaningless. Computers have
only come into existence during this century; has software, specifically
accounting software, been prepared for this turnover? If this really
comes to pass and my friend is correct, what will happen? Is it anything
to be concerned about? I haven't given it much thought, but this programmer
has. I thought he was joking but he has even lost sleep over this. When
I say 'friend,' I'm NOT referring to myself, if it seemed that way.

"I've never really written anything like that before"

Spencer L. Bolles

j...@wdl1.uucp

Jan 21, 1985, 11:36:26 AM
I'm referring to an article on Julian dates by Gordon King in
Dr. Dobb's Journal #80 (June 1983), pages 66-70.

Most computer systems use some form of modified Julian date
internally because it is compact to store and simple arithmetic
can be done on it.

A Julian date algorithm for a 16-bit computer is valid for ~179.4
years (65,536 days). The day count is used as an offset from a base year,
usually 1900. Such an algorithm would then stop working in 2079.
The base year has to be chosen fairly carefully because of leap
years.

John R Blaker
UUCP: ...!fortune!wdl1!jrb
ARPA: jrb@FORD-WDL1
and blaker@FORD-WDL2
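
To make the arithmetic above concrete, here is a minimal sketch, assuming
the day count is kept as a 16-bit unsigned offset from a 1900 base (hypothetical
code, not from the Dr. Dobb's article, and assuming unsigned short is 16 bits):

#include <stdio.h>

/* Sketch: a 16-bit day counter measured from a base year.
 * 65,536 distinct values cover about 179.4 years, so with a
 * base of 1900 the counter wraps during 2079. */
int main(void)
{
    unsigned short days = 65535;            /* last representable day */
    double range = 65536.0 / 365.2425;      /* about 179.4 years */

    printf("range: %.1f years from the base year\n", range);
    printf("base 1900 -> runs out during %d\n", 1900 + (int)range);

    days++;                                 /* one day past the limit... */
    printf("...and the counter is back to day %u\n", days);
    return 0;
}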

Gary Marc Levin

Jan 21, 1985, 11:47:25 AM
> I have a friend that raised an interesting question that I immediately
> tried to prove wrong. He is a programmer and has this notion that when we
> reach the year 2000, computers will not accept the new date. Will the
> computers assume that it is 1900, or will it even cause a problem?
> ...
> Spencer L. Bolles

The problem won't be the computers, but the software. Some software is
bound to be wrong, only considering the last two digits of the year.

Actually, the year 2000 will probably make some faulty software work
correctly for 100 years longer than it should. 2000 is the second-level
exception to the leap year rule.

Leap years are those years divisible by 4,
EXCEPT those divisible by 100,
EXCEPT those divisible by 400.

Programs that assume that all multiples of 4 are leap years are wrong,
but the problem won't come up until 2100.
--
Gary Levin / Dept of CS / U of AZ / Tucson, AZ 85721 / (602) 621-4231
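
A minimal sketch of the full rule Gary states, as a C function (not taken
from any particular system's date library):

#include <stdio.h>

/* Gregorian rule: divisible by 4, except centuries,
 * except centuries divisible by 400. */
int is_leap(int year)
{
    return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
}

int main(void)
{
    int years[] = { 1900, 1996, 2000, 2100 };
    int i;

    for (i = 0; i < 4; i++)
        printf("%d is %sa leap year\n", years[i],
               is_leap(years[i]) ? "" : "not ");
    return 0;
}

A program that tests only year % 4 == 0 agrees with this one for every year
from 1901 through 2099, which is exactly why that shortcut can hide for so long.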

Liudvikas Bukys

Jan 21, 1985, 7:09:44 PM

Spencer L. Bolles:
"... He is a programmer and has this notion that when we reach the

year 2000, computers will not accept the new date. Will the computers
assume that it is 1900, or will it even cause a problem? ..."

Hey! No big deal! So what if every piece of code that prints dates with
ctime(3) starts believing every year in the 21st century is Year 2, thanks to
a little parenthesization error?

cp[2] = '0' + t->tm_year >= 200;

Or, as Joe Bob would say,

"It could happen here."

P.S. I will leave unnamed the
particular Unix version I pulled this
source line from. I don't know which
of the popular factions introduced it
first or fixed it first. I don't want
to know, and please don't tell me.
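
A stand-alone illustration of the precedence trap in the quoted line (a sketch,
not the actual ctime source; tm_year counts years since 1900):

#include <stdio.h>

int main(void)
{
    int tm_year = 101;                    /* i.e., the year 2001 */

    /* + binds tighter than >=, so this is ('0' + 101) >= 200,
     * which yields 0 or 1 -- a control character, not a digit. */
    char buggy = '0' + tm_year >= 200;

    /* the intended expression */
    char fixed = '0' + (tm_year >= 200);

    printf("buggy byte: %d   fixed byte: '%c'\n", buggy, fixed);
    return 0;
}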

D Gary Grady

Jan 22, 1985, 12:07:52 PM
> The problem won't be the computers, but the software. Some software is
> bound to be wrong, only considering the last two digits of the year.

And thereby hangs a tale: In 1978, when I was working in banking, I
ran across a curious date storage format. It seems that transaction
dates were coded with the last digit of the year in one nibble, the
month in hex in the next, and the date (in packed decimal) in the next
two. I asked one of the more senior systems analysts about this and
she informed me that when the record was originally designed, only the
month and day (in packed decimal) had been included. This caused
sorting problems on statements printed in January, because checks
written in the December of the previous year would sort after checks
written in January of the current. So the format had been modified to
the one I just described.

"Good grief!" said I. "What happens in January of 1980?" She turned
pale and admitted she had considered that before but managed to put it
out of her mind. "So why not go ahead and fix it now?" I asked.

She pointed out that fixing it would require expanding the demand
deposit master record format, a mammoth undertaking. About a billion
COBOL programs would have to be recompiled. At this shop we were still
on cards and a rush compile took about a week. "You want to do that?"
she inquired. This time I turned pale. We considered our options,
knowing that one or the other of us would be called upon to fix the
problem. And you know what we did?

First, I modified the daily demand deposit program with code that
checked for the date and, about mid-1979, started printing warnings on the
console of what would happen come new year. Then the systems analyst
and I got new jobs. This is known as stepwise interactive development.

--
D Gary Grady
Duke U Comp Center, Durham, NC 27706
(919) 684-3695
USENET: {seismo,decvax,ihnp4,akgua,etc.}!mcnc!ecsvax!dgary

Norman Diamond

Jan 22, 1985, 4:31:10 PM
> > I have a friend that raised an interesting question that I immediately
> > tried to prove wrong. He is a programmer and has this notion that when we
> > reach the year 2000, computers will not accept the new date. Will the
> > computers assume that it is 1900, or will it even cause a problem?
> > ...
> > Spencer L. Bolles
>
> The problem won't be the computers, but the software. Some software is
> bound to be wrong, only considering the last two digits of the year.
> but the problem won't come up until 2100.
> ...

> Gary Levin / Dept of CS / U of AZ / Tucson, AZ 85721 / (602) 621-4231

Leap years are not the only problem, and some software already is wrong.
There was some 105-year-old lady who hadn't registered for school, and
the truant officers came after her. I think this happened in the
U.S. midwest, around 8 years ago.

-- Norman Diamond

UUCP: {decvax|utzoo|ihnp4|allegra|clyde}!watmath!watdaisy!ndiamond
CSNET: ndiamond%watd...@waterloo.csnet
ARPA: ndiamond%watdaisy%waterlo...@csnet-relay.arpa

"Opinions are those of the keyboard, and do not reflect on me or higher-ups."

Gadfly

Jan 22, 1985, 10:49:49 PM
--

>> I have a friend that raised an interesting question that I
>> immediately tried to prove wrong. He is a programmer and has this
>> notion that when we reach the year 2000, computers will not accept
>> the new date. Will the computers assume that it is 1900, or will
>> it even cause a problem?...

>> Spencer L. Bolles

Your friend is probably alluding to the leap-century correction
in the Gregorian Calendar. Most date programs do not make any
subtler corrections than leap-year (and some don't even do that).
There is no Feb 29 in a century year unless that year is divisible
by 400. Thus, 1900 was not a leap year (look it up), but 2000
will be. So, all un-leap-century-corrected programs will be
safe until 2100, and most folks will slide blissfully into the
next millennium, never even stopping to think about their calendar's
fine tuning.
--
*** ***
JE MAINTIENDRAI ***** *****
****** ****** 22 Jan 85 [3 Pluviose An CXCIII]
ken perlow ***** *****
(312)979-7188 ** ** ** **
..ihnp4!iwsl8!ken *** ***

dar...@ism780.uucp

Jan 23, 1985, 12:32:34 AM
> He is a programmer and has this notion that when we
> reach the year 2000, computers will not accept the new date.

This brings to mind the famous PDP-8 date problem. Under OS-8, the year was
encoded as 3 bits (!!), which promptly ran out in 75 or 76, at which point
the powers that were managed to scrape another bit. Anybody out there still
running OS-8? What did you do when the year turned to sh*t again? (Buy an
8080 based machine for improved performance and memory capability?)

--Darryl Richman, INTERACTIVE Systems Inc.
...!cca!ima!ism780!darryl
The views expressed above are my opinions only.

s...@cepu.uucp

Jan 23, 1985, 10:20:57 AM
In article <8...@reed.UUCP> bol...@reed.UUCP (Spencer Bolles) writes:
>
> I have a friend that raised an interesting question that I immediately
>tried to prove wrong. He is a programmer and has this notion that when we
>reach the year 2000, computers will not accept the new date. Will the
>computers assume that it is 1900, [...]s even lost sleep over this. When

>I say 'friend,' I'm NOT referring to myself, if it seemed that way.

Well, it depends on several things: (1) the 'base' date, (2) how many
bits are used to encode the offset, and (3) the resolution used.

For example, OS/8 (an operating system for the PDP-8 and -12) used 3
bits for the year and a base date of Jan 1 1970. On Jan 1 1978 it
broke. Unix (v7 anyway) uses 32 bits to record the time in seconds
since 0000Z01JAN70 (midnight GMT, Jan 01 1970); this will break sometime
in 2038 (Jan 18 about 3 AM GMT). Other operating systems use different
epochs and different resolutions and will break at different times.
--
Stephen C. Woods (VA Wadsworth Med Ctr./UCLA Dept. of Neurology)
uucp: { {ihnp4, uiucdcs}!bradley, hao, trwrb}!cepu!scw
ARPA: cepu!scw@ucla-cs location: N 34 3' 9.1" W 118 27' 4.3"
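
The 32-bit rollover mentioned above can be checked directly; a minimal sketch,
assuming the C library's gmtime/asctime and a time_t wide enough to hold 2^31 - 1:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* largest value of a signed 32-bit seconds-since-1970 counter */
    time_t limit = 2147483647;

    /* prints Tue Jan 19 03:14:07 2038 (GMT) */
    printf("%s", asctime(gmtime(&limit)));
    return 0;
}

One second later the counter wraps to a large negative number, and the
rendered date jumps back to December 1901.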

Mike Schloss

Jan 23, 1985, 6:41:47 PM

I have heard the same rumor from some reliable sources. When I was working
summers for Prudential a while back I was told the story about this and the
people were serious. One guy, a serious system programmer, not a hack, told
me he was setting his retirement date according to the date this problem will
manifest itself. The story goes as follows:

In IBM's OS/VSI, OS/VSII, and MVS all files have a time stamp
associated with them, usually the creation date. If upon creation
the file is deemed to be temporary, then the time stamp becomes the
expiration date and defaults to sometime in the future. The
difference between a creation date and an expiration date is that the
expiration date has the high order bit set. [See the problem coming]
The problem is that sometime in 2000 (I don't think it's midnight
Jan 1) the most significant bit in the timestamp will change
and the system will then think that all files on all disk drives
are temporary and should have been deleted a long time ago.
Net result ... All files get deleted.

d...@ucla-cs.uucp

Jan 23, 1985, 7:41:43 PM
From what I've read, many programs broke at the start of 1970 because they
stored the year as a single digit; fewer, but still a good number, broke in
1980. I think the real trouble will come on January 3, 2000, not January 1,
since the 3rd is the first business day. I think the problems will come
in subtle ways -- most companies will catch the obvious implications of a
two-digit year cycling around, but buried away in some obscure code...

-- David Smallberg, d...@ucla-cs.ARPA, {ihnp4,ucbvax}!ucla-cs!das

Sam Kendall

Jan 23, 1985, 8:01:51 PM
> ... [T]his notion that when we
> reach the year 2000, computers will not accept the new date.

Yeah, this thought occurred to me when I took COBOL years ago and found
that data was encoded in decimal, and years often encoded in 2 digits.
I don't know about the IBM OS creation date/temporary file problem, but
other than that, the COBOL two-decimal-digit-year problem is the major
one. This is a pretty common thing to do in COBOL programs; COBOL is
the most-used computer language (I think, and in any case it certainly
is in the business/bureaucratic world); there are plenty of programs
that have been running for years, and for which the sources have been
lost.

I am posting this because I think a lot of people have never seen a
COBOL program, and so don't realize why the year 2000 will be trouble.

I think, though, that IBM will get moving on this problem around the
year 1995, if only so that the society on which they depend for profits
will continue to exist.

Sam Kendall {allegra,ihnp4,ima,amd}!wjh12!kendall
Delft Consulting Corp. decvax!genrad!wjh12!kendall

Eric Stern

Jan 24, 1985, 1:06:02 AM

I used to work for a company that packed dates into 16-bit words
in such a way that, this being the last part of the century, all dates
were negative numbers. However, certain files could contain either
of two types of records, the distinguishing characteristic being
that one type of record contained a date at a particular offset.
Of course, the check for this kind of record was whether the number
at that offset was negative or not, so when the century rolls over
this test will fail. I pointed this feature out to several people,
who rightly were not concerned, as by the time this became a problem,
their software would have migrated to a different system and would
probably be largely rewritten.

However, I have heard that CDC operating systems had a problem
at a certain date in the past, where the computer would refuse
to boot up when this date was reached. Calls came in to CDC
from all over the world as midnight advanced westward.

Eric G. Stern

Richard H. E. Smith II

Jan 24, 1985, 5:54:01 AM
In article <68...@watdaisy.UUCP> ndia...@watdaisy.UUCP (Norman Diamond) writes:
>>> I have a friend that raised an interesting question that I immediately
>>> tried to prove wrong. He is a programmer and has this notion that when we
>>> reach the year 2000, computers will not accept the new date. Will the
>>> computers assume that it is 1900, or will it even cause a problem?
>> The problem won't be the computers, but the software. Some software is
>> bound to be wrong, only considering the last two digits of the year.
>> but the problem won't come up until 2100.
>Leap years are not the only problem, and some software already is wrong.
>There was some 105-year-old lady who hadn't registered for school, and
>the truant officers came after her.... -- Norman Diamond

Some software blows up on dates at other times. I'm aware of some old
DEC software (don't worry... you're NOT using it... it's single user!)
that keeps the date year as a 5 bit offset from 1972. Let's see...
1972+31=2003, so it blows up in 2004. Probably, tho, the display-a-year
routine isn't written to handle beyond 31-dec-99, since no one expects
that RT11 (oops, now I said it) will still be used then. I hope.


--
----------
Dick Smith ..ihnp4!wlcrjs!rhesmith

la...@extel.uucp

Jan 24, 1985, 1:05:00 PM

Another problem is that we have gotten into the habit of only using the
last 2 digits of the year (look at your checkbook). Even worse is that
some business software only allows a 2-character-wide field for the
date. Perhaps the designers did not expect their program to be in use
in the year 2000, but I would not be surprised to see a considerable
amount of 370 code running in the year 2000.

Just think that in a few years you will be able to refer to the
year 2002 as aught-two! By the way, Webster's Thesaurus also lists
ought as an alternate spelling of aught.

Larry Pajakowski
ihnp4!tellab1!extel!larry

John Bruner

Jan 24, 1985, 3:13:51 PM
Back in the V6 days at Purdue/EE we purchased the CULC adaptation
of DEC's Fortran-IV-Plus compiler for our PDP-11's. Part of the
package was a UNIX version of MACRO-11. I noted with amusement
that the output conversion routine for the date and time (used to
produce listings) for MACRO-11 was never intended to handle a
year greater than 1979 -- when I ran it in 1980 it printed the
date as "13-SEP-7:".

Of course, by that time we only had one PDP-11/45 still running V6.
--
John Bruner (S-1 Project, Lawrence Livermore National Laboratory)
MILNET: j...@mordor.ARPA [jdb@s1-c] (415) 422-0758
UUCP: ...!ucbvax!dual!mordor!jdb ...!decvax!decwrl!mordor!jdb

Doug Pardee

Jan 25, 1985, 12:01:25 PM
> >> I have a friend that raised an interesting question that I
> >> immediately tried to prove wrong. He is a programmer and has this
> >> notion that when we reach the year 2000, computers will not accept
> >> the new date. Will the computers assume that it is 1900, or will
> >> it even cause a problem?...
>
> Your friend is probably aluding to the leap-century correction
> in the Gregorian Calendar.

Oh, dear oh dear. Folks, there is an outside world out there and
that world uses computers to do REAL STUFF. One of the "real stuff"
things that computers do out there is to store data in files, both
on tape and on disk. Things like the balance in your checking account
(or the amount that it's overdrawn :-)

There is SO MUCH data in those files, and tapes and disks cost SO MUCH
to buy and store, that those files have "expiration dates", at which
time a program (run daily, as a rule) will see that they have expired
and will remove all traces of them from the various directories, and
will return the disk space or reel of tape to the "available" pool.

I imagine you are aware that IBM's System/360/370/30xx machines
handle nearly all such transactions (to the unending dismay of
Honeywell, Burroughs, Univac, etc.) In the IBM world, the date
of December 31, 1999 is the highest (latest) date that can be
specified. So if you have stuff that you want to keep forever,
you put a date of 99365 on it. I leave it to your imagination
what will happen on 12/31/99 when all of those computers find
all of those disk files and tapes are to be scratched.

A variation results from the natural cycle of many such files. For
example, a monthly backup tape in a 4-month cycle will be kept for
four months, no? Although IBM doesn't supply any routine to
compute such a date, virtually every site has written or bought one.
So on, say, 10/01/99 a 4-month file will be set to expire on
02/01/00. Guess what happens the next morning? Bye-bye file!

There are a number of other effects which will result, all from
the fact that the computer will NOT be able to compare two dates
to find out which one is later. Unless the programmer anticipated
the problem, the formula for figuring out how many days elapsed
between two dates won't work. How do you figure, e.g., interest
earned, if you don't know the time period involved?

Dates and time ARE of the utmost importance to the business world!

There are minor effects, too. Like when your company's ten-year
forecast says that you'll be making a good profit in 1903. Looks
really professional on the ol' annual report.
--
Doug Pardee -- Terak Corp. -- !{hao,ihnp4,decvax}!noao!terak!doug
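
A small sketch of the comparison failure described above, assuming a record
that keeps only a two-digit year and a day-of-year (the field layout here is
invented for illustration, not IBM's actual format):

#include <stdio.h>

struct stamp {
    int yy;     /* last two digits of the year */
    int ddd;    /* day of the year, 1..366 */
};

/* "has not expired yet" test, the way a two-digit-year program does it */
int not_expired(struct stamp today, struct stamp expires)
{
    if (expires.yy != today.yy)
        return expires.yy > today.yy;
    return expires.ddd >= today.ddd;
}

int main(void)
{
    struct stamp today   = { 99, 275 };     /* 10/02/99 */
    struct stamp expires = {  0,  32 };     /* 02/01/00, four months away */

    /* prints "scratch it" even though the file has four months to live */
    printf("%s\n", not_expired(today, expires) ? "keep it" : "scratch it");
    return 0;
}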

Tim Smith

Jan 25, 1985, 4:26:42 PM

If you are really worried about a time warp breaking programs in subtle ways,
then set your clock ahead now, and find the bugs. That will give you several
years to fix them. If you are binary only, you might NEED several years
to get your vendor to fix them! :-)
--
Duty Now for the Future
Tim Smith
ihnp4!wlbr!callan!tim or ihnp4!cithep!tim

Ron Natalie <ron>

Jan 27, 1985, 6:19:06 PM
>
> Spencer L. Bolles:
> "... He is a programmer and has this notion that when we reach the
> year 2000, computers will not accept the new date. Will the computers
> assume that it is 1900, or will it even cause a problem? ..."
>
> Hey! No big deal! So what if every piece of code that prints dates with
> ctime(3) starts believing every year in the 21st century is Year 2, thanks to
> a little parenthesization error?
>
> cp[2] = '0' + t->tm_year >= 200;
>
Of course, UNIX time (seconds past midnight GMT 1 Jan 1970 in 32 bits)
falls apart around 2042.

Ron Natalie <ron>

Jan 27, 1985, 6:24:11 PM
> For example, OS/8 (an operating system for the PDP-8 and -12) used 3
> bits for the year and a base date of Jan 1 1970. On Jan 1 1978 it
> broke. Unix (v7 anyway) uses 32 bits to record the time in seconds
> since 0000Z01JAN70 (midnight GMT, Jan 01 1970); this will break sometime
> in 2038 (Jan 18 about 3 AM GMT). Other operating systems use different
> epochs and different resolutions and will break at different times.
> --
Uh, huh. Anyone remember the form letter programs from version 6?
They stopped working around 1979, never to move again. V6 nroff also
used to have a bug that caused certain strange effects to occasionally
appear and disappear every nine hours or so.

-Ron

Tim Smith

Jan 28, 1985, 4:01:58 PM
> In IBM's OS/VSI, OS/VSII, and MVS all files have a time stamp
> associated with them, usually the creation date. If upon creation
> the file is deemed to be temporary, then the time stamp becomes the
> expiration date and defaults to sometime in the future. The
> difference between a creation date and an expiration date is that the
> expiration date has the high order bit set. [See the problem coming]
> The problem is that sometime in 2000 (I don't think it's midnight
> Jan 1) the most significant bit in the timestamp will change
> and the system will then think that all files on all disk drives
> are temporary and should have been deleted a long time ago.
> Net result ... All files get deleted.

Look, if you have a bit that marks a file as temporary or permanent, and
that bit is set at file creation time, then there is no problem with files
created BEFORE the high order bit of the date is set. The system will NOT
decide that they are all temporary and delete them! The only problems
will be with files created after the high order bit of the date is set.

[ Unless, of course, they use AT&T Common Object File format, which, according
to my copy of the manual, keeps the timestamp as the number of seconds
relative to the CURRENT time! :-) ]

Henry Spencer

Jan 28, 1985, 7:21:56 PM
Forecasting programs are already encountering this sort of problem.
1975 was a bad year for 25-year forecasts...
--
Henry Spencer @ U of Toronto Zoology
{allegra,ihnp4,linus,decvax}!utzoo!henry

Robert Stroud

Jan 29, 1985, 2:57:39 PM
I don't know about 2000 (I can guess :-) but I do have an anecdote
that relates to a summer job I had back in 1979. We got a 'phone call
from the suppliers of some application software along the following lines...

Them: Are you planning to use the machine on August 17th 1979?

Us: Probably not - it's a Saturday.

Them: Well if you do, whatever you do, when you boot the machine, don't
tell it it's August 17th! Lie and pretend it's August 18th.

It turned out that the internal coding of "August 17th 1979" matched
a character sequence used by the application to denote EOF!

That's true - honest! Names of machines, operating systems and software
suppliers are suppressed to protect the guilty. I wouldn't swear to the
exact date, but it was around that time.

Robert Stroud,
Computing Laboratory,
University of Newcastle upon Tyne.

ARPA robert%cheviot%newcastl...@mit-multics.arpa
UUCP ...!ukc!cheviot!robert

Eugene Miya

Jan 29, 1985, 3:20:38 PM

Recently, I was one of the operations officers for the 1984 ACM National
Meeting. The theme of that conference was "The Fifth Generation." While
putting the conference together, one of the other people (Bob Van Tuyl
of GTE) joked:

If there is any one thing which is going to hold back the
'Fifth Generation,' it's going to be the 'Second Generation.'

You have just given evidence to support that conclusion. ;-)

--eugene miya
NASA Ames Research Center
{hplabs,ihnp4,dual,hao,vortex}!ames!aurora!eugene
em...@ames-vmsb.ARPA

Kim Christian Madsen.

Jan 29, 1985, 3:44:37 PM
Well, you can fix the bug(s) on a specific machine, but the main purpose must
be to create a standard fix so that no machine will be affected in an unpleasant
way when 2000 comes (or even before)!!!
--
Kim Chr. Madsen. Institute of Datalogy,
University of Copenhagen
{decvax,philabs,seismo}!mcvax!diku!kimcm

Alexis Dimitriadis

Jan 29, 1985, 10:25:59 PM

With most library functions, you do not need to reset the machine clock--
just call them with the right number of seconds, and see what they do.
(You might even catch some of the overflow problems that have been discussed
here).
I've attached a simple program that does that: just run it and give it
the number of years you want to go forward (or backward, if < 0),
or substitute your pet functions for time() and ctime().
E.g., I found that we DO have the bug in ctime that prints every year
after 2000 as year 2 (and without a trailing newline...).

alexis @ reed

---------------------------
#include <stdio.h>
#include <sys/time.h>

#define YEAR 31536000 /* only roughly, but who cares */

main()
{
    long time(), clock;
    float increment;
    char *ctime();

    /* print today's date, then the date shifted by each
       number of years read from standard input */
    time(&clock);
    fputs(ctime(&clock), stdout);

    while (scanf("%f", &increment) > 0) {
        clock += (long) (increment * YEAR);
        fputs(ctime(&clock), stdout);
    }
}

br...@ism780.uucp

Jan 30, 1985, 1:22:01 AM
> From what I've read, many programs broke at the start of 1970
> because they stored the year as a single digit; fewer, but still a
> good number, broke in 1980.
Not as well known is the fact that many COBOL banking and/or accounting
programs that broke in 1970 were fixed by allowing the year field to be
interpreted as a binary field rather than a decimal field. This was
intended as a temporary measure until the database records could be
reorganized with a wider date field. (When you've got several million
records and several hundred programs, adding just one byte to each record
takes a bit of doing and most records have more than one date field.) Many
of those same systems broke again at the beginning of 1976. I recall that
when I started working for Western Bancorp in Sept. 1976, some of my
co-workers were, nine months later, still regaling each other with tales of
which banks got caught by that one. I seriously plan on closing my
checking account several months before the end of the century and hiding
all my cash under my mattress until all the smoke clears.

Bruce Adler {sdcrdcf,uscvax,ucla-vax,vortex}!ism780!bruce
Interactive Systems Corp. decvax!yale-co!ima!bruce

Ed Nather

Jan 31, 1985, 11:34:17 AM
For those of you fixing things in your software:

The year 2000 *is* a leap year, despite what many algorithms tell you.
The year 2400 is *not* a leap year.

With minimal effort, you can make things work until 2399. You may be
subject to complaints after that.

--
Ed Nather
Astronomy Dept, U of Texas @ Austin
{allegra,ihnp4}!{noao,ut-sally}!utastro!nather

Ron Natalie <ron>

Jan 31, 1985, 9:03:36 PM
> For those of you fixing things in your software:
>
> The year 2000 *is* a leap year, despite what many algorithms tell you.
> The year 2400 is *not* a leap year.
>
> With minimal effort, you can make things work until 2399. You may be
> subject to complaints after that.
>
Now you've really got me confused. Why is 2400 not a leap year?

Ron Natalie <ron>

unread,
Jan 31, 1985, 9:05:54 PM1/31/85
to
What is really amusing about all of this is that if people didn't insist on
putting specific checks in their code for the last year of each century not
being a leap year, everything would have been OK. It is doubtful that most
simple utilities care about dates before 1901 or after 2099.

-Ron

Landon C. Noll

Jan 31, 1985, 9:48:14 PM
In article <7...@ames.UUCP> eug...@ames.UUCP (Eugene Miya) writes:
> If there is any one thing which is going to hold back the
> 'Fifth Generation,' it's going to be the 'Second Generation.'
>

Oh, you mean MBI and Big Green and Cobol? Or do you mean Big Mama and
her Fifth Sister? :-)

chongo <is that why they call it release 2?> /\VV/\

Landon C. Noll

Jan 31, 1985, 10:00:25 PM
In article <3...@terak.UUCP> do...@terak.UUCP (Doug Pardee) writes:
> Unless the programmer anticipated
>the problem, the formula for figuring out how many days elapsed
>between two dates won't work. How do you figure, e.g., interest
>earned, if you don't know the time period involved?

Are you suggesting that people pull their money out of the banks on
Dec 31, 1999? If so, then maybe you should suggest that people avoid
the rush and grab it Dec 30, or maybe Dec 29, ....

I think a date overflow is far better than an input transaction overflow... :-)

Soon I will test another area of the 2000 date problems: magazine subscription
dates. Well, due to a strange set of events, I have a subscription to
this mag. which ends in 1999 (for which I have paid nothing). Well, the
other day they sent me a renewal notice, so I'm going to actually
pay for another year and ...

Berry Kercheval

Feb 1, 1985, 12:33:45 PM
In article <3...@terak.UUCP> do...@terak.UUCP (Doug Pardee) writes:
>In the IBM world, the date
>of December 31, 1999 is the highest (latest) date that can be
>specified. So if you have stuff that you want to keep forever,
>you put a date of 99365 on it. I leave it to your imagination
>what will happen on 12/31/99 when all of those computers find
>all of those disk files and tapes are to be scratched.


I once heard an apocryphal story to the effect that a Systems Programmer
at Large Unnamed Corp. was debugging something late one night and for
some reason it became necessary to set the system date at 99365.

Guess what happened at midnight?

Guess who is now a plumber?

--
Berry Kercheval Zehntel Inc. (ihnp4!zehntel!zinfandel!berry)
(415)932-6900

james armstrong

Feb 1, 1985, 1:18:58 PM
>The year 2000 *is* a leap year, despite what many algorithms tell you.
>The year 2400 is *not* a leap year.

Actually, 2400 is a leap year. 2100, 2200, and 2300 are not.

Dave Martindale

unread,
Feb 2, 1985, 1:05:39 AM2/2/85
to
In article <9...@utastro.UUCP> nat...@utastro.UUCP (Ed Nather) writes:
>The year 2000 *is* a leap year, despite what many algorithms tell you.
>The year 2400 is *not* a leap year.
>
>With minimal effort, you can make things work until 2399. You may be
>subject to complaints after that.

Are you absolutely sure of this? (your trailer DOES say you come
from an astronomy department....)

My understanding was that years divisible by 4 were leap years, except
that years divisible by 100 were not, except that years divisible by
400 were - giving 97 leap days every 400 years.

According to that pattern, 2000 IS a leap year, and the naive year-mod-4
algorithms will work properly until 2099, not 2399.

Ron Bemis

Feb 3, 1985, 6:02:35 AM
For those of you fixing things in your software:

> The year 2000 *is* a leap year, despite what many algorithms tell you.

Agreed (by everybody, I think).


> The year 2400 is *not* a leap year.

How do you figure? Shouldn't that say 2100?
Leap if divisible by 4
Unless divisible by 100
Unless divisible by 400
--
_____
Ron Bemis / o o \ Support Bacteria -
Tektronix | \___/ | It's the only culture
Redmond, OR \_____/ Some people have!

Theo van der Storm

Feb 3, 1985, 11:08:10 AM

(msd = mean solar day)
1 year = 365.2422 msd = 365 + 1/4 - 1/100 + 1/400 + error
That's why we have:
leapyear 1 out of 4
non leap year 1 out of 100
leapyear 1 out of 400 (So 2400 is a leap year.)
Read any basic astronomy book.
--

Theo van der Storm, 52 20'N / 4 52'E, {seismo|decvax|philabs}!mcvax!vu44!tstorm

Norman Diamond

Feb 4, 1985, 2:02:17 PM
> The year 2000 *is* a leap year, despite what many algorithms tell you.
> The year 2400 is *not* a leap year.

So, the guy made a mistake. Why aren't astronomers permitted to make as
many mistakes as programmers? I even make mistakes occasionally (though
not that one).

-- Norman Diamond

UUCP: {decvax|utzoo|ihnp4|allegra|clyde}!watmath!watdaisy!ndiamond
CSNET: ndiamond%watd...@waterloo.csnet
ARPA: ndiamond%watdaisy%waterlo...@csnet-relay.arpa

"Opinions are those of the keyboard, and do not reflect on me or higher-ups."

Bill Vaughn

Feb 15, 1985, 10:44:40 AM
> In article <79...@brl-tgr.ARPA> r...@brl-tgr.ARPA (Ron Natalie <ron>) writes:
> >> For those of you fixing things in your software:
> >>
> >> The year 2000 *is* a leap year, despite what many algorithms tell you.
> >> The year 2400 is *not* a leap year.
> >>
> >> With minimal effort, you can make things work until 2399. You may be
> >> subject to complaints after that.
> >>
> >Now you've really got me confused. Why is 2400 not a leap year?
>
> (msd = mean solar day)
> 1 year = 365.2422 msd = 365 + 1/4 - 1/100 + 1/400 + error
> That's why we have:
> leapyear 1 out of 4
> non leap year 1 out of 100
> leapyear 1 out of 400 (So 2400 is a leap year.)
> Read any basic astronomy book.
> --

Hmmm. Let's take this two more steps. Since our error over the long run
will be +0.0003 msd per year, we can reduce this by taking a leap year back
every 3200 years, making the error now -0.0000125 msd per year. But this can
be corrected by putting back one of those leap years every 80,000 years,
and we're right on the money, assuming 365.2422 msd/year is exact and
that it's constant over this period of time. I'm sure neither of these
assumptions is correct ;-).

So, for the time being, I claim that 3200 should *not* be a leap year (nor
should any year divisible by 3200 be, except if it's also divisible by 80,000).

OK you hackers, I want those algorithms updated tomorrow. :-)

Bill Vaughn
UNIV. OF ROCHESTER
Center for Visual Science
{allegra,seismo,decvax}!rochester!ur-cvsvax!bill

Guru: What's the matter?
Novice hacker: When I do this, it hurts. (shows guru his core dump)
Guru: DON'T DO THAT!!

Darrel VanBuer

Feb 20, 1985, 6:56:25 PM
A brief history of Western calendars: as man became more civilized (and
planned more details of life, and more carefully observed details of the
world--a hunter-gatherer really doesn't need a calendar, you just follow the
seasons in what you seek out), he came to need to know as early as possible
when key annual events occur (e.g. spring planting). One of the earliest
techniques was instrumented observation (like Stonehenge), but this requires
some considerable investment in equipment and "operators". The early Romans
used a 365-day year with adjustments by edict when observation showed too much
error. By the time of Julius Caesar, the measurements and understanding of
the problem led to the imposition of the Julian calendar throughout the
Roman empire in which every fourth year is a leap year (leaving an error of
about 1 day in 125 years). By requiring a calendar which was the best
science had to offer, you contribute to a smooth-running empire with
everybody celebrating important religious and social days at the same time
without worry that a provincial astronomer will get out of phase due to
different interpretations of the observations (this was a real problem with
Jewish holy days a few thousand years ago--first they had separate
determination resulting in occasional missed synchronization, then signal
fires from a central authority in Jerusalem which fell victim to malicious
interference, and finally a computational system).
Because of the residual error, by the late 16th century, Pope Gregory
mandated the Gregorian calendar which combined a one-time correction of 10
days (in 1582) and the current system which omits leap years in century
years not a multiple of 400. Because of religious factionalism, many areas
did not change until much later (e.g. England delayed until 1752 and the
Eastern Catholic churches until the 20th century). There is still a residual
error of about one day in 3200 years for which the necessary political
solution has yet to be taken up (but based on the threshold for past action,
we have about 30,000 years before it will happen :-).

The official length of the year 1900 was 31,556,925.9747 seconds. The error
in the actual length and change due to slowing of the Earth's rotation is
such that a leap second has to be inserted every year or two by standard
timekeepers such as NBS. I don't have a good handle on the error due to
slowing, but I doubt we'll be up to a full day for 20,000 years yet (it's a
difficult measurement because it's near the limits of current measurement
technology) [to clarify--that's a cumulative full day off, not a year a whole day
shorter]. It may also never come due to the leap-second scheme in use by
standards organizations (at least till everybody has an atomic clock and
notice what the standards people are doing :-).

Note that 2000 IS IS IS a leap year--ask Pope Gregory. (there has been some
misinformation appearing on this point)

--
Darrel J. Van Buer, PhD
System Development Corp.
2500 Colorado Ave
Santa Monica, CA 90406
(213)820-4111 x5449
...{allegra,burdvax,cbosgd,hplabs,ihnp4,orstcs,sdcsvax,ucla-cs,akgua}
!sdcrdcf!darrelj
VAN...@USC-ECL.ARPA

Darrel VanBuer

Feb 20, 1985, 7:42:08 PM
A further clarification: The leap seconds deal with the difference between
the mean solar day and 86400 standard seconds, and have no (direct) effect
on leap days except as errors in the day also change the number of days in a
year (which is further complicated by gradual changes in the Earth's orbit).

(By the way, the error in the Gregorian calendar is just over 26 seconds per
year)
