Redefining UTC (no more leap seconds)


Markus Kuhn

Dec 3, 1999
This might be of great interest to people concerned about leap seconds
and precision timing in computers:

In message <1999120214...@hpopa.obspm.fr>, ie...@hpopa.obspm.fr writes:
>
>***************************************************************************
> Gazette IERS Gazette IERS Gazette IERS Gazette IERS Gazette
> ________________________________________
> No 48, 02 December 1999 /
>_________________________________/ Contact: ie...@obspm.fr
> ftp: hpiers.obspm.fr (145.238.100.28)
> WWW: http://hpiers.obspm.fr
>***************************************************************************
>
> Subject: UTC Questionnaire
> Author : Demetrios Matsakis
>
>--------------------------------------------------------------------------
>
>
>Dear Colleague,
>
>It has been proposed to change the definition of Coordinated Universal
>Time (UTC) regarding the insertion of leap-seconds, possibly even
>eliminating their use. Leap seconds are introduced so as to keep UTC
>synchronized (within 0.9 s) to the time scale determined from the
>Earth's rotation.
>
>Should no new leap seconds be inserted, solar time will diverge from
>atomic time at the rate of about 2 seconds every 3 years, and after
>about a century |UT1-UTC| would exceed 1 minute. Although no
>fundamental problems are anticipated, it is very likely that Y2K-like
>problems may result in software that assumes UT1=UTC, or
>|UT1-UTC|< some value, or whose input/output records use a field size
>that can only accommodate |UT1-UTC| values up to one second.
>
>To gather information, an URSI Commission J Working Group was formed,
>consisting of Don Backer, Wim. N. Brouw, Barry Clark, Irwin Shapiro,
>Ir. E. Van Lil, and myself.
>
>We would like to ask you to consult with the members of your institute
>who currently deal with UT1-UTC, and give us a considered response to
>the following two questions:
>
>A. If the appropriate international bodies decide to eliminate the
>insertion of new leap seconds, would you foresee any practical
>problems for your institution/instrument/observations?
>
> yes ___
>
> * no ____
>
> * possibly ____
>
> (* please explain any known or possible problems)
>
>B. Would you be in favor of such a proposal?
>
> yes ____
>
> no ____
>
> indifferent ___
>
> have better idea ___
>
> (feel free to comment)
>
>C. Is there anyone else you would recommend we contact?
> (feel free to forward this email directly)
>
>I would appreciate your assistance, and a response by January 15 to
>d...@orion.usno.navy.mil.
>
>I am attaching a list of institutions and persons contacted, except
>for 931 institutions whose emails were obtained from the AAS. I would
>like to apologize to anyone contacted twice, but also appreciate it
>if you would forward this email to anyone we have missed. Also, if
>you are an URSI Commission J national chair, we would appreciate your
>forwarding this email to your complete membership and in particular
>to the directors of observatories.
>
>Sincerely,
>
>Demetrios Matsakis
>_____________________________________________________________________
>Dr. Demetrios N. Matsakis Director, Time Service Department
>(202) 762-1587 DSN 762-1587 U. S. Naval Observatory
>FAX (202) 762-1511 3450 Massachusetts Avenue NW
>d...@orion.usno.navy.mil Washington DC, USA 20392-5420
>_____________________________________________________________________


Markus

--
Markus G. Kuhn, Computer Laboratory, University of Cambridge, UK
Email: mkuhn at acm.org, WWW: <http://www.cl.cam.ac.uk/~mgk25/>

Steve Dodd

Dec 4, 1999
Markus Kuhn <mg...@cl.cam.ac.uk> wrote:
> In message <1999120214...@hpopa.obspm.fr>, ie...@hpopa.obspm.fr writes:
[..]

>>It has been proposed to change the definition of Coordinated Universal
>>Time (UTC) regarding the insertion of leap-seconds, possibly even
>>eliminating their use. Leap seconds are introduced so as to keep UTC
>>synchronized (within 0.9 s) to the time scale determined from the
>>Earth's rotation.

So this would basically mean that UTC would stay at a constant offset from
TAI (? is this the acronym for atomic time?). What would normal computer
systems be expected to do when showing local time to a user?

Steve
(confused)

--
"If you wish to insult me my consulting rates are available by return of
email." -- Alan Cox

Markus Kuhn

Dec 5, 1999
In article <3848A7AA...@cs.berkeley.edu>, John Hauser <jha...@cs.berkeley.edu> writes:
|>
|> Demetrios Matsakis:

|> >It has been proposed to change the definition of Coordinated Universal
|> >Time (UTC) regarding the insertion of leap-seconds, possibly even
|> >eliminating their use. [...]

|> >Should no new leap seconds be inserted, solar time will diverge from
|> >atomic time at the rate of about 2 seconds every 3 years, and after
|> >about a century |UT1-UTC| would exceed 1 minute.
|>
|> And after 1582 years, the divergence would presumably be more than
|> 15 minutes, by which time people may be wishing the leap seconds had
|> been left in. Is this what we want?

I did reply to the above USNO questionnaire, and I made the following
alternative proposal:

- Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
years into UTC.

- Announce UTC leap seconds at least 30-50 years in advance, such that
we do not have to constantly update leap second tables in computers
that try to run on TAI every 6 months or so when a new IERS C bulletin
comes out.

- Relax the |UTC-UT1| < 0.9 s rule to allow for greater fluctuations
of say up to a minute (as required for long-term leap seconds planning),
but still keep |UTC-UT1| as close as feasible to zero as history goes
on.

 - Allow changes to an already announced 30-50 year schedule only
   if there is a threat that e.g. |UTC-UT1| < 100 s will be violated
   under the current schedule of leap seconds. [This should only occur
   after catastrophic events (impact of a major mass, etc.), in which
   case a change of the leap second schedule is unlikely to be our
   most significant problem ...]

Note that IERS C bulletins would still come out every few years,
but they would announce leap seconds scheduled to take place say
50 years from now, so no harm is done if you miss quite a few of them.

I don't think there are many applications that require |UTC-UT1| < 0.9 s,
and I think having leap seconds predictable over a period much longer
than the usual time between upgrades of a computer system is more
important than a very tight tolerance on |UTC-UT1|.

The only problem that I see is that the data formats of some time
broadcasting services that transmit DUT1 (UT1-UTC rounded to 100 ms)
would have to be modified to allow for larger DUT1 values of up to
a minute or so. But I don't think devices and software that use DUT1
values are very widely used anyway, so the cost should be marginal
if the switchover is announced a few years in advance.

If time transmission services are updated to handle larger DUT1
values, then they should also be updated to transmit TAI-UTC.

Jack Yeazel

Dec 5, 1999, to Markus Kuhn
Markus Kuhn wrote:
>
> In article <3848A7AA...@cs.berkeley.edu>, John Hauser <jha...@cs.berkeley.edu> writes:
> |>
> |> Demetrios Matsakis:
> |> >It has been proposed to change the definition of Coordinated Universal
> |> >Time (UTC) regarding the insertion of leap-seconds, possibly even
> |> >eliminating their use. [...]
> |> >Should no new leap seconds be inserted, solar time will diverge from
> |> >atomic time at the rate of about 2 seconds every 3 years, and after
> |> >about a century |UT1-UTC| would exceed 1 minute.
> |>
> |> And after 1582 years, the divergence would presumably be more than
> |> 15 minutes, by which time people may be wishing the leap seconds had
> |> been left in. Is this what we want?
>
> I did reply to the above USNO questionnaire, and I made the following
> alternative proposal:
>
> - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> years into UTC.
>
> - Announce UTC leap seconds at least 30-50 years in advance, such that
> we do not have to constantly update leap second tables in computers
> that try to run on TAI every 6 months or so when a new IERS C bulletin
> comes out.

It's my understanding that leap seconds cannot be predicted ACCURATELY
(keeping TAI-UTC less than one second) for more than a few years in
advance...???
--
Jack

Get general GPS information at http://joe.mehaffey.com/

Terje Mathisen

Dec 5, 1999

That's the key part of his suggestion: By allowing a much wider swing
area for the UT1-UTC (NOT TAI-UTC!) offset, it would be possible to
announce those leap seconds much further in advance, without breaking
the promise re. max offset.

I like it!

Terje

--
- <Terje.M...@hda.hydro.com>
Using self-discipline, see http://www.eiffel.com/discipline
"almost all programming can be viewed as an exercise in caching"

Chuck Tribolet

Dec 5, 1999
Astronomers use UTC to point their telescopes. They need greater accuracy
than 100 seconds.

Just make it easy to get the updates into your software: a standard format
somewhere so that the software can update itself.

Markus Kuhn wrote:
>
> In article <3848A7AA...@cs.berkeley.edu>, John Hauser <jha...@cs.berkeley.edu> writes:
> |>
> |> Demetrios Matsakis:
> |> >It has been proposed to change the definition of Coordinated Universal
> |> >Time (UTC) regarding the insertion of leap-seconds, possibly even
> |> >eliminating their use. [...]
> |> >Should no new leap seconds be inserted, solar time will diverge from
> |> >atomic time at the rate of about 2 seconds every 3 years, and after
> |> >about a century |UT1-UTC| would exceed 1 minute.
> |>
> |> And after 1582 years, the divergence would presumably be more than
> |> 15 minutes, by which time people may be wishing the leap seconds had
> |> been left in. Is this what we want?
>
> I did reply to the above USNO questionnaire, and I made the following
> alternative proposal:
>
> - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> years into UTC.
>
> - Announce UTC leap seconds at least 30-50 years in advance, such that
> we do not have to constantly update leap second tables in computers
> that try to run on TAI every 6 months or so when a new IERS C bulletin
> comes out.
>

--
Chuck Tribolet
Internet: tri...@garlic.com
http://www.almaden.ibm.com/cs/people/triblet

Silicon Valley: Best day job in the world.

Helmut Richter

Dec 5, 1999
mg...@cl.cam.ac.uk (Markus Kuhn) writes:

>I did reply to the above USNO questionnaire, and I made the following
>alternative proposal:

> - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> years into UTC.

> - Announce UTC leap seconds at least 30-50 years in advance, such that
> we do not have to constantly update leap second tables in computers
> that try to run on TAI every 6 months or so when a new IERS C bulletin
> comes out.

Looks quite reasonable. The major reason for introducing leap seconds is
not *change* in the speed of the Earth's rotation but a systematic offset:
a second is simply a bit shorter than 1/86400 of an average day. So the
insertion of leap seconds is quite predictable over a long range if
the requirements on UTC-UT1 are not too tight.
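The figures quoted earlier in the thread ("2 seconds every 3 years", "about a century" to reach a minute) can be checked in a few lines. The ~2 ms/day excess length of the mean solar day is an assumed round number for the 1990s, not a value from the original posts:

```python
# Back-of-the-envelope check of the drift figures quoted in this thread,
# assuming the mean solar day is ~2 ms longer than 86400 SI seconds
# (an approximate value for the late 1990s).

excess_ms_per_day = 2.0                              # assumed mean excess, ms/day
drift_s_per_year = excess_ms_per_day / 1000 * 365.25

# roughly "2 seconds every 3 years":
print(drift_s_per_year * 3)        # ~2.2 s

# years until |UT1-UTC| would exceed one minute without leap seconds:
print(60 / drift_s_per_year)       # ~82 years, i.e. "about a century"
```

The systematic offset dominates, which is why the schedule is predictable; only the much smaller random variations limit how far ahead leap seconds can be fixed.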

Helmut Richter


Sam Wormley

Dec 5, 1999
Chuck Tribolet wrote:
>
> Astronomers use UTC to point their telescopes. They need greater accuracy
> than 100 seconds.
>
> Just make it easy to get the updates in to your software: A standard format
> somewhere so that the software can update itself.
>
> Markus Kuhn wrote:

Maybe... I think Astronomers really need to know TT (TDT) to point their
telescopes.


UTC       GPS           TAI                            TT (TDT)
-+---------+-------------+------------------------------+-----
 |         |             |                              | ET 1984.0
 |<-- 32s leap seconds ->|<-------- 32.184s fixed ----->|
 |<- 13s ->|             |
 |
-+-------------------------------------------------------------
UT1 (UT)

UTC is variable with respect to UT1 and is kept within ±0.9s by
means of leap seconds.


The differences between GPS Time and International Atomic Time (TAI) and
Terrestrial Time (TT), also known as Terrestrial Dynamical Time (TDT), are
constant at the level of some tens of nanoseconds, while the difference
between GPS Time and UTC changes in increments of whole seconds each time
a leap second is added to the UTC time scale.
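The fixed relationships in the diagram can be written as small conversion helpers. The 19 s TAI-GPS offset and the 32.184 s TT-TAI offset are permanent by definition; the TAI-UTC value of 32 s is the one valid in late 1999 (the function names are ours, for illustration):

```python
# Sketch of the fixed time-scale offsets described above.

TAI_MINUS_GPS = 19.0      # s, fixed since the GPS epoch (1980-01-06)
TT_MINUS_TAI  = 32.184    # s, fixed by the definition of TT
TAI_MINUS_UTC = 32.0      # s, as of the last leap second before Dec 1999

def gps_to_tai(t_gps): return t_gps + TAI_MINUS_GPS
def tai_to_tt(t_tai):  return t_tai + TT_MINUS_TAI
def utc_to_tai(t_utc): return t_utc + TAI_MINUS_UTC   # changes at each leap second

# GPS-UTC offset in 1999, as in the diagram: 32 - 19 = 13 s
print(TAI_MINUS_UTC - TAI_MINUS_GPS)   # 13.0
```

Only `TAI_MINUS_UTC` is time-dependent; the other two conversions never need a table.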

If you want to calculate the circumstances of a solar eclipse, you do
the computations in TT and convey the times to other humans in UTC.

YEAR      TT-UT PREDICTION   UT1-UTC PREDICTION   ERROR

1999.25        63.579              .605            .006
1999.50        63.67               .51             .01
1999.75        63.72               .46             .02
2000.00        63.86                               .02
2000.25        64.6                                .2
2000.50        64.8                                .3
2000.75        65.0                                .4

___________________________________________________
Sam Wormley - http://www.cnde.iastate.edu/time.html

Tom Van Baak

Dec 5, 1999, to Steve Dodd
Steve Dodd wrote:
> So this would basically mean that UTC would stay at a constant offset from
> TAI (? is this the acronym for atomic time?). What would normal computer
> systems be expected to do when showing local time to a user?
>
> Steve

Normal computers would display local time as they always
have: either they're already running in local time (e.g.,
DOS) or they're running UTC (e.g., UNIX, NT) with a local
timezone correction prior to display.

The simplicity of the new UT1-UTC proposal is that local
time computations are not affected.

/tvb

Terje Mathisen

Dec 6, 1999
Chuck Tribolet wrote:
>
> Astronomers use UTC to point their telescopes. They need greater accuracy
> than 100 seconds.

AFAIK, astronomers should NOT use UTC, since UT1 is the timebase useful
for pointing telescopes at stars.

I believe they need to know the offset between UTC and UT1 even today
when the difference is less than 0.9s.
...
86400 seconds corresponds to 360 degrees, so a 1 second error would be
360 / 86400 = 1/240 degree, or 15 seconds of arc.

With enough magnification, this could be enough to miss your target
completely, right?

OTOH, 1/240 degree or less is probably accurate enough that anyone with
a backyard telescope would hit the desired star easily. :-)
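The arithmetic above, worked through: the Earth turns 360 degrees in roughly one day, so a clock error maps directly to a pointing error.

```python
# One second of clock error expressed as a pointing error in arc.

seconds_per_day = 86400
deg_per_second = 360 / seconds_per_day        # 1/240 degree per second of time
arcsec_per_second = deg_per_second * 3600     # degrees -> arcseconds

print(arcsec_per_second)          # 15.0 arcseconds per second of time

# The current 0.9 s UTC-UT1 tolerance thus bounds the error at ~13.5":
print(0.9 * arcsec_per_second)    # 13.5
```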

Mike Bilow

Dec 6, 1999, to Markus Kuhn
On 5 Dec 1999, Markus Kuhn wrote:

> I did reply to the above USNO questionnaire, and I made the following
> alternative proposal:
>
> - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> years into UTC.
>
> - Announce UTC leap seconds at least 30-50 years in advance, such that
> we do not have to constantly update leap second tables in computers
> that try to run on TAI every 6 months or so when a new IERS C bulletin
> comes out.

If we are going to have leap seconds every couple of years, it seems to me
that we might as well make an effort to have them keep UTC synchronized to
UT1 as closely as is reasonably possible. I am not sure it is even known
whether Earth rotation is predictable over a time scale as large as
50 years. There is even a provision for a negative leap second, where a
second would be dropped from UTC rather than added, although we have never
actually had one.

Only in the past few years have we been able to measure Earth rotation
with enough precision for this issue to matter, so trying to predict the
state of the art 50 years from now seems like a very dangerous idea. No
one in 1949, for example, could have foreseen a technology like GPS
except perhaps in the broadest theoretical sense, and certainly would
never have imagined hikers buying GPS receivers for a little over $100 at
a sporting goods store.

> - Relax the |UTC-UT1| < 0.9 s rule to allow for greater fluctuations
> of say up to a minute (as required for long-term leap seconds planning),
> but still keep |UTC-UT1| as close as feasible to zero as history goes
> on.
>
> - Allow changes to an already announced 30-50 year schedule only
>   if there is a threat that e.g. |UTC-UT1| < 100 s will be violated
>   under the current schedule of leap seconds. [This should only occur
>   after catastrophic events (impact of a major mass, etc.), in which
>   case a change of the leap second schedule is unlikely to be our
>   most significant problem ...]

I think this would lead to great inconvenience in astronomy and celestial
navigation. As a practical matter, the only non-astronomical time which
is conveniently available is UTC. This means that, if you want to locate
some celestial object, then you need to know (or approximate) UT1, and the
fact that UTC is corrected within 0.9s is a huge convenience. I have
often been concerned with a celestial or quasi-celestial event of only a
few minutes duration, and maintaining correction factors from a practical
reference would greatly increase the chances of error. For example, it is
common for a low earth orbiting satellite to be visible above the horizon
for only a very few minutes, and an error of as much as a minute could
leave a tracking antenna pointed tens of degrees off.

I realize that GPS time is never corrected via leap second, but rather an
offset from GPS time to UTC is transmitted with the GPS stream. Still, as
a practical matter, GPS receivers all happily report UTC on their data
outputs, and all of the necessary corrections are enclosed inside the GPS
receiver box. This is a fundamentally better and more reliable design
than leaving it up to the software reading the GPS receiver to do it. As
more and more unit conversion responsibilities are piled up, eventually
you reach a point where boats sink, airplanes crash, and space probes burn
up in the atmosphere of Mars.

> Note that IERS C bulletins would still come out every few years,
> but they would announce leap seconds scheduled to take place say
> 50 years from now, so no harm is done if you miss quite a few of them.
>
> I don't think there are many applications that require |UTC-UT1| < 0.9 s,
> and I think having leap seconds predictable over a period much longer
> than the usual time between upgrades of a computer system is more
> important than a very tight tolerance on |UTC-UT1|.

Navigation requires knowing UT1 quite precisely. Yes, if UTC is no longer
corrected so that it is always close to UT1, then it is still possible to
calculate UT1 if the current offset is known. However, this requires that
either the offset be available or that it be transmitted as part of the
data along with UTC. Again, as a practical matter, it will become
necessary to modify all sorts of protocols, such as NMEA and NTP, so that
this new large offset can be sent along with UTC so that UT1 can be
derived if it happens to be needed.

> The only problem that I see is that the data formats of some time
> broadcasting services that transmit DUT1 (UT1-UTC rounded to 100 ms)
> would have to be modified to allow for larger DUT1 values of up to
> a minute or so. But I don't think devices and software that use DUT1
> values are very widely used anyway, so the cost should be marginal
> if the switchover is announced a few years in advance.
>
> If time transmission services are updated to handle larger DUT1
> values, then they should also be updated to transmit TAI-UTC.

Perhaps the actual DUT1 values are not widely used, but you will find that
protocols in which DUT1 is embedded are ubiquitous. For example, the time
codes transmitted by WWV/WWVH, WWVB, and CHU all carry DUT1 information.
Changing these protocols is impractical, since a great deal of hardware is
out in the field and would have to be updated. A fair amount of this
hardware, such as my WWVB-receiving wristwatch, would be uneconomical to
modify and would have to be replaced outright. In more critical settings
such as laboratories, it would always be necessary to worry about whether
the device firmware had been upgraded or not. Although my wristwatch
ignores the actual DUT1 value, it does have to account for the field, so
it will break if the field size changes.

What bothers me about this is that I cannot think of a single benefit to
be obtained from dropping the correction of UTC by leap second. After
all, the infrastructure for supporting leap seconds has been in place for
nearly 30 years and antedates all computer networking protocols, including
TCP/IP, let alone NTP. Even the IERS will still have to make all of its
observations and publish its reports, since we really do need to know the
correction factors whether we adjust UTC or not. What we save by dropping
leap seconds is, it seems to me, a little work by one international
organization and a few national standards bodies. What we risk by
dropping leap seconds is, I think, greatly increased chance of error and
confusion in everything from navigation to computer protocols.

I predict that dropping leap seconds could turn out to have unexpected and
possibly catastrophic consequences. While I am not saying that I would
expect to lose a space shuttle someday due to a navigation error traceable
to this cause, I would never have expected to lose a planetary mission
because of a conflict between English and Metric units, either. Messing
with leap seconds at this point just unnerves my engineering sensibility.

-- Mike

Erik Naggum

Dec 6, 1999
* Markus Kuhn

| I did reply to the above USNO questionnaire, and I made the following
| alternative proposal:
|
| - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
| years into UTC.

I agree with this, but I think a notation and a protocol are required to
denote the difference for common use, much like we have time zones today.

| - Announce UTC leap seconds at least 30-50 years in advance, such that
| we do not have to constantly update leap second tables in computers
| that try to run on TAI every 6 months or so when a new IERS C bulletin
| comes out.

with computer networks being as ubiquitous as they are, and dissemination
of clock information over radio and GPS and a number of other ways, I
fail to see the value of "30-50 years", and I'm curious what you think
would benefit from this huge time frame. if the idea is that Y2K has
been a problem for computers and software built in the 60's, the better
idea is probably to avoid UTC in applications with longevity as their
primary concern, much like Y2K really isn't solved by using four digits
for the year, but by using one or more integral numbers of units of time
with much less arbitrary relationships than calendar dates. (my favorite
is a tuple of days since (or until) a leap-year epoch like 2000-03-01,
seconds since midnight, and whatever fraction of a second since start of
second.) I fear that a 30-50 year horizon would create more problems
than it solves, and that those problems are better solved by abandoning
leap seconds in the time unit chosen.
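The tuple representation sketched above can be made concrete. Starting the year on March 1 pushes the leap day to the very end of the cycle, so the day count becomes a simple closed formula with no per-month special-casing of February. The function below is our illustration of the idea (the standard civil-calendar day-count formula shifted to a 2000-03-01 epoch), not code from the original post:

```python
# Days since the leap-year epoch 2000-03-01, treating March as the
# first month of the year so the leap day falls at the end of the cycle.

def days_since_2000_03_01(y, m, d):
    y -= m <= 2                      # Jan/Feb belong to the previous Mar-Feb year
    era_y = y - 2000                 # whole shifted years since the epoch
    mp = (m + 9) % 12                # March -> 0, April -> 1, ..., February -> 11
    doy = (153 * mp + 2) // 5 + d - 1   # day-of-year within the Mar-Feb year
    # 365 days per year plus Gregorian leap days (floor division handles
    # dates before the epoch correctly as well):
    return era_y * 365 + era_y // 4 - era_y // 100 + era_y // 400 + doy

print(days_since_2000_03_01(2000, 3, 1))   # 0
print(days_since_2000_03_01(2000, 3, 2))   # 1
print(days_since_2000_03_01(2001, 3, 1))   # 365
```

Appending seconds-since-midnight and a sub-second fraction to this day count gives exactly the three-element tuple described above.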

#:Erik

Markus Kuhn

Dec 6, 1999
I have received a lot of email about the UTC and leap second topic,
some of it based on misunderstandings of how the system currently
works and why it was designed that way.

Therefore, if you are interested in the topic, I *highly* recommend
that you read the excellent paper

Dennis D. McCarthy: Astronomical Time. Proceedings of the IEEE,
Vol. 79, No. 7, July 1991, pp. 915-920. [See also the other
papers in this special issue on time and frequency standards.]

Since the author is a government employee, this text is in the
public domain and I can even offer you a free scanned version:

http://www.cl.cam.ac.uk/~mgk25/volatile/astronomical-time.pdf

Marc Brett

Dec 6, 1999
In comp.protocols.time.ntp Mike Bilow <mik...@colossus.bilow.com> wrote:

> I predict that dropping leap seconds could turn out to have unexpected and
> possibly catastrophic consquences. While I am not saying that I would
> expect to lose a space shuttle someday due to a navigation error traceable
> to this cause, I would never have expected to lose a planetary mission
> because of a conflict between English and Metric units, either. Messing
> with leap seconds at this point just unnerves my engineering sensibility.

I agree with this sentiment. If we are to get leap seconds anyway, we
may as well have them accurate to within 0.9 sec. I can see the Law of
Unintended Consequences looming.

I'd like to see this, however:

- Allow leap seconds ONLY on the last day of December or June.
Eliminate the option of allowing them on the last day of
March or September (IERS second preference) or the last day
of any month (IERS third preference). This will simplify
software design.

I've observed the GPS signal announce leap seconds up to 4 months in
advance. Many GPS receivers only supply a 2-state leap warning flag
and are ambiguous about _when_ the leap second event will occur. If
software designers can be certain that it will happen only on the
6-month boundaries, systems will be more reliable. De facto, this is
already the case, and a leap second in September or March would break
many systems (all versions of NTP, for instance).
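The value of the June/December-only restriction can be sketched as follows: with only a boolean "leap pending" flag, as many GPS receivers provide, software can still pin down the insertion instant if the opportunities are confined to two dates per year. The function name and interface are hypothetical, for illustration only:

```python
# Given the current date and a pending-leap flag, the insertion instant
# is unambiguous under the proposed June/December-only rule.

def next_leap_opportunity(year, month):
    """End-of-month instant at which a pending leap second must occur,
    assuming leap seconds are confined to the end of June or December."""
    if month <= 6:
        return (year, 6, 30, "23:59:60Z")
    return (year, 12, 31, "23:59:60Z")

print(next_leap_opportunity(1998, 11))   # (1998, 12, 31, '23:59:60Z')
print(next_leap_opportunity(1999, 2))    # (1999, 6, 30, '23:59:60Z')
```

With March and September also allowed, the same boolean flag would leave two candidate instants, which is exactly the ambiguity described above.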

--
Marc Brett +44 20 8560 3160 Western Geophysical
Marc....@westgeo.com 455 London Road, Isleworth
FAX: +44 20 8847 5711 Middlesex TW7 5AA UK

Helmut Richter

Dec 6, 1999
Mike Bilow <mik...@colossus.bilow.com> writes:

>If we are going to have leap seconds every couple of years, it seems to me
>that we might as well make an effort to have them keep UTC synchronized to
>UT1 as closely as is reasonably possible.

Markus's suggestion was to insert leap seconds as often as they are
inserted today, but to announce them long in advance, e.g. by defining
leap second rules ahead of time, similar to the leap year rules we
have now. This would fully compensate for the mean day being longer
than 86400 seconds; it would not be sufficient for the *changes* in
the speed of the Earth's rotation. But those are orders of magnitude
smaller.

Helmut Richter

Markus Kuhn

Dec 6, 1999
In article <31534579...@naggum.no>, Erik Naggum <er...@naggum.no> writes:
|> * Markus Kuhn

|> | - Announce UTC leap seconds at least 30-50 years in advance, such that
|> | we do not have to constantly update leap second tables in computers
|> | that try to run on TAI every 6 months or so when a new IERS C bulletin
|> | comes out.
|>
|> with computer networks being as ubiquitous as they are, and dissemination
|> of clock information over radio and GPS and a number of other ways, I
|> fail to see the value of "30-50 years", and I'm curious what you think
|> would benefit from this huge time frame.

OK, if you stay with me for a few minutes, I'll explain it in more detail:

Computers handle time today mostly in one of two encodings:

- a scalar time value (e.g., a second counter)
 - a broken-down time display such as YYYY-MM-DD hh:mm:ss.sss

Official time standards such as UTC or TAI are published in broken-down
format only. Computers have to map the official broken-down
time into a scalar value for internal use. Examples are the time_t type
in ISO C, time_t in POSIX.1, the TimeT type in CORBA, the Time type
in Ada95, and many others.

Even though scalar time values are often described in documentation as
a "count of seconds since the epoch", they are in reality often
re-encoded versions of a broken-down UTC display. Let's look at POSIX's
time_t as an example (probably the one most readers here are most
familiar with). Even though the ISO C standard allows it to be an
arbitrary encoding of the time in a numeric variable, POSIX requires
time_t values to be a specified encoding of a broken-down UTC value,
defined by a fixed formula in the POSIX.1 standard (non-leap seconds
since 1970-01-01 00:00:00 UTC). Leap seconds cannot be encoded in this
scheme, and common implementation practice is simply to repeat the
time_t code of the previous second, together with setting a special
LEAP flag, during a leap second on systems that are leap-second aware
(e.g., Linux running under ntpd).

An alternative would be to use time_t as an encoding of the broken-down
TAI display. Then time_t would be a true measurement of the number of
seconds that have passed on the international network of reference clocks
since some epoch. The disadvantage of this approach is that converting
time_t into broken-down UTC or civil local time for user interfaces and
protocol interactions suddenly depends on a table of leap seconds, which
has to be updated roughly every year. In addition, the relationship
between broken-down UTC displays and time_t values more than 6 months in
the future becomes completely unpredictable, even if we assume some
utopian, secure, fully automated, robust leap-second update mechanism is
in place (e.g., via the DNS server of USNO/IERS or something like that).

If we use as our scalar time representation just an encoding of UTC
displays, we get a reliable display of UTC broken-down times, but
occasional hiccups when we calculate time differences across leap
seconds. If we make our scalar time representations encodings of TAI,
we get a reliable continuous scale for easy arithmetic on the number
of elapsed seconds, but displaying UTC broken-down times becomes
hazardously dependent on keeping the damned leap-second tables up to
date.

Therefore, today most implementors consider correct time display
more important than correct time interval arithmetic, and they
consequently use scalars that encode UTC displays and ignore
leap seconds as if they never happened.
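The table dependency described above can be contrasted in a minimal sketch. The two-entry table is a tiny illustrative excerpt (the real TAI-UTC offsets of 1997 and 1999), not a complete leap-second list, and the lookup function is ours:

```python
# A TAI-style counter needs a leap-second table to produce UTC; a
# UTC-style counter needs no table but repeats a code across each leap
# second.  (TAI - UTC) in seconds, effective from the given (year, month):

LEAP_EXCERPT = [((1997, 7), 31), ((1999, 1), 32)]

def tai_minus_utc(year, month):
    """Look up TAI-UTC for a date.  Note the failure mode: past the
    table's horizon this silently returns the last known offset --
    exactly the hazard a 30-50 year announcement lead time would remove."""
    offset = LEAP_EXCERPT[0][1]
    for since, off in LEAP_EXCERPT:
        if (year, month) >= since:
            offset = off
    return offset

print(tai_minus_utc(1998, 6))    # 31
print(tai_minus_utc(1999, 12))   # 32
```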

If the TAI/UTC relationship were known in advance for a sufficiently
long time (say > 25 years), at least much longer than any operating
system update cycle imaginable today, sufficiently up-to-date leap
second tables would easily become ubiquitously deployable without
having to depend on on-line distribution mechanisms (far from all
systems are connected to the Internet, thanks
<insert-favourite-transcendental-entity-here>). This would make it
feasible for computer systems to use true second counts that count a
leap second just like any other second, which would effectively be a
bijective mapping of the TAI display.

Leap seconds predictable over >25 years would combine the best of
both worlds: robust and easy time interval arithmetic together with
reasonably reliable conversion to broken-down time, including for
23:59:60 leap seconds. The only times that could not be encoded
reliably numerically would be events >25 years into the future,
and it would probably be advisable to use a UTC encoding
here instead, if the correct local time display is what matters
in the end, and not the physically correct number of seconds.

By the way, I have already written down how a revised ISO C time API
could look that allows programmers to deal with scalar time variables
encoding either a UTC or a TAI display (that is, either counting or
ignoring leap seconds). For details, see

   http://www.cl.cam.ac.uk/~mgk25/c-time/

This proposal also aims at fixing a whole range of other problems with
the ISO C and POSIX time APIs (multi-threading, lack of precision,
switching between local time zones, traversing timezone databases,
encoding of leap seconds in a UTC-based scalar time scale, etc.).

This proposed API also works fine with the current unpredictable
TAI-UTC relationship, but CLOCK_TAI would be much more widely available
(and could much more often start out identical to CLOCK_MONOTONIC),
and therefore much more useful, if leap seconds were announced a
long time in advance.

The above web page also contains a very comprehensive list of related
references at the end that are very much worth looking at in detail.

Markus Kuhn

Dec 6, 1999
Helmut....@lrz-muenchen.de (Helmut Richter) writes:
|> Markus's suggestion was to insert leap seconds as often as they are
|> inserted today, but to announce them long before, e.g. by defining
|> similar leap second rules ahead of time as we now have leap year
|> rules. This would be absolutely correct for compensating that the mean
|> day is longer than 86400 seconds, it would not be sufficient for the
|> *changes* in the speed of earth rotation. But these are orders of
|> magnitude smaller.

Well, it would not be sufficient for the *unpredictable changes*
in the speed of earth rotation. Some of the changes (slowing
due to lunar/solar tidal forces) can be predicted reasonably well
with existing models; others are more random.

The interesting question is how well the drift between UT1 and
TAI can be predicted a few decades into the future, i.e. how far
UT1 and UTC would drift apart if leap seconds were announced a few
decades in advance based on the best models for UT1 that we have
today. There is quite a lot of literature on the subject, e.g. the
material quoted in

http://www.cl.cam.ac.uk/~mgk25/volatile/astronomical-time.pdf

but I have yet to spend a few days in the library to get the figures
that tell us how well long-term UT1-TAI forecasting can be done
today. The UT1 predictions published by IERS seem to extend only
90 days into the future. The above paper only says that the random
fluctuations in the Earth's rotation can be up to a millisecond per
day, but that does not give us the distribution and autocorrelation
that we need to estimate how unpredictable the length of, say, 10000
UT1 days would be. Most of the unpredictability in the decade range
seems to be due to slow movements of masses within the Earth.

Marc Brett

Dec 6, 1999
In comp.std.internat Helmut Richter <Helmut....@lrz-muenchen.de> wrote:
> Mike Bilow <mik...@colossus.bilow.com> writes:

>>If we are going to have leap seconds every couple of years, it seems to me
>>that we might as well make an effort to have them keep UTC synchronized to
>>UT1 as closely as is reasonably possible.

> Markus's suggestion was to insert leap seconds as often as they are
> inserted today, but to announce them long before, e.g. by defining
> similar leap second rules ahead of time as we now have leap year
> rules. This would be absolutely correct for compensating that the mean
> day is longer than 86400 seconds, it would not be sufficient for the
> *changes* in the speed of earth rotation. But these are orders of
> magnitude smaller.

But such a long lead time will necessarily mean that UT1-UTC will drift
beyond 0.9s. The format for WWVB, CHU, and ACTS will have to change,
and thousands, perhaps millions of time receivers will have to be
upgraded or replaced. UTC will become even less useful to astronomical
observers. System reliability nosedives as UT1-UTC (or is that UTC-UT1?)
correction factors are misapplied.

And the only benefit is that system administrators wouldn't have to
update their TAI-UTC tables as frequently. A very questionable benefit
indeed, because out of sight means out of mind, which probably means
the tables won't be adequately maintained at all.

Instead of redefining UTC to be more predictable, we should be working
on a mechanism to disseminate TAI-UTC tables to everyone who needs them,
automatically, reliably, economically, rapidly. They should probably
be very much like license files, with a secure key, issued by the IERS,
with 6-month lifetimes, etc. I can see it all now -- a public-domain
replacement for the hugely overpriced FLEXlm products -- now THAT would
be a result!

Terje Mathisen

Dec 6, 1999
Markus Kuhn wrote:

>
> Helmut....@lrz-muenchen.de (Helmut Richter) writes:
> |> Markus's suggestion was to insert leap seconds as often as they are
> |> inserted today, but to announce them long before, e.g. by defining
> |> similar leap second rules ahead of time as we now have leap year
> |> rules. This would be absolutely correct for compensating that the mean
> |> day is longer than 86400 seconds, it would not be sufficient for the
> |> *changes* in the speed of earth rotation. But these are orders of
> |> magnitude smaller.
>
> Well, it would not be sufficient for the *unpredictable changes*
> in the speed of earth rotation. Some of the changes (slowing
> due to lunar/solar tidal forces) can be prediced reasonably well
> with existing models, others are more random.
>
> The interesting question is, how well can the drift between UT1 and
> TAI be predicted a few decades into the future, i.e. how far would
> UT1 and UTC drift apart if leap seconds were announced a few decades
> in advance based on the best models for UT1 that we have today. There
> exists quite some literature on the subject, e.g. the stuff quoted in
>
> http://www.cl.cam.ac.uk/~mgk25/volatile/astronomical-time.pdf

I've studied this article, and it seems to me that since the long-term
trend line is nearly constant (i.e. caused by tidal friction), a model
which allowed just a few seconds of slack would make it possible to
extend the leap second announcements much further into the future.

Even if accurate (sub-ms) predictions are impossible, I'd guess that a
10-second window would allow your 25-year future leap second
determination.

Erik Naggum

Dec 6, 1999
* Markus Kuhn

| Computers handle time today mostly in one of two encodings:
|
| - a scalar time value (e.g., a second counter)
| - a broken-down time display such as YYYY-MM-DD hh-mm-ss.sss

I made a note in passing that I prefer a different encoding, one of less
random break-down. specifically, I want days to be counted integrally.
most of the computations commercial applications do with time have to do
with either days or times within days, and it's rather silly to have to
deal with days of varying length in seconds all the time, so let's
separate the two and call this day-and-seconds concept "commercial time",
with the usual breakdown of hours, minutes and seconds being completely
unaffected by leap seconds until 23:59:60, which is just fine by most
software -- very little happens right _before_ midnight, and midnight
would be triggered by a change in the day, anyway. we can deal with days
of any number of seconds if and when we have to compute "scientific"
time, which would have to be an accurate computation of _elapsed_ time
and would be subject to a number of interesting qualifications.
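
As an illustration (a sketch under my own naming, not code from Erik's
paper), the day-and-seconds idea might look like this:

```python
# Sketch of "commercial time": an integral day count plus seconds within
# the day. Day arithmetic never needs to know how long each day was;
# only elapsed-time ("scientific") computation would consult a leap table.
from dataclasses import dataclass

@dataclass(frozen=True)
class CommercialTime:
    day: int   # integral day count since some fixed epoch
    sec: int   # second within the day: 0..86399, or 86400 on a leap day

def days_between(a: "CommercialTime", b: "CommercialTime") -> int:
    # commercial day difference is exact regardless of leap seconds
    return b.day - a.day

def same_time_next_day(t: "CommercialTime") -> "CommercialTime":
    # the wall-clock time of day carries over unchanged
    return CommercialTime(t.day + 1, t.sec)
```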

www.naggum.no/lugm-time.html is a moderately brief presentation of this
concept presented at the Lisp User's Group Meeting in San Francisco
earlier this year. it may be of some value to these communities, but was
presented to solve a somewhat different problem than what is at hand, and
may be overly Common Lisp-specific, and is at the same time intended to
show that a more complex representation has enormous performance benefits.

so I am aware of the problems you describe, and I think the Unix solution
is one of the less intelligent solutions. most of the problems you have
to deal with come from forcing too much knowledge into a system that
cannot represent it. for starters, you need to know the value of the
current timezone if any timestamp is actually to be useful later: it is
much more valuable to know at what _local_ time of day a file was written
than to convert it into UTC and convert it back into whatever the local
time is when needed -- you cannot accomplish this if you do not keep the
timezone around. (the Microsoft¹ solution to this problem is still not
to know the timezone, but instead let the computer run at local time in
the hardware clock.) an analogous argument can easily be made for leap
seconds.

incidentally, as we're dealing with times in the future and the time
formats that are disseminated are in a broken-down format, the only leap
seconds we need to deal with are those in the past and present, provided
that we dispense with representations that force us to know too much at a
time we cannot know it, such as the Unix/Posix disaster. the solution is
manifestly _not_ to try to "fix" the Posix disaster, as that cannot be done
reliably, anyway, but to use something much, much smarter. I humbly
believe that not forcing seconds calculations into day calculations is
one very good way of dispensing with the leap second problem, as well.

#:Erik
-------
¹ used as a generic adjective alluding to quality

Hans-Georg Michna

Dec 6, 1999
Marc Brett <mbr...@rgs0.london.waii.com> wrote:

> - Allow leap seconds ONLY on the last day of December or June.
> Eliminate the option of allowing them on the last day of
> March or September (IERS second preference) or the last day
> of any month (IERS third preference). This will simplify
> software design.
>
>I've observed the GPS signal announce leap seconds up to 4 months in
>advance. Many GPS receivers only supply a 2-state leap warning flag
>and are ambiguous about _when_ the leap second event will occur. If
>software designers can be certain that it will happen only on the
>6-month boundaries, systems will be more reliable. De-facto, this is
>already the case, and a leap second in September or March will break
>many systems (all versions of NTP, for instance).

Marc,

isn't this a bit on the ridiculous side? A programmer who is
just barely able to program a leap second at year's end but is
at his wits' end programming the same thing at month's end?
Simplifying programming is your justification? Give me a break!

For a normal programmer or for any decent computer program it
does not matter at all when the leap second happens, as long as
it is announced in time.

I don't know about the other problems like compatibility with
ancient GPS receivers. I only know a little bit about
programming.

Hans-Georg
_______________
No mail please.

Markus Kuhn

Dec 6, 1999
Marc Brett <mbr...@rgs0.london.waii.com> writes:
|> > Markus's suggestion was to insert leap seconds as often as they are
|> > inserted today, but to announce them long before
|>
|> But such a long lead time will necessarily mean that UT1-UTC will drift
|> beyond 0.9s. The format for WWVB, CHU, and ACTS will have to change,
|> and thousands, perhaps millions of time receivers will have to be
|> upgraded or replaced.

Only those receivers that actually *use* the DUT1 fields and do not
just ignore them entirely, like most. I'd be surprised if this were
more than a few hundred receivers worldwide. Professional astronomical
observatories have some, but that is basically it. The DUT1 encoding
can easily be switched from the current counting code to a proper
binary code with check bits. This not only increases the DUT1 range
and resolution, but would also improve the reliability of the received
signal. The ITU-R TF.460-4 format for broadcasting DUT1 is a somewhat
pathetic design anyway; no tears should be shed if it were changed.
Updates in modern receivers can be boiled down to an EEPROM update
(just like the ones some receivers needed recently for Y2K).

|> UTC will become even less useful to astronomical
|> observers.

UTC was designed for use by governments to define their official civilian
times. Astronomers are a tiny fraction of the user community, which can
easily look up UTC-UT1 on a USNO web site every week before they start
observations (or get the same information via updated radio receivers).

|> And the only benefit is that system administrators wouldn't have to
|> update their TAI-UTC tables as frequently.

There are at least three orders of magnitude more distributed system
administrators than observing astronomers on this planet.
Get in line ... ;-)

|> And the only benefit is that system administrators wouldn't have to
|> update their TAI-UTC tables as frequently. A very questionable benefit
|> indeed, because out of sight means out of mind which probably means
|> they won't be adequately maintained at all.

There still could be IERS Bulletin C announcements coming out every 6
months, and whoever wants to practice updating the tables could still
do it twice per year, for the benefit of always having correct tables
for timestamps 30 years into the future. I certainly would still do it,
just for the fun of it. So no change here, nothing necessarily gets out
of sight or mind, and we do not create any new Y2K-style hazards, of
which some paranoids are already afraid again.

|> Instead of redefining UTC to be more predictable, we should be working
|> on a mechanism to dissemanate TAI-UTC tables to everyone who needs it,
|> automatically, reliably, economically, rapidly.

That would also be nice, but remains orders of magnitude more difficult
to implement. (Which does not mean that it shouldn't be tried as well.)

Markus Kuhn

Dec 6, 1999
In article <31534914...@naggum.no>, Erik Naggum <er...@naggum.no> writes:
|> * Markus Kuhn
|> | Computers handle time today mostly in one of two encodings:
|> |
|> | - a scalar time value (e.g., a second counter)
|> | - a broken-down time display such as YYYY-MM-DD hh-mm-ss.sss
|>
|> I made a note in passing that I prefer a different encoding, one of less
|> random break-down. specifically, I want days to be counted integrally.
|> most of the computations commercial applications do with time have to do
|> with either days or times within days, and it's rather silly to have to
|> deal with days of varying length in seconds all the time, so let's
|> separate the two and call this day-and-seconds concept "commercial time",
|> with the usual breakdown of hours, minutes and seconds being completely
|> unaffected by leap seconds until 23:59:60, which is just fine by most
|> software -- very little happens right _before_ midnight, and midnight
|> would be triggered by a change in the day, anyway.

I very much like the idea of "less random break-downs", especially
since months, weeks, hours and minutes are just Babylonian cultural
artefacts not backed up by any sensible astronomical or other natural
oscillations. (However, mankind has got so used to them over the last
~3000 years that I guess we will probably stick to them long after the
U.S. has successfully introduced A4 paper and the metric system. But
*please* change the subject line if you want to follow up on that
topic ... ;-)

The only minor problem with your particular approach is: It only works for
UTC. How do you handle leap seconds in other time zones than UTC?

After all (and your posting makes me wonder whether you are aware of
this), a leap second always happens at midnight UTC (23:59:60) all
over the world *simultaneously*, so a leap second looks like
00:59:60 in Norway and 18:59:60 on the US east coast (both in winter).

A leap second can be nicely represented in UTC with your day/second
count, by allowing the second counter (normally 0..86399) to overflow
into 86400 during it. But that only works for UTC, not for other time
zones, which do not have the leap second exactly after the end of the
day but before the next day starts. Or do you just not care about
encoding some local time?

In my C API proposal on

http://www.cl.cam.ac.uk/~mgk25/c-time/

I also use a "less random break-down", but I followed the POSIX.1:1996
idea of having a sec/nsec structure that counts seconds and nanoseconds.
I represent leap seconds (as CLOCK_UTC, and perhaps also CLOCK_LOCAL,
can produce them) in the form of an overflow of the nanosecond counter,
which usually has values 0..999999999 during a normal second, but
values 1000000000..1999999999 during a leap second. It turns out that
correctly handling leap seconds from that output follows quite
naturally in code (try it on some examples!), and I can even handle
leap seconds in any local time zone that differs from UTC by any
integral number of seconds.

Sam Wormley

Dec 6, 1999
Terje Mathisen wrote:
>
>
> I've studied this article, it seems to me that since the long-term trend
> line is nearly constant (i.e. caused by tidal friction), a model which
> allowed just a few seconds slack, would make it possible to extend the
> leap second announcements much further into the future.
>
> Even if accurate (sub-ms) predictions are impossible, I'd guess that a
> 10-second window would allow your 25-year future leap second
> determination.
>
> Terje
>

Hi Terje--The long-term trend is not linear at all, nor in the same
direction (when looking over the last few centuries). See:

http://www.cl.cam.ac.uk/~mgk25/volatile/astronomical-time.pdf Fig. 1
http://hpiers.obspm.fr/webiers/general/earthor/ROT.html

Assuming leap seconds are continued indefinitely, there will be
periods of adding leap seconds and periods of subtracting leap
seconds.


______________________________________________________________________
Sam Wormley - http://www.cnde.iastate.edu/staff/swormley/gps/time.html

nil...@postoffice.pacbell.net

Dec 6, 1999

Markus Kuhn wrote:

> In article <3848A7AA...@cs.berkeley.edu>, John Hauser <jha...@cs.berkeley.edu> writes:
> |>
> |> Demetrios Matsakis:
> |> >It has been proposed to change the definition of Coordinated Universal
> |> >Time (UTC) regarding the insertion of leap-seconds, possibly even
> |> >eliminating their use. [...]
> |> >Should no new leap seconds be inserted, solar time will diverge from
> |> >atomic time at the rate of about 2 seconds every 3 years, and after
> |> >about a century |UT1-UTC| would exceed 1 minute.
> |>
> |> And after 1582 years, the divergence would presumably be more than
> |> 15 minutes, by which time people may be wishing the leap seconds had
> |> been left in. Is this what we want?
>

> I did reply to the above USNO questionaire, and I made the following
> alternative proposal:
>
> - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> years into UTC.
>

> - Announce UTC leap seconds at least 30-50 years in advance, such that
> we do not have to constantly update leap second tables in computers
> that try to run on TAI every 6 months or so when a new IERS C bulletin
> comes out.
>

Fifty YEARS? How useful would it be to know about leap seconds that
far ahead? If anything had ever been planned that far ahead, we
wouldn't have Y2K to deal with now.

> of say up to a minute (as required for long-term leap seconds planning),
> but still keep |UTC-UT1| as close as feasible to zero as history goes
> on.
>
> - Allow changes to an already announced 30-50 years schedule only,
> if there is a threat that e.g. |UTC-UT1| < 100 s will be violated
> with the current schedule of leap seconds. [This should only occur
> in the case of catastrophic events (impact of a major mass, etc.),
> so the change of leap second schedule is unlikely to be our most
> significant problem in that case ...]
>

Wouldn't systems that require the kind of precision that takes leap
seconds into account be more likely to break with a large leap in time
than with a one-second leap? Also, if we're talking about astronomical
applications here, on a scale of 50 years you have to take precession
into account, which would add yet another factor to the algorithms for
this proposed new time scale.

I'm all for simplifying things, but this sounds like reinventing the wheel to me.


Brian



nil...@postoffice.pacbell.net

Dec 6, 1999

Terje Mathisen wrote:

> > It's my understanding that leap seconds cannot be predicted ACCURATELY
> > (keeping TAI-UTC less than one second) for more than a few years in
> > advance...???
>
> That's the key part of his suggestion: By allowing a much wider swing
> area for the UT1-UTC (NOT TAI-UTC!) offset, it would be possible to
> announce those leap seconds much further in advance, without breaking
> the promise re. max offset.
>
> I like it!
>
> Terje
>

Correct me if I'm wrong, but it seems to me that what's being proposed
here is essentially a large spike in the offset between atomic and
astronomical time scales introduced at large intervals, rather than a
small spike in such offsets introduced at shorter intervals. If so, I
would think this would have more potential to screw up systems dependent
on sub-millisecond precision than the one-second jumps we have now.


Brian


Philip Homburg

Dec 6, 1999
In article <Pine.LNX.3.96.991206...@colossus.bilow.com>,

Mike Bilow <mik...@colossus.bilow.com> wrote:
>I think this would lead to great inconvenience in astronomy and celestial
>navigation. As a practical matter, the only non-astronomical time which
>is conveniently available is UTC. This means that, if you want to locate
>some celestial object, then you need to know (or approximate) UT1, and the
>fact that UTC is corrected within 0.9s is a huge convenience. I have
>often been concerned with a celestial or quasi-celestial event of only a
>few minutes duration, and maintaining correction factors from a practical
>reference would greatly increase the chances of error. For example, it is
>common for a low earth orbiting satellite to be visible above the horizon
>for only a very few minutes, and an error of as much as a minute could
>leave a tracking antenna pointed tens of degrees off.

Are astronomy and celestial navigation the only application domains that can
only use UTC if |UT1-UTC| < 1s?

As a computer user/programmer, I think I prefer leap hours to leap
seconds. A fixed 86400 seconds in a day makes many computations much
easier. As far as I know, in a couple of thousand years we will have to
do something about leap days anyhow, so the occasional leap hour should
be no problem either.

Philip Homburg

Dave Martindale

Dec 6, 1999
mg...@cl.cam.ac.uk (Markus Kuhn) writes:

>|> And the only benefit is that system administrators wouldn't have to
>|> update their TAI-UTC tables as frequently.

>There are at least three orders of magnitude more distributed system
>administrators than observing astronomers on this planet.
>Get in line ... ;-)

If it's a computer administration problem, doesn't it make sense to solve
it using computer techniques that stay within the computer system domain,
rather than messing up a perfectly workable time system that affects
everyone? There are rather few distributed system administrators
compared to the whole population.

>|> Instead of redefining UTC to be more predictable, we should be working
>|> on a mechanism to dissemanate TAI-UTC tables to everyone who needs it,
>|> automatically, reliably, economically, rapidly.

>That would also be nice, but remains orders of magnitude more difficult
>to implement. (Which does not mean that it shouldn't be tried as well.)

If a distributed computer system can't manage to automatically distribute
time correction tables accurately, given a month or two advance notice,
why should anyone trust it to do anything else accurately either?

Dave

Marc Brett

Dec 6, 1999

> Marc,

Imagine this:

Your GPS receiver (A Leica MX4200) outputs a time/date stamp every
second with a flag which may be either -1, 00, or 01, indicating an
impending negative leap second, no leap second planned, or an impending
positive leap second, respectively. It's August. The flag goes to 01
and stays there. The GPS system is capable of announcing leap seconds
up to 128 weeks in advance. (This is a real example, BTW).

You know a little about programming, so what would you do? Insert
the leap second at the end of September or December? Justify your
answer.

Rbt

Dec 6, 1999
It looks like we can do 10 years into the future by adding one decimal
digit to the DUT1 representation. DUT1 is usually represented as a
single decimal digit (and a sign bit) giving the value in 100ms
units. Adding one more decimal digit would allow us to represent
values from -9.9s to +9.9s. The uncertainty in predicted values of
DUT1 for the next 10 years is about 5s. That gives us a factor of two
safety margin.
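
As a sketch of what such a widened field might look like (the exact
layout here is my guess, not any broadcast standard): a sign plus two
decimal digits in 100 ms units covers -9.9 s to +9.9 s.

```python
# Sketch: DUT1 with one extra decimal digit -- a sign plus two digits in
# 100 ms units, giving a range of -9.9 s .. +9.9 s.
# The field layout is illustrative only, not ITU-R TF.460.
def encode_dut1(dut1_seconds):
    """Return (sign, tenths) with tenths in 0..99, or raise if out of range."""
    tenths = round(abs(dut1_seconds) * 10)
    if tenths > 99:
        raise ValueError("DUT1 out of range for a two-digit field")
    return (-1 if dut1_seconds < 0 else 1), tenths

def decode_dut1(sign, tenths):
    """Recover DUT1 in seconds from the broadcast field."""
    return sign * tenths / 10.0
```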

Predicting 30 years into the future would probably require adding two
decimal digits to the representation of DUT1.

Planning 10 years in advance would keep people from saying "It won't
roll over on my watch, so I won't worry about it." But planning 30
years in advance almost begs for that kind of shortsightedness.

Rick

PS -- all that said, I'm not in favor of anything that will require
non-transparent changes to the format of time code signals.

D. J. Bernstein

Dec 6, 1999
Chuck Tribolet <tri...@garlic.com> wrote:
> Astronomers use UTC to point their telescopes.

If you have any evidence for this rather astonishing claim, please
notify USNO, and explain why UTC is being used instead of UT1. This is
exactly the type of information they're asking for.

---Dan

H. Peter Anvin

Dec 6, 1999
Followup to: <82h0r4$t...@cs.vu.nl>
By author: Philip Homburg <phi...@cs.vu.nl>
In newsgroup: comp.std.internat

>
> Are astronomy and celestial navigation the only application domains that can
> only use UTC if |UT1-UTC| < 1s?
>
> As a computer user/programmer, I think I prefer leap hours instead of leap
> seconds. A fixed 86400 seconds in a day makes many computation much easier.
> As far as I know, in a couple of thousand years, we have to do something with
> leap days anyhow, so the occasional leap hour should be no problem as well.
>

Are you kidding? Think what will happen when the leap hour strikes.
One reason for the 1 s limit is that a 1 s jump isn't significant on a
human activities scale -- remember, all civilian time scales worldwide
(except Saudi Arabia?) are derived from UTC. However, think about how
big an affair DST switching is every year, and you'd know the effect
of a leap hour.

Furthermore, what *really* bothers me about this idea is the "rare
major event" problem -- like Y2K, a rare major event tends to exercise
a code domain that has never been tested -- or, far too frequently,
has never even been *written*.

-hpa

--
<h...@transmeta.com> at work, <h...@zytor.com> in private!
"Unix gives you enough rope to shoot yourself in the foot."

Sam Wormley

Dec 6, 1999
Terje Mathisen wrote:

>
> Sam Wormley wrote:
> >
> > Terje Mathisen wrote:
> > >
> > >
> > > I've studied this article, it seems to me that since the long-term trend
> > > line is nearly constant (i.e. caused by tidal friction), a model which
> > > allowed just a few seconds slack, would make it possible to extend the
> > > leap second announcements much further into the future.
> > >
> > > Even if accurate (sub-ms) predictions are impossible, I'd guess that a
> > > 10-second window would allow your 25-year future leap second
> > > determination.
> > >
> > > Terje
> > >
> >
> > Hi Terje--The long-term trend is not linear at all, nor in the same
> > direction (when looking over the last few centuries). See:
> >
> > http://www.cl.cam.ac.uk/~mgk25/volatile/astronomical-time.pdf Fig. 1
>
> But this is _exactly_ what I did!
>
> Look at fig 2 & fig 4:
>
> Fig 2 shows the (rather linear) long-term drift between TAI and UTC/UT1,
> while Fig 4 explicitly extracts that linear term before plotting the
> remaining somewhat sinusoidal variations.
>
> > http://hpiers.obspm.fr/webiers/general/earthor/ROT.html
>
> This report contains:
>
> http://hpiers.obspm.fr/webiers/general/earthor/utlod/figure1.html
>
> which repeats the seemingly 'up & down' motion of the time offsets, but
> if you take a closer look, you'll notice that this is also the result
> after removing the long-term linear term (0.0022s * (MJD - 48987)),
> which corresponds to a linear rate of 2.2 ms/day.
>
> Over the two-year period from 1994 to 1996, the slope of the residual is
> at its steepest, giving a 150+ ms drift over this period, while the next
> three years recovers this.
>
> Assuming this is more or less typical, the Earth has been slowed down to
> where it loses 2.2 +/- 0.4 ms/day.
>
> The jitter value (+/- 0.4 ms/day) corresponds to 2500 days or 7 years
> before it would result in an extra leap second, if it went in the same
> direction all the time.

>
> > Assuming leap seconds are continued indefinitely, there will be
> > periods of adding leap seconds and periods of subtracting leap
> > seconds.
>
> Subtracting leap seconds is _extremely_ unlikely, since you would have
> to increase the jitter by a factor of 5-6, all going against the normal
> (tidal) slowdown to cause this. I believe the specification allows for
> negative leap seconds "just in case" we get a severe ice age, causing a
> lot of the planet's water to fall down on the South pole.
>
> This would cause an increase in the spin rate, and so could a major
> meteor impact, but in both these scenarios, humanity would have more
> severe problems than the current offset between UTC & UT1.
>
> Terje
>
> PS. The table of predicted UT1-TAI offsets does not allow for the
> possibility of a negative leap second within the next 7-8 years: They
> expect about 7 leap seconds, with a maximum error of 4.

>
> --
> - <Terje.M...@hda.hydro.com>
> Using self-discipline, see http://www.eiffel.com/discipline
> "almost all programming can be viewed as an exercise in caching"

Sorry Terje--I was totally out to lunch... wrong figure, wrong argument.
I was looking at figure 1 in the GPS World article, thinking that
the difference between a uniform time scale and one based on the Earth's
rotation was anything but linear.

Regards,
-Sam

Terje Mathisen

Dec 7, 1999

Steve Dodd

Dec 7, 1999
Hi,

I didn't realise you'd followed up *and* mailed, so I'll include the text
of my earlier email reply here. Can you please *not* follow up *and* mail?
Thanks.

Tom Van Baak <t...@veritas.com> wrote:

> Normal computers would display local time as they always
> have: either they're already running in local time (e.g.,
> DOS) or they're running UTC (e.g., UNIX, NT) with a local
> timezone correction prior to display.

> The simplicity of the new UT1-UTC proposal is that local
> time computations are not affected.

I may have to read the original proposal again. AIUI, UTC would drift further
and further from UT1, and I thought UT1 was (to put it simply) ``astronomical
time'', i.e. the time that is used to predict daylight hours, etc. The
difference between the time displayed by devices (UTC) and the apparent
time according to hours of sunlight (UT1) would surely become noticeable
eventually?

--
"If you wish to insult me my consulting rates are available by return of
email." -- Alan Cox

Erik Naggum

Dec 7, 1999
* Markus Kuhn

| The only minor problem with your particular approach is: It only works
| for UTC. How do you handle leap seconds in other time zones than UTC?

I have received some comments that indicate I may have been less than
clear on what time is actually stored in my proposed new format. Common
Lisp offers the time concept UNIVERSAL-TIME, but this does not include
explicit leap seconds, and the current machinery does not handle them at
all, so we're at a loss for precision in elapsed time. the value, however,
is an encoded integral number of seconds since 1900-01-01 00:00:00Z, only
ignoring leap seconds. the time zone is then applied when printing or
decoding the value. adding leap seconds to this system was originally
intended to be handled with second number 86400 in a given day, but you
changed my mind on that.

| I represent leap seconds (as CLOCK_UTC can produce them, perhaps also
| CLOCK_LOCAL) in the form of an overflow of the nanosecond counter, which
| usually has values 0..999999999 during a normal second, but values
| 1000000000..1999999999 during a leap second.

very nice. as my paper shows, I break down time into day, second, and
millisecond, which I neglected to mention previously, so there is room
for using your technique in representing leap seconds as milliseconds in
1000..1999. of course, choosing milliseconds is just a convenience
factor, but I, too, came to the conclusion that I had to fix the smallest
time unit, and not use a floating point number in the range [0,1) for the
subsecond value.

| It turns out, that correctly handling leap seconds from that output
| follows quite naturally in code (try it on some examples!), and I can
| even handle leap seconds in any local time zone that differs from UTC by
| any integral number of seconds.

I like your solution and will credit you with it when I apply it, if
that's OK with you.

#:Erik

Hans-Georg Michna

Dec 7, 1999
Marc Brett <mbr...@rgs0.london.waii.com> wrote:

>Imagine this:
>
>Your GPS receiver (A Leica MX4200) outputs a time/date stamp every
>second with a flag which may be either -1, 00, or 01, indicating an
>impending negative leap second, no leap second planned, or an impending
>positive leap second, respectively. It's August. The flag goes to 01
>and stays there. The GPS system is capable of announcing leap seconds
>up to 128 weeks in advance. (This is a real example, BTW).
>
>You know a little about programming, so what would you do? Insert
>the leap second at the end of September or December? Justify your
>answer.

Marc,

I do not fully understand what you're driving at. Do you want me
to reprogram the GPS receiver or to reprogram another device
that's being fed data from the GPS receiver? Are you supposing
that new rules are already in effect or not? Which new rules?

If this GPS receiver is too old and therefore unable to follow
new rules then it should be reprogrammed, abandoned, or its
errors tolerated.

I already wrote that compatibility with old GPS equipment is an
area of which I don't know much. Generally I would say, if the
rules are changed, the change has to be announced in time such
that old equipment can be updated, otherwise accommodated or
written off.

New equipment has to be programmed to follow the new rules,
which, in this case, is trivial.

Hans-Georg Michna

Dec 7, 1999
rbth...@lilypad.rutgers.edu (Rbt) wrote:

>It looks like we can do 10 years into the future by adding one decimal
>digit to the DUT1 representation

With some likelihood that is less than 100%: something could
change the earth's rotation. I don't know what, but I could
imagine a major eruption, or the impact of one or more biggish things
from space.

Philip Homburg

Dec 7, 1999
In article <82hngh$817$1...@cesium.transmeta.com>,

H. Peter Anvin <h...@transmeta.com> wrote:
>Followup to: <82h0r4$t...@cs.vu.nl>
>By author: Philip Homburg <phi...@cs.vu.nl>
>In newsgroup: comp.std.internat
>>
>> Are astronomy and celestial navigation the only application domains that can
>> only use UTC if |UT1-UTC| < 1s?
>>
>> As a computer user/programmer, I think I prefer leap hours instead of leap
>> seconds. A fixed 86400 seconds in a day makes many computations much easier.
>> As far as I know, in a couple of thousand years, we have to do something with
>> leap days anyhow, so the occasional leap hour should be no problem as well.
>>
>
>Are you kidding? Think what will happen when the leap hour strikes.
>One reason for the 1 s limit is that a 1 s jump isn't significant on a
>human activities scale -- remember, all civilian time scales worldwide
>(except Saudi Arabia?) are derived from UTC. However, think about how
>big an affair DST switching is every year, and you'd know the effect
>of a leap hour.

- What happens if we have to drop a leap year (in about 3000 years)?
- It is possible to implement the leap hour by changing the timezones. If
countries continue to change daylight saving time every couple of years, like
they have in the past, this should not be a problem.
- How and when to implement the leap hour can be decided 100 years in advance
giving everybody plenty of time to prepare. The Y2K problem was known
well in advance and most people ignored it anyway.

Philip Homburg

Mike Bilow

Dec 7, 1999
In article <82h444$fq4$1...@mail1.wg.waii.com>,
Marc Brett <mbr...@rgs0.london.waii.com> wrote:

>Your GPS receiver (A Leica MX4200) outputs a time/date stamp every
>second with a flag which may be either -1, 00, or 01, indicating an
>impending negative leap second, no leap second planned, or an impending
>positive leap second, respectively. It's August. The flag goes to 01
>and stays there. The GPS system is capable of announcing leap seconds
>up to 128 weeks in advance. (This is a real example, BTW).
>
>You know a little about programming, so what would you do? Insert
>the leap second at the end of September or December? Justify your
>answer.

I would probably insert the leap second right before the "01" changes
back to "00." :)

-- Mike


--
-------------------------------------------------------------------------------
Bilow Computer Science, Inc. | http://www.bilow.com/ | Michael S. Bilow
Cranston, RI 02920-5554, USA | mi...@bilow.com | President
-------------------------------------------------------------------------------
PGP Public Key fingerprint = 4B 06 23 FB 3E 24 A5 24 14 B5 A2 14 96 73 B4 B2
PGP Public Key fingerprint = A5 13 63 7F E3 9F AB 0A 52 62 49 26 BF 0C 01 AD
-------------------------------------------------------------------------------

Helmut Richter

Dec 7, 1999
Erik Naggum <er...@naggum.no> writes:

> I made a note in passing that I prefer a different encoding, one of less
> random break-down. specifically, I want days to be counted integrally.
> most of the computations commercial applications do with time have to do
> with either days or times within days, and it's rather silly to have to
> deal with days of varying length in seconds all the time, so let's
> separate the two and call this day-and-seconds concept "commercial time",
> with the usual breakdown of hours, minutes and seconds being completely
> unaffected by leap seconds until 23:59:60, which is just fine by most
> software -- very little happens right _before_ midnight, and midnight
> would be triggered by a change in the day, anyway. we can deal with days
> of any number of seconds if and when we have to compute "scientific"
> time, which would have to be an accurate computation of _elapsed_ time
> and would be subject to a number of interesting qualifications.

It seems to be a widespread blunder in the design of classes for
date/time to put them into the same class. In my opinion - and I infer
from the above that it is your opinion as well - date and time give
rise to at least four distinct classes:

(1) Date: a day on the calendar. Methods are for instance: mapping to
a calendar date, weekday, time difference between two dates in days.

(2) Time of Day: what the clock shows (plus info whether it is the first
or the second time on this day that the clock shows this time).
Methods are for instance: mapping to time notations (ISO 8601 or
the exotic American one), difference of two times on the same day.

(3) Absolute time: some number indicating the time elapsed since some
point in time (disregarding relativistic effects).

(4) Time zone: specification how absolute time can be calculated,
given date and time of day. It is only here where leap seconds
come in. UT1, TAI, ... are just additional time zones.

The attempt to put everything into one class makes things much more
difficult than it should be. Examples:

- I cannot answer the question what weekday a specific second was as
this depends on the local time zone. But I can answer the question
what weekday a specific date was, and the answer is the same on the
entire earth. Specifying dates in seconds makes simple questions very
difficult to answer.

- When a document gets a time stamp, it is not interesting how many
seconds, including leap seconds, have elapsed in this century. It is
only interesting what numbers the calendar and the clock showed when
it was written, plus rough information on the time zone. This is why
the Unix time stamp format is useful although it does not account for
leap seconds - it would indeed be less useful if it did.

The cases where absolute time is needed are very rare, e.g. astronomy,
much too rare to warrant a time format in computers that requires the
full complexity of time specification each time somebody asks for a
simple answer to a simple question.

Helmut Richter

Mike Bilow

Dec 7, 1999
In article <82gsrh$qof$1...@pegasus.csx.cam.ac.uk>,
Markus Kuhn <mg...@cl.cam.ac.uk> wrote:

>Only those receivers that actually *use* the DUT1 fields and do not just
>fully ignore them like most.

If the size or encoding of the DUT1 field changes in the time codes, then
anything which merely reads and ignores the field will need to be changed.

>The DUT1 encoding can easily be switched
>from the current counting code to a proper binary code with check bits.

One of the problems with codes such as the WWVB system is that there are
no check digits. Because of the high noise on VLF, decoders use a number
of schemes for deciding whether a signal has been reliably received, such
as making sure successive minute readings agree in a sane way. Another
strategy is to use the fact that only certain bit patterns are legal for
BCD characters, so that the receipt of a frame with a non-BCD character
where a BCD character was expected will cause the frame to be discarded.

>UTC was designed for use by governments to define their official civilian
>times. Astronomers are a tiny fraction of the user community, which can
>easily look up UTC-UT1 on a USNO web site every week before they start
>observations (or get the same information via updated radio receivers).

There are a lot of "derivative astronomers." Anyone who uses a tide chart,
for example, is in the astronomical camp. Obviously, no one cares if a tide
chart is off by a second -- it is a statistical best estimate, anyway --
but accumulating errors on the order of a couple of minutes will start to
affect even users of tide charts. There are widespread and significant
benefits to keeping the civil time synchronized within a second of apparent
astronomical time.

I also think that there may be cultural and religious resistance to
desynchronizing civil and astronomical time. Regardless of the merits
of such resistance, the practical effect is that some world cultures and
possibly entire nations will revert to using astronomical time instead of
atomic time as their reference, and the benefit of a worldwide standard
will be lost. In an area such as aviation, this could have extremely dire
consequences because of the resulting confusion. In the United States,
there was quite a lot of popular opposition to the introduction of
standard time by the railroads a century ago because time was, to most
people, marked by the sun rather than an abstract concept.

Peter Bunclark

Dec 7, 1999
Mike Bilow wrote:

> On 5 Dec 1999, Markus Kuhn wrote:
>
> > I did reply to the above USNO questionaire, and I made the following
> > alternative proposal:
> >
> > - Keep the current scheme of inserting a leap second 23:59:60Z every 1-2
> > years into UTC.
> >
>

I agree with practically all of what Markus says, but will take the opportunity
to say that those of us who actually are English use metric units,
with distant fond memories of what we call `Imperial Units'.
Only in America do folk cling to the quaint old customs of
feet, pounds and Fahrenheit.


> because of a conflict between English and Metric units, either. Messing
> with leap seconds at this point just unnerves my engineering sensibility.
>
> -- Mike


Marc Brett

Dec 7, 1999
In comp.protocols.time.ntp Mike Bilow <mik...@colossus.bilow.com> wrote:
> In article <82h444$fq4$1...@mail1.wg.waii.com>,
> Marc Brett <mbr...@rgs0.london.waii.com> wrote:

>>You know a little about programming, so what would you do? Insert
>>the leap second at the end of September or December? Justify your
>>answer.

> I would probably insert the leap second right before the "01" changes

^^^^^^^^^^^^
????????????
> back to "00." :)

Won't work :-( The flags clear many minutes (or hours) after the event.

Marc Brett

Dec 7, 1999
In comp.protocols.time.ntp Peter Bunclark <p...@ast.cam.ac.uk> wrote:
> I agree with practically all of what Markus says; but will take the opportunity
> saying that those of us who actually are english, use metric units,
> with distant fond memories of what we call `Imperial Units'.
> Only in America do folk cling to the quaint old customs of
> feet, pounds and Fahrenheit.

This is a troll, right?

Are you talking about the England which advertises vehicle fuel
consumption in "miles per gallon", still measures road distances in
miles but sells petrol in litres, making the measurement meaningless?
Or the England which sells bulk food by the pound but packaged food in
kilos, making comparison shopping impossible for the average bear? Or
the England where weather is still reported in both Fahrenheit and
>Celsius, both inches and mm, just to cater for whom? The England
where NHS medical records still measure a person's weight in STONES?
(For bonus points: how many Americans even know what a stone is?)

Canada went fully metric almost overnight many years ago, but England
still hasn't managed to do it in several decades of astonishingly
haphazard fits and starts. At least they're trying. I fear there's no
hope at all for that monolith south of the 49th.

Dave Martindale

Dec 7, 1999
Marc Brett <mbr...@rgs0.london.waii.com> writes:

>Canada went fully metric almost overnight many years ago,

Well, almost. Some areas are still a mixture. Aviation uses speed
(aircraft and wind) in knots, altitude in feet, temperature in Celsius,
and pressure in mm of mercury. It's hard to make a wholesale
change there, since old aircraft last so long.

As you say, at least England is trying.

The USA, having failed to convince the public at large that conversion
should happen, seems to have switched to the "stealth" approach. Businesses
are switching anyway, slowly. Science switched long ago. Now all they
have to do is make sure the schools teach metric measurements, and
wait 60 years for the old "Imperial-calibrated" people to die off.

Dave

D. J. Bernstein

Dec 7, 1999
Helmut Richter <Helmut....@lrz-muenchen.de> wrote:
> (1) Date: a day on the calendar.

My libtai library (http://cr.yp.to/libtai.html) uses struct caldate =
year, month, day. You can convert dates to standard MJD numbers for easy
arithmetic.

> (2) Time of Day:

struct caltime = caldate, hour, minute, second, offset.

> (3) Absolute time: some number indicating the time elapsed since some
> point in time (disregarding relativistic effects).

struct tai and struct taia, depending on how much precision you want.

> The cases where absolute time is needed are very rare,

In fact, time addition and subtraction are extremely popular operations.
Time display is much less common. Storing times in a simple format that
supports the common operations easily, and isolating complicated formats
inside user-interface code, is good engineering.

---Dan

Markus Kuhn

Dec 7, 1999
Marc Brett <mbr...@rgs0.london.waii.com> writes:
|> In comp.protocols.time.ntp Peter Bunclark <p...@ast.cam.ac.uk> wrote:
|> > I agree with practically all of what Markus says; but will take the opportunity
|> > saying that those of us who actually are english, use metric units,
|> > with distant fond memories of what we call `Imperial Units'.
|> > Only in America do folk cling to the quaint old customs of
|> > feet, pounds and Fahrenheit.
|>
|> This is a troll, right?
|>
|> Are you talking about the England which advertises vehicle fuel
|> consumption in "miles per gallon", still measures road distances in
|> miles but sells petrol in litres, making the measurement meaningless?
|> Or the England which sells bulk food by the pound but packaged food in
|> kilos, making comparison shopping impossible for the average bear?

Things are getting better:

Bulk food will finally have to be sold in grams and kilograms in the UK
starting 2000-01-01. All supermarkets and grocery stores that I use
here in Cambridge have already replaced or reconfigured their scales
over the past few weeks (along with around 60 000 other retail
businesses in the UK). Prices will be shown in both units for the
next 3 weeks; then, starting next month, they will only have to be
displayed in £/kg or £/100 g.

This will simplify comparing prices across Europe significantly
(even more so, if the UK should finally decide to join the Eurozone,
which admittedly might take some more time, considering the enormous
brain-washing campaign that some significant newspaper owners
have started here).

The only remaining exceptions where non-SI units will be allowed
in Britain after the end of 1999 are

- mile, yard, foot, inch for road traffic signs and for road
distance and speed measurement

- pint for draught beer and cider and for milk in returnable containers

- acre for land registration (however, registered file plans
de-facto have exclusively been expressed in metric units
since ~1995)

- troy ounce for transactions in precious metals.

Pretty soon, you shouldn't be able to find any other archaic units
around here.

More information on

http://www.dti.gov.uk/cacp/ca/metric/
http://www.dti.gov.uk/cacp/ca/si-units.pdf

So even though progress is somewhat slow, Britain is now also
nicely on track towards becoming fully metric. I am much more
worried about the U.S. though ...

RJ Carpenter

Dec 7, 1999
Marc Brett wrote:
>

> Are you talking about the England which advertises vehicle fuel
> consumption in "miles per gallon", still measures road distances in
> miles but sells petrol in litres,

SNIP

> Canada went fully metric almost overnight many years ago,

IMO, that was widely accepted as a way to "be different from the USA"
(as well as being sensible in the long run).

Back to the UK, I was visiting when one of the Metrication Board reports
was issued. Paraphrasing, packaged goods were to be sold in units of 1,
3/4, 5/8, 1/2, 3/8, 5/16, 1/4, 3/16, 1/8, .... (of course stated in the
decimal equivalents). None of those "foreign" subunits like 0.1, 0.15,
0.2, 0.25, etc.

Even the US stock market will abandon fractional-dollars in favor of
decimal (cents) in the next few weeks.

Are prices still quoted in guineas in fancy stores?

Jim Randle

Dec 7, 1999
I had a kidney stone once. And for arcane measurement systems, there's
nothing more bucolic than a henweigh.

Marc Brett wrote:
clip ....

Keith Thompson

Dec 7, 1999
mg...@cl.cam.ac.uk (Markus Kuhn) writes:
[...]
> A leap second can be nicely represented in UTC with your day/second count,
> by allowing the second counter (normally 0..85399) to overflow into
> 85400 during it.

<LIE>I hate to be picky</LIE>, but that's 86400, not 85400.

--
Keith Thompson (The_Other_Keith) k...@cts.com <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://www.sdsc.edu/~kst>
"Oh my gosh! You are SO ahead of your time!" -- anon.

chris

Dec 7, 1999
On 7 Dec 1999 18:44:37 GMT, Marc Brett <mbr...@rgs0.london.waii.com> wrote:
>
>Are you talking about the England

UK, actually :-)

>which advertises vehicle fuel
>consumption in "miles per gallon", still measures road distances in
>miles but sells petrol in litres, making the measurement meaningless?
>Or the England which sells bulk food by the pound but packaged food in
>kilos, making comparison shopping impossible for the average bear? Or
>the England where weather is still reported in both Fahrenheit and
>Celsius,

Fahrenheit is rarely used, especially by the BBC.

> both inches and mm,

Rain is nearly always in mm, snow in cm, and fog visibility and
mountain heights in metres.

> just to cater for, for who? The England
>where NHS medical records still measure a person's weight in STONES?

Pardon? Medical records have been metric for years, including baby
weights, but doctors/nurses still feel the need to convert for 'public
consumption'. I got my height and weight in SI from a medical about 10
years ago, and I've used them ever since.

>(For bonus points: how many Americans even know what a stone is?)
>
>Canada went fully metric almost overnight many years ago, but England
>still hasn't managed to do it in several decades of astonishingly
>haphazard fits and starts.

The UK is ahead of Canada in some respects (though I envy them their
km road signs - and Ireland's, too - a powerful signal to the public).
In Canada many of the laws aren't enforced, so they have a mix of kg
and lbs at stores, and heights and weights (even among French speakers)
are usually in imperial. Construction there is still largely non-metric,
something that went metric here in the 70s.

>At least they're trying. I fear there's no
>hope at all for that monolith south of the 49th.

Check out the US Metric Association via my Web site to see just how
much has changed.
--
Metrication information: http://www.usma.demon.co.uk
UK legislation, EC Directives, Trading Standards links and more

Rich Wales

Dec 7, 1999
Dave Martindale wrote:

> > Canada went fully metric almost overnight many years ago,

> Well, almost. Some areas are still a mixture. Aviation
> uses speed (aircraft and wind) in knots, altitude in feet,
> temperature in Celsius, and pressure in mm of mercury.

Interesting. My impression is that Canada is still in transition; just
about all official public measurements are metric, but most older (and
many middle-aged) Canadians are uncomfortable with the new system and
would still prefer to do things the old way.

Ground transportation is all metric by now. Road distances are in km;
speed limits are posted in km/h; gasoline is sold by the litre; vehicle
safety standards and traffic laws all use kg, km, metres, newtons; etc.,
etc. Many people still talk about miles, MPH, and (British imperial)
gallons, but it's basically impossible to drive in Canada nowadays
without a basic knowledge of metric measurements.

Most things sold in stores are measured in metric. Some items, though,
come in "funny" metric sizes which, on closer examination, are simply
metric equivalents of pre-metric measurements -- e.g., 355 mL (= 12 U.S.
fluid ounces), or 341 mL (= 12 British imperial fluid ounces). Grocery
stores often label things like meat and produce using both kg and lb,
for the benefit of shoppers who still think in pounds (and who might
take their business elsewhere if things weren't labelled in pounds).

Weather reports on TV and radio quote temperature in Celsius and air
pressure in kilopascals (kPa).

The traditional pre-metric paper sizes (such as 8-1/2 x 11 inches) are
still standard; metric paper sizes are virtually nonexistent.

Official personal measurements (e.g., height and weight on a driver's
licence) are likely to be metric. Most Canadians, though, still think
of their height in feet/inches and their weight in pounds.

Architectural measurements are still done in feet and inches.

In general, I'd say most Canadians, when talking casually about everyday
weights and distances, still tend to use feet/inches and pounds.

Aside from the fact that Canada's big metric conversion was not quite a
quarter century ago, it also needs to be kept in mind that the US still
uses its traditional, non-metric measuring systems -- and given the
heavy influx of the US media in Canada, it would be pretty hard to get
away from feet, inches, pounds, and Fahrenheit.

> It's hard to make a wholesale change there, since old
> aircraft last so long.

As I recall, there was a serious attempt to switch aviation over to
metric -- but it got nipped in the bud after the 1983 "Gimli Glider"
incident (Air Canada 143 running out of fuel in mid-flight due to a
fueling error).

> The USA, having failed to convince the public at large
> that conversion should happen, seems to have switched
> to the "stealth" approach. . . . wait 60 years for the
> old "Imperial-calibrated" people to die off.

About 20 years ago, I remember a furniture catalogue from a company
in Oregon, with the following comment at the bottom of several pages:
"ALL MEASUREMENTS IN U.S. MEASURE -- METRIC IS UN-AMERICAN".

Rich Wales ri...@webcom.com http://www.webcom.com/richw/

chris

Dec 7, 1999
On Tue, 07 Dec 1999 15:28:25 -0800, RJ Carpenter <rca...@erols.com>
wrote:

>Back to the UK, I was visiting when one of the Metrication Board reports
>was issued. Paraphrasing, packaged goods were to be sold in units of 1,
>3/4, 5/8, 1/2, 3/8, 5/16, 1/4, 3/16, 1/8, .... (of course stated in the
>decimal equivalents). None of those "foreign" subunits like 0.1, 0.15,
>0.2, 0.25, etc.

I don't remember this, but for those goods which have legal
'Prescribed Quantities' they are like:

125 g, 250 g, 375 g, 500 g, 1 kg, 1.5 kg, 7.5 kg, or a multiple of 1 kg
50 g, 100 g, 200 g, 250 g
200 ml, 250 ml, 500 ml, 750 ml, or a multiple of 500 ml

Unfortunately, in the case of some items, notably coffee beans and
milk, they still permit the old imperial sizes too, albeit described
in their metric equivalents!

>Even the US stock market will abandon fractional-dollars in favor of
>decimal (cents) in the next few weeks.
>
>Are prices still quoted in guineas in fancy stores?

No way! The guinea became illegal in 1971 when we went decimal.

Keith Thompson

Dec 7, 1999
Jim Randle <jra...@ibm.net> writes:
> I had a kidney stone once. And for arcane measurement systems, there's
> nothing more bucolic than a henweigh.

What's a henweigh?

chris

Dec 7, 1999
On 7 Dec 1999 14:23:03 -0800, ri...@webcom.com (Rich Wales) wrote:

>Most things sold in stores are measured in metric. Some items, though,
>come in "funny" metric sizes which, on closer examination, are simply
>metric equivalents of pre-metric measurements -- e.g., 355 mL (= 12 U.S.
>fluid ounces), or 341 mL (= 12 British imperial fluid ounces).

Another area where the UK is ahead. Cans come in 330 ml/500 ml (soft
drinks); 330 ml/440 ml/500 ml (beer). Pint bottles of beer are almost
non-existent - they are now all 500 ml.

>The traditional pre-metric paper sizes (such as 8-1/2 x 11 inches) are
>still standard; metric paper sizes are virtually nonexistent.

A4 paper has been standard here since the 70s.

Dave Martindale

Dec 7, 1999