[Python-ideas] High Precision datetime


Ed Page

May 10, 2018, 2:00:45 PM5/10/18
to Python...@python.org
Greetings,
 
Is there interest in a PEP for extending time, datetime / timedelta for arbitrary or extended precision fractional seconds?
 
My company designs and manufactures scientific hardware that typically operates with nanosecond -- sometimes even attosecond -- levels of precision.  We’re in the process of providing Python APIs for some of these products and need to expose the full accuracy of the data to our customers.  Doing so would allow developers to do things like timestamp analog measurements for correlation with other events in their system, or precisely schedule a future event to interoperate correctly with other high-speed devices.
 
The API we’ve been toying with adds two new fields to time, datetime and timedelta:
- frac_seconds (int)
- frac_seconds_exponent (int or new SITimeUnit enum)
 
time.microsecond would be turned into a property that wraps frac_seconds for compatibility.
 
Challenges
- Defining the new `max` or `resolution`
- strftime / strptime.  I propose that we do nothing and leave formatting / parsing at `microseconds` precision at best.  On the other hand, __str__ could specify the fractional seconds using scientific or engineering notation.
 
Alternatives
- My company creates its own datetime library
  - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum, delorean, datetime64, pandas.Timestamp – all of which offer varying degrees of compatibility)
- Add an `attosecond` field and have `microsecond` wrap this.
  - Effectively the same, except it hard-codes `frac_seconds_exponent` to the lowest value
  - The most common cases (milliseconds, microseconds) will always pay the cost of using a bigint as compared to the proposal which is a "pay for what you use" approach
  - How do we define what is "good enough" precision?
- Continue to subdivide time by adding a `nanosecond` field that is "nanoseconds since the last microsecond", a `picosecond` field that is "picoseconds since the last nanosecond", and an `attosecond` field that is "attoseconds since the last picosecond"
  - Possibly surprising API; people might expect `picosecond` to be an offset since last second
  - Messy base 10 / base 2 conversions
- Have `frac_seconds` be a float
  - This has precision issues.
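To make the shape concrete, here is a purely hypothetical sketch of the two-field idea and the `microseconds` compatibility shim (names and semantics illustrative only, not a worked-out API):

```python
from dataclasses import dataclass

@dataclass
class FracSeconds:
    """Hypothetical two-field fractional-second value."""
    frac_seconds: int           # integer count of fractional-second units
    frac_seconds_exponent: int  # power of ten: -6 microseconds, -9 nanoseconds, -18 attoseconds

    @property
    def microseconds(self) -> int:
        # Compatibility shim: expose whole microseconds, truncating
        # any precision finer than 1e-6 seconds.
        shift = -6 - self.frac_seconds_exponent  # decimal digits finer than a microsecond
        if shift >= 0:
            return self.frac_seconds // 10 ** shift
        return self.frac_seconds * 10 ** (-shift)

# "Pay for what you use": 1500 ns stays a small int instead of a bigint
assert FracSeconds(1500, -9).microseconds == 1    # truncated to 1 us
assert FracSeconds(5, -3).microseconds == 5000    # 5 ms == 5000 us
```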
 
If anyone wants to have an impromptu BoF on the subject, I'm available at PyCon.

Thanks
Ed Page
_______________________________________________
Python-ideas mailing list
Python...@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/

David Mertz

May 10, 2018, 3:16:19 PM5/10/18
to Ed Page, Python...@python.org
This feels specialized enough to belong in a third party library. If that library can behave as transparently as possible when interacting with Python datetime, so much the better. But the need is niche enough that I don't think it belongs in the standard library.

I say this as someone who actually worked in a lab that measured MD simulations in attoseconds. I do understand the purpose.

Guido van Rossum

May 10, 2018, 3:36:29 PM5/10/18
to David Mertz, Ph.D., Python-Ideas
I have to agree with David that this seems too specialized to make room for in the stdlib.

Alexander Belopolsky

May 10, 2018, 6:14:38 PM5/10/18
to ed....@ni.com, python-ideas
> Is there interest in a PEP for extending time, datetime / timedelta for
> arbitrary or extended precision fractional seconds?

Having seen the utter disaster that similar ideas brought to numpy, I would
say: no.

On the other hand, nanoseconds are slowly making their way to the stdlib
and to add nanoseconds to datetime we only need a fully backward compatible
implementation, not even a PEP.

See <https://bugs.python.org/issue15443>.

Ethan Furman

May 10, 2018, 7:11:49 PM5/10/18
to python...@python.org
On 05/10/2018 10:30 AM, Ed Page wrote:

> Alternatives
> - My company create our own datetime library
> - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum, delorean, datetime64, pandas.Timestamp

Or, team up with one of those (if you can).

--
~Ethan~

David Mertz

May 10, 2018, 8:27:39 PM5/10/18
to Ethan Furman, python-ideas
In fairness, Pandas, datetime64, and Arrow are really the same thing. I don't know about Pendulum or Delorean. A common standard would be great, or at least strong interoperability. I'm sure the authors of those projects would want that... Arrow is entirely about interoperability, after all.

Nathaniel Smith

May 10, 2018, 10:00:34 PM5/10/18
to Ed Page, Python...@python.org
You don't mention the option of allowing time.microseconds to be a
float, and I was curious about that since if it did work, then that
might be a relatively smooth extension of the current API. The highest
value you'd store in the microseconds field is 1e6, and at values
around 1e6, double-precision floating point has precision of about
1e-10:

In [8]: 1e6 - np.nextafter(1e6, 0)
Out[8]: 1.1641532182693481e-10

So that could represent values to precision of ~0.116 femtoseconds, or
116 attoseconds. Too bad. Femtosecond precision would cover a lot of
cases, but if you really need attoseconds it won't work.
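(The same check works without numpy in Pythons new enough to have math.nextafter:

```python
import math

# One ulp just below 1e6 is 2**-33 microseconds, i.e. about
# 0.116 femtoseconds -- the same figure as the numpy check above.
gap = 1e6 - math.nextafter(1e6, 0.0)
print(gap)  # 1.1641532182693481e-10
assert gap == 2.0 ** -33
```
)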

-n

--
Nathaniel J. Smith -- https://vorpus.org

Chris Barker via Python-ideas

May 14, 2018, 12:06:47 PM5/14/18
to Alexander Belopolsky, python-ideas
On Thu, May 10, 2018 at 6:13 PM, Alexander Belopolsky <alexander....@gmail.com> wrote:
>> Is there interest in a PEP for extending time, datetime / timedelta for
>> arbitrary or extended precision fractional seconds?
>
> Having seen the utter disaster that similar ideas brought to numpy, I would
> say: no.

I'm not sure the "disaster" was due to this idea... nor, frankly, is datetime64 a disaster at all, though it's certainly far from perfect.

But my question is whether high precision timedeltas belong with "calendar time" at all.

What with UTC and leap seconds and all that, it gets pretty ugly to pin down, at second or sub-second resolution, what a given datetime really means.

If I were to work with high precision measurements, experiments, etc., I'd use a "nanoseconds since" representation, where the "epoch" would likely be the beginning of the experiment, or something else relevant.

Note that this is used in netCDF CF formats, where datetimes are expressed in things like:

"hours since 1970-01-01:00:00"

granted, it's mostly so that the values can be stored as an array of simple scalars, but it does allow precision and an epoch that are suited to the data at hand.
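A toy version of that encoding, with a made-up epoch and helper names just to show the shape:

```python
from datetime import datetime, timedelta

# CF-style "units since an epoch": plain integers in a unit suited to
# the data, relative to an epoch suited to the experiment.
EPOCH = datetime(2018, 5, 14, 9, 0, 0)  # hypothetical start of the experiment

def encode_ns(t: datetime) -> int:
    """Whole nanoseconds since EPOCH (datetime itself stops at microseconds)."""
    return round((t - EPOCH).total_seconds() * 1_000_000_000)

def decode_ns(ns: int) -> datetime:
    return EPOCH + timedelta(microseconds=ns / 1000)

t = EPOCH + timedelta(seconds=1, microseconds=250)
assert encode_ns(t) == 1_000_250_000
assert decode_ns(1_000_250_000) == t
```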

NOTE: One source of the "disaster" of numpy's datetime64 is that you can set the precision, but NOT the epoch -- which is kind of problematic if you really want femtosecond precision for something not in 1970 :-)

-CHB

On the other hand, nanoseconds are slowly making their way to the stdlib
and to add nanoseconds to datetime we only need a fully backward compatible
implementation, not even a PEP.

See <https://bugs.python.org/issue15443>.



--

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris....@noaa.gov

Chris Angelico

May 14, 2018, 12:18:10 PM5/14/18
to python-ideas
On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas
<python...@python.org> wrote:
> But my question is whether high precision timedeltas belongs with "calendar
> time" at all.
>
> What with UTC and leap seconds, and all that, it gets pretty ugly, when down
> to the second or sub-second, what a given datetime really means.

UTC and leap seconds aren't a problem. When there's a leap second, you
have 23:59:60 (or you repeat 23:59:59, if you can't handle second
#60). That's pretty straightforward, perfectly well-defined.

No, the REAL problems come from relativity.....

> If I were to work with high precision measurements, experiments, etc, I'd
> use a "nanoseconds since" representation, where the "epoch" would likely be
> the beginning of the experiment, of something relevant.

That's an unrelated form of time calculation. For that kind of thing,
you probably want to ignore calendars and use some form of monotonic
time; but also, if you want to go to (or below) nanosecond resolution,
you'll need your clock to actually be that accurate, which most likely
means you're not using a computer's clock. Femtosecond timestamping
would basically be just taking numbers given to you by an external
device and using them as sequence points - clocks and calendars become
irrelevant. The numbers might as well be frame numbers in a
super-high-speed filming of the event.

ChrisA

Chris Barker - NOAA Federal via Python-ideas

May 14, 2018, 6:35:51 PM5/14/18
to Chris Angelico, python-ideas
>
> UTC and leap seconds aren't a problem.

Of course they are a problem -- why else would they not be implemented
in datetime?

But my point is that a given datetime stamp or calculation could be off
by a second or so depending on whether and how leap seconds are
implemented.

It just doesn’t seem like a good idea to be handling months and
femtoseconds with the same “encoding”.

-CHB

David Mertz

May 14, 2018, 6:49:17 PM5/14/18
to Chris Barker, python-ideas
Chris is certainly right. A program that deals with femtosecond intervals should almost surely start by defining a "start of experiment" epoch, for which microsecond precision is fine. Then within that epoch, measured or calculated event times should be marked with monotonic integers.

I can easily see reasons why a specialized wrapped int for FemtosecondsFromStart could be useful. But that's still a specialized need for a third party library. One possible use of this class might be to interoperate with datetimes or timedeltas. Conceivably such interoperability could mean dealing with leap seconds when needed. But "experiment time" should be a simple monotonic and uniform counter.
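A minimal sketch of that wrapped-int idea -- the class name is from the paragraph above, but the interface is invented here purely for illustration:

```python
from datetime import datetime, timedelta

class FemtosecondsFromStart(int):
    """Monotonic femtosecond counter relative to an experiment epoch."""

    FS_PER_US = 10 ** 9  # femtoseconds per microsecond

    def to_datetime(self, epoch: datetime) -> datetime:
        # Interoperate with datetime, truncating precision below 1 us.
        return epoch + timedelta(microseconds=int(self) // self.FS_PER_US)

start = datetime(2018, 5, 14, 9, 0, 0)            # hypothetical experiment start
t = FemtosecondsFromStart(2_500_000_000_000_000)  # 2.5 s after start
assert t.to_datetime(start) == start + timedelta(seconds=2.5)
```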

Wes Turner

May 14, 2018, 6:57:17 PM5/14/18
to David Mertz, python-ideas
From "[Python-Dev] PEP 564: Add new time functions with nanosecond resolution" (2017-10-16 hh:mm ss[...] -Z) 

> Maybe that's why we haven't found any CTCs (closed timelike curves) yet.
>
> Aligning simulation data in context to other events may be enlightening: is there a good library for handing high precision time units in Python (and/or CFFI)?

There's not yet an ISO8601-like standard for this level of time/date precision.

Correlating particle events between experiments does require date+time.

Rob Speer

May 15, 2018, 2:23:01 PM5/15/18
to Chris Angelico, python-ideas
On Mon, 14 May 2018 at 12:17 Chris Angelico <ros...@gmail.com> wrote:
On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas
<python...@python.org> wrote:
> But my question is whether high precision timedeltas belongs with "calendar
> time" at all.
>
> What with UTC and leap seconds, and all that, it gets pretty ugly, when down
> to the second or sub-second, what a given datetime really means.

> UTC and leap seconds aren't a problem. When there's a leap second, you
> have 23:59:60 (or you repeat 23:59:59, if you can't handle second
> #60). That's pretty straight-forward, perfectly well-defined.

I'm sure that the issue of "what do you call the leap second itself" is not the problem that Chris Barker is referring to. The problem with leap seconds is that they create unpredictable differences between UTC and real elapsed time.

You can represent a timedelta of exactly 10^8 seconds, but if you add it to the current time, what should you get? What UTC time will it be in 10^8 real-time seconds? You don't know, and neither does anybody else, because you don't know how many leap seconds will occur in that time.

The ways to resolve this problem are:
(1) fudge the definition of "exactly 10^8 seconds" to disregard any leap seconds that occur in that time interval in the real world, making it not so exact anymore
(2) use TAI instead of UTC, as GPS systems do
(3) leave the relationship between time deltas and calendar time undefined, as some in this thread are suggesting
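(For what it's worth, option (1) is what the stdlib datetime already does: adding a timedelta across the 2016 leap second simply disregards it.

```python
from datetime import datetime, timedelta

# "Two minutes" spanning the 2016-12-31T23:59:60 leap second: datetime
# lands where naive calendar arithmetic says, although 121 SI seconds
# actually elapsed.
before = datetime(2016, 12, 31, 23, 59, 0)
after = before + timedelta(minutes=2)
assert after == datetime(2017, 1, 1, 0, 1, 0)
```
)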

Chris Barker via Python-ideas

May 17, 2018, 12:56:47 PM5/17/18
to Rob Speer, python-ideas
On Tue, May 15, 2018 at 11:21 AM, Rob Speer <rsp...@luminoso.com> wrote:

> I'm sure that the issue of "what do you call the leap second itself" is not the problem that Chris Barker is referring to. The problem with leap seconds is that they create unpredictable differences between UTC and real elapsed time.
>
> You can represent a timedelta of exactly 10^8 seconds, but if you add it to the current time, what should you get? What UTC time will it be in 10^8 real-time seconds? You don't know, and neither does anybody else, because you don't know how many leap seconds will occur in that time.

indeed -- even if you only care about the past, where you *could* know the leap seconds -- they are, by their very nature, of second precision -- which means right before a leap second occurs, your "time" could be off by up to a second (or a half second?)

It's kind of like using a carpenter's tape measure to locate points from an electron microscope scan :-)

The other issue with leap-seconds is that python's datetime doesn't support them :-)

And neither do most date-time libraries.

-CHB

Alexander Belopolsky

May 17, 2018, 1:15:18 PM5/17/18
to Christopher Barker, python-ideas
On Thu, May 17, 2018 at 12:56 PM Chris Barker via Python-ideas <
python...@python.org> wrote:

> The other issue with leap-seconds is that python's datetime doesn't
> support them :-)

That's not entirely true. Since the implementation of PEP 495, it is
possible to represent 23:59:60 as 23:59:59 with the "fold" bit set. Of
course, the repeated 23:59:59 will be displayed and behave exactly the same
as the first 23:59:59, but a 3rd party library can be written to take the
"fold" bit into account in temporal operations.
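For example (stdlib only):

```python
from datetime import datetime

leap = datetime(2016, 12, 31, 23, 59, 59, fold=1)  # stands in for 23:59:60
plain = datetime(2016, 12, 31, 23, 59, 59)

# fold is carried on the instance but ignored in comparisons, so a
# leap-second-aware library has to inspect it explicitly.
assert leap == plain
assert leap.fold == 1 and plain.fold == 0
```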

Chris Barker via Python-ideas

May 17, 2018, 1:37:03 PM5/17/18
to Alexander Belopolsky, python-ideas
On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky <alexander....@gmail.com> wrote:
>> The other issue with leap-seconds is that python's datetime doesn't
>> support them :-)
>
> That's not entirely true.  Since the implementation of PEP 495, it is
> possible to represent 23:59:60 as 23:59:59 with the "fold" bit set.  Of
> course, the repeated 23:59:59 will be displayed and behave exactly the same
> as the first 23:59:59, but a 3rd party library can be written to take the
> "fold" bit into account in temporal operations.

Does that support the other way -- or do we never lose a leap second anyway? (showing ignorance here)
 
But still, now datetime *could* support leap seconds (which is nice, because before, 23:59:60 was illegal, so it couldn't even be done at all), but that doesn't mean that it DOES support leap seconds....

-CHB

-- 

Alexander Belopolsky

May 17, 2018, 2:53:07 PM5/17/18
to Christopher Barker, python-ideas

On Thu, May 17, 2018 at 1:33 PM Chris Barker <chris....@noaa.gov> wrote:
>
> On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky <alexander....@gmail.com> wrote:
>>  [...] Since the implementation of PEP 495, it is

>> possible to represent the 23:59:60 as 23:59:59 with the "fold" bit set.  Of
>> course, the repeated 23:59:59 will be displayed and behave exactly the same
>> as the first 23:59:59, but a 3rd party library can be written to take the
>> "fold" bit into account in temporal operations.
>
>
> Does that support the other way -- or do we never lose a leap second anyway? (showing ignorance here)
>  

I am not sure I understand your question.  All I said was that since PEP 495, it became possible to write a pair of functions to convert between TAI and UTC timestamps without any loss of information.

For example, around the insertion of the last leap second at the end of 2016, we had the following sequence of seconds:

TAI                  | UTC   
---------------------+--------------------
2016-12-31T23:59:35  | 2016-12-31T23:59:59
2016-12-31T23:59:36  | 2016-12-31T23:59:60
2016-12-31T23:59:37  | 2016-01-01T00:00:00

this correspondence can be implemented in Python using the following datetime objects:

TAI                            | UTC   
-------------------------------+-------------------------------------------
datetime(2016,12,31,23,59,35)  | datetime(2016,12,31,23,59,59)
datetime(2016,12,31,23,59,36)  | datetime(2016,12,31,23,59,59,fold=1)
datetime(2016,12,31,23,59,37)  | datetime(2016,1,1,0,0,0)


Of course, Python will treat datetime(2016,12,31,23,59,59) and datetime(2016,12,31,23,59,59,fold=1) as equal, but you should be able to use your utc_to_tai(t) function to translate to TAI, do the arithmetic there, and translate back with the tai_to_utc(t) function.  Wherever tai_to_utc(t) returns a datetime instance with fold=1, you should add one to the seconds field before displaying (yielding :60).
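A minimal sketch of such a pair, hard-coding only this one leap second and using the illustrative offsets from the table above (a real implementation would consult the full leap-second table):

```python
from datetime import datetime, timedelta

UTC_LEAP = datetime(2016, 12, 31, 23, 59, 59)  # the UTC second that repeats
TAI_LEAP = datetime(2016, 12, 31, 23, 59, 36)  # its fold=1 counterpart in TAI

def utc_to_tai(t: datetime) -> datetime:
    # Offsets match the table above: before the inserted second the two
    # scales differ by 24 seconds, afterwards by 23.
    if t < UTC_LEAP or (t == UTC_LEAP and t.fold == 0):
        return t - timedelta(seconds=24)
    return t - timedelta(seconds=23)

def tai_to_utc(t: datetime) -> datetime:
    if t < TAI_LEAP:
        return t + timedelta(seconds=24)
    u = t + timedelta(seconds=23)
    # The inserted second maps back to 23:59:59 with fold=1.
    return u.replace(fold=1) if t == TAI_LEAP else u

assert utc_to_tai(datetime(2016, 12, 31, 23, 59, 59)) == datetime(2016, 12, 31, 23, 59, 35)
assert utc_to_tai(datetime(2016, 12, 31, 23, 59, 59, fold=1)) == datetime(2016, 12, 31, 23, 59, 36)
assert tai_to_utc(datetime(2016, 12, 31, 23, 59, 36)).fold == 1
assert tai_to_utc(datetime(2016, 12, 31, 23, 59, 37)) == datetime(2017, 1, 1, 0, 0, 0)
```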
 
> But still, now datetime *could* support leap seconds (which is nice, because before, 23:59:60 was illegal, so it couldn't even be done at all), but that doesn't mean that it DOES support leap seconds....

By the same logic, the standard library datetime does not support any local time because it does not include the timezone database.  This is where 3rd party developers should fill the gap.

Alexander Belopolsky

May 17, 2018, 3:01:31 PM5/17/18
to Christopher Barker, python-ideas
On Thu, May 17, 2018 at 2:51 PM Alexander Belopolsky <alexander....@gmail.com> wrote:

TAI                  | UTC   
---------------------+--------------------
2016-12-31T23:59:35  | 2016-12-31T23:59:59
2016-12-31T23:59:36  | 2016-12-31T23:59:60
2016-12-31T23:59:37  | 2017-01-01T00:00:00

this correspondence can be implemented in Python using the following datetime objects:

TAI                            | UTC   
-------------------------------+-------------------------------------------
datetime(2016,12,31,23,59,35)  | datetime(2016,12,31,23,59,59)
datetime(2016,12,31,23,59,36)  | datetime(2016,12,31,23,59,59,fold=1)
datetime(2016,12,31,23,59,37)  | datetime(2017,1,1,0,0,0)



Correction: 2016-01-01 in the tables I presented before should be read as 2017-01-01 and similarly for the datetime fields.

Tim Peters

May 17, 2018, 3:14:33 PM5/17/18
to Chris Barker, python-ideas
[Chris Barker]
> Does that support the other way -- or do we never lose a leap second anyway?
> (showing ignorance here)

Alexander covered the Python part of this, so I'll answer the possible
higher-level question: we haven't yet needed a "negative" leap
second, and it's considered unlikely (but not impossible) that we ever
will. That's because the Earth's rotation is inexorably slowing, so
the mean solar day inexorably lengthens when measured by SI seconds.

Other things can cause the Earth's rotation to speed up temporarily
(like some major geological events), but they've only been able to
overcome factors acting to slow rotation for brief periods, and never
yet got near to overcoming them by a full second.

Ethan Furman

May 17, 2018, 3:50:43 PM5/17/18
to python...@python.org
On 05/17/2018 12:13 PM, Tim Peters wrote:

> Other things can cause the Earth's rotation to speed up temporarily
> (like some major geological events), but they've only been able to
> overcome factors acting to slow rotation for brief periods, and never
> yet got near to overcoming them by a full second.

How long before the earth stops rotating? When it does, will we be tide-locked with the sun, or will an earth day
become an earth year?

Inquiring-minds-want-to-know'ly yrs;

--
~Ethan~

Chris Angelico

May 17, 2018, 3:55:27 PM5/17/18
to python-ideas
On Fri, May 18, 2018 at 5:53 AM, Ethan Furman <et...@stoneleaf.us> wrote:
> On 05/17/2018 12:13 PM, Tim Peters wrote:
>
>> Other things can cause the Earth's rotation to speed up temporarily
>> (like some major geological events), but they've only been able to
>> overcome factors acting to slow rotation for brief periods, and never
>> yet got near to overcoming them by a full second.
>
>
> How long before the earth stops rotating? When it does, will we be
> tide-locked with the sun, or will an earth day become an earth year?
>
> Inquiring-minds-want-to-know'ly yrs;

Won't ever happen. A few thousand years ago, the planet heard the
adage "one good turn deserves another", and interpreted it as an
infinite loop.

ChrisA

Chris Barker via Python-ideas

May 17, 2018, 4:23:55 PM5/17/18
to Tim Peters, python-ideas
now we really have gotten OT...

But thanks! that was my question!

-CHB


> Alexander covered the Python part of this, so I'll answer the possible
> higher-level question:  we haven't yet needed a "negative" leap
> second, and it's considered unlikely (but not impossible) that we ever
> will.  That's because the Earth's rotation is inexorably slowing, so
> the mean solar day inexorably lengthens when measured by SI seconds.
>
> Other things can cause the Earth's rotation to speed up temporarily
> (like some major geological events), but they've only been able to
> overcome factors acting to slow rotation for brief periods, and never
> yet got near to overcoming them by a full second.



Alexander Belopolsky

May 17, 2018, 5:01:50 PM5/17/18
to Tim Peters, python-ideas
On Thu, May 17, 2018 at 3:13 PM Tim Peters <tim.p...@gmail.com> wrote:
[Chris Barker]
> Does that support the other way -- or do we never lose a leap second anyway?
> (showing ignorance here)

> Alexander covered the Python part of this, ...

No, I did not.  I did not realize that the question was about skipping a second instead of inserting one.  Yes, regardless of whether it is possible given the physics of the Earth's rotation, negative leap seconds can be supported.  They simply become "gaps" in PEP 495 terminology.  Check out PEP 495 and read "second" whenever you see "hour". :-)

Wes Turner

May 17, 2018, 7:13:48 PM5/17/18
to Alexander Belopolsky, python-ideas
AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA) library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with AstroPy. The latest IERS-A tables ("from 1973 though one year into the future") auto-download on first use [5].

[3] "Leap second day utc2tai interpolation"

Wes Turner

May 17, 2018, 7:20:19 PM5/17/18
to Alexander Belopolsky, python-ideas

> Insertion of each UTC leap second is usually decided about six months in advance by the International Earth Rotation and Reference Systems Service (IERS), when needed to ensure that the difference between the UTC and UT1 readings will never exceed 0.9 seconds.

On Thursday, May 17, 2018, Wes Turner <wes.t...@gmail.com> wrote:
AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA) library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with AstroPy. The latest IERS-A tables ("from 1973 though one year into the future") auto-download on first use [5].

[3] "Leap second day utc2tai interpolation"


Alexander Belopolsky

May 17, 2018, 7:42:35 PM5/17/18
to wes.t...@gmail.com, python-ideas
On Thu, May 17, 2018 at 7:12 PM Wes Turner <wes.t...@gmail.com> wrote:
> AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA) library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with AstroPy. The latest IERS-A tables ("from 1973 though one year into the future") auto-download on first use [5].

I've just tried it.  Unfortunately, it does not seem to be compatible with PEP 495 datetime yet:

>>> t = astropy.time.Time('2016-12-31T23:59:60')
>>> t.to_datetime()
Traceback (most recent call last):
 ...
ValueError: Time (array(2016, dtype=int32), array(12, dtype=int32), array(31, dtype=int32), array(23, dtype=int32), array(59, dtype=int32), array(60, dtype=int32), array(0, dtype=int32)) is within a leap second but datetime does not support leap seconds 

Maybe someone can propose a feature for astropy to return datetime(2016,12,31,23,59,59,fold=1) in this case.

Nathaniel Smith

May 17, 2018, 8:10:36 PM5/17/18
to Chris Barker, python-ideas
On Thu, May 17, 2018 at 9:49 AM, Chris Barker via Python-ideas
<python...@python.org> wrote:
> On Tue, May 15, 2018 at 11:21 AM, Rob Speer <rsp...@luminoso.com> wrote:
>>
>>
>> I'm sure that the issue of "what do you call the leap second itself" is
>> not the problem that Chris Barker is referring to. The problem with leap
>> seconds is that they create unpredictable differences between UTC and real
>> elapsed time.
>>
>> You can represent a timedelta of exactly 10^8 seconds, but if you add it
>> to the current time, what should you get? What UTC time will it be in 10^8
>> real-time seconds? You don't know, and neither does anybody else, because
>> you don't know how many leap seconds will occur in that time.
>
>
> indeed -- even if you only care about the past, where you *could* know the
> leap seconds -- they are, by their very nature, of second precision -- which
> means right before leap second occurs, your "time" could be off by up to a
> second (or a half second?)

Not really. There are multiple time standards in use. Atomic clocks
count the duration of time – from their point of view, every second is
the same (modulo relativistic effects). TAI is the international
standard based on using atomic clocks to count seconds since a fixed
starting point, at mean sea level on Earth.

Another approach is to declare that each day (defined as "the time
between the sun passing directly overhead the Greenwich Observatory
twice") is 24 * 60 * 60 seconds long. This is what UT1 does. The
downside is that since the earth's rotation varies over time, this
means that the duration of a UT1 second varies from day to day in ways
that are hard to estimate precisely.

UTC is defined as a hybrid of these two approaches: it uses the same
seconds as TAI, but every once in a while we add or remove a leap
second to keep it roughly aligned with UT1. This is the time standard
that computers use the vast majority of the time. Importantly, since
we only ever add or remove an integer number of seconds, and only at
the boundary in between seconds, UTC is defined just as precisely as
TAI.

So if you're trying to measure time using UT1 then yeah, your computer
clock is wrong all the time by up to 0.9 seconds, and we don't even
know what UT1 is more precisely than ~milliseconds. Generally it gets
slightly more accurate just after a leap second, but it's not very
precise either before or after. Which is why no-one does this.

But if you're trying to measure time using UTC, then computers with
the appropriate setup (e.g. at CERN, or in HFT data centers) routinely
have clocks accurate to <1 microsecond, and leap seconds don't affect
that at all.

The datetime module still isn't appropriate for doing precise
calculations over periods long enough to include a leap second though,
e.g. Python simply doesn't know how many seconds passed between two
arbitrary UTC timestamps, even if they were in the past.
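For example, the UTC day 2016-12-31 contained 86,401 SI seconds, but:

```python
from datetime import datetime

d1 = datetime(2016, 12, 31)
d2 = datetime(2017, 1, 1)

# datetime subtraction has no leap-second table, so it reports a
# plain 86,400-second day.
assert (d2 - d1).total_seconds() == 86_400.0
```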

-n

--
Nathaniel J. Smith -- https://vorpus.org

Greg Ewing

May 18, 2018, 4:06:49 AM5/18/18
to python...@python.org
Ethan Furman wrote:
> How long before the earth stops rotating?

Apparently about 1.9 trillion years.

> When it does, will we be
> tide-locked with the sun, or will an earth day become an earth year?

Wikipedia says the main cause of the slowing is tidal effects
from the moon, so probably it would become tide-locked with the
moon and then not slow any further.

Having a month-long day ought to make our current fears about
climate change look like a lot of panic over nothing.

However, the good news is that we won't have to worry about
it. The sun will become a red giant and swallow the earth
long before then.

--
Greg

Stefan Behnel

May 18, 2018, 4:28:28 AM5/18/18
to python...@python.org
Greg Ewing wrote on 18.05.2018 at 10:05:
> Ethan Furman wrote:
>> How long before the earth stops rotating? 
>
> Apparently about 1.9 trillion years.

So, does that mean we now need to hold our breath for 1.9 British trillion
years or 1.9 American trillion years?

Assuming you were referring to the French-Latin-Arabic based numbers and
naming systems at all, that is... And anyway, what's that point doing
there, right between the "1" and the "9" ?

Stefan

Chris Angelico

May 18, 2018, 4:57:13 AM5/18/18
to python-ideas
On Fri, May 18, 2018 at 6:27 PM, Stefan Behnel <stef...@behnel.de> wrote:
> Greg Ewing schrieb am 18.05.2018 um 10:05:
>> Ethan Furman wrote:
>>> How long before the earth stops rotating?
>>
>> Apparently about 1.9 trillion years.
>
> So, does that mean we now need to hold our breath for 1.9 british trillion
> years or 1.9 american trillion years?

I'm not sure. How long will it take for people to agree on a meaning
for "trillion"?

Oh wait, that's even longer.

ChrisA

Greg Ewing

May 18, 2018, 5:30:25 AM5/18/18
to python...@python.org
Stefan Behnel wrote:
> So, does that mean we now need to hold our breath for 1.9 british trillion
> years or 1.9 american trillion years?

Seeing as the time-to-red-giant is only about 5e9 years,
I don't think it matters much either way.

--
Greg

Greg Ewing

May 18, 2018, 5:31:27 AM5/18/18
to python-ideas
Chris Angelico wrote:
> I'm not sure. How long will it take for people to agree on a meaning
> for "trillion"?

About a trillion years, I estimate. :-)

--
Greg