
java Date to c# ticks


Peter K

Mar 2, 2010, 2:09:39 AM
Hi

I have a c# application which reads data from a database. Some of the data
is a "time" which is actually a c# "ticks" value (written to the db by
another c# application).

Now I am writing a java application, which collects data and writes it to
the database for the c# application to read. So how do I convert a java
"Date" to a value which the c# application can interpret as "ticks"?


Thanks,
Peter

Frank Langelage

Mar 2, 2010, 3:04:25 AM

First you should use a more common DB schema instead of this proprietary
one.

Second you should have provided the definition of ticks:
From MSDN:
A single tick represents one hundred nanoseconds or one ten-millionth of
a second. There are 10,000 ticks in a millisecond.
The value of this property represents the number of 100-nanosecond
intervals that have elapsed since 12:00:00 midnight, January 1, 0001.


java.util.Date has a constructor public Date(long date) with date as the
number of milliseconds since January 1, 1970, 00:00:00 GMT.
So this should now be no problem for a programmer to solve.
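
Putting those two definitions together, the conversion is pure arithmetic: take
the Date's milliseconds since 1970, scale them to 100-nanosecond units, and add
the tick count of the 1970 epoch itself. A minimal sketch in Java (the offset
constant is the number of ticks from 0001-01-01 to 1970-01-01 in the proleptic
Gregorian calendar; the class and method names are made up for illustration):

import java.util.Date;

public class DotNetTicks {
    // 100 ns ticks from 0001-01-01T00:00 to 1970-01-01T00:00 (proleptic Gregorian, UTC).
    private static final long TICKS_AT_UNIX_EPOCH = 621355968000000000L;
    private static final long TICKS_PER_MILLI = 10000L;

    /** java.util.Date -> .NET System.DateTime.Ticks (UTC). */
    public static long toTicks(Date date) {
        return date.getTime() * TICKS_PER_MILLI + TICKS_AT_UNIX_EPOCH;
    }

    /** .NET Ticks (UTC) -> java.util.Date; the sub-millisecond part is truncated. */
    public static Date fromTicks(long ticks) {
        return new Date((ticks - TICKS_AT_UNIX_EPOCH) / TICKS_PER_MILLI);
    }
}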

Peter K

Mar 2, 2010, 3:44:41 AM

"Frank Langelage" <fr...@lafr.de> wrote in message
news:7v3v49...@mid.individual.net...


> On 02.03.10 08:09, Peter K wrote:
>> Hi
>>
>> I have a c# application which reads data from a database. Some of the
>> data is a "time" which is actually a c# "ticks" value (written to the db
>> by another c# application).
>>
>> Now I am writing a java application, which collects data and writes it
>> to the database for the c# application to read. So how do I convert a
>> java "Date" to a value which the c# application can interpret as "ticks"?
>>
>
> First you should use a more common DB schema instead of this proprietary
> one.

What would be a better format for the dates?

We have found that using a real date field in SQL Server 2005 gives odd
rounding issues - e.g. dates which are very close to midnight (e.g. 1 ms before
midnight) are rounded up to the next day. We don't want this behaviour, so
the c# application was programmed to use the ticks value (where no rounding
up to the next day occurs).

Using a formatted string for the date makes it harder to select a range of
dates (this is very easy with a numeric value).

What other/better possibilities are there?

Thanks,
Peter

Lothar Kimmeringer

Mar 2, 2010, 6:16:42 AM
Peter K wrote:

> Now I am writing a java application, which collects data and writes it to
> the database for the c# application to read. So how do I convert a java
> "Date" to a value which the c# application can interpret as "ticks"?

Assuming that "ticks" are using NT Time, GIYF giving you e.g.
http://support.citrix.com/article/CTX109645

[...]
How to convert Windows NT Time to UNIX Time:
Divide by 10,000,000 and subtract 11,644,473,600.
How to convert UNIX Time to Windows NT Time:
Add 11,644,473,600 and multiply by 10,000,000.
[...]
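
In Java the same arithmetic could look like this (a sketch, assuming the
1601-based NT/FILETIME epoch rather than C#'s 0001-based Ticks; the names
are illustrative):

// Seconds between 1601-01-01 and 1970-01-01.
static final long NT_TO_UNIX_SECONDS = 11644473600L;

// Windows NT time (100 ns units) -> UNIX time (seconds).
static long ntToUnixSeconds(long ntTime) {
    return ntTime / 10000000L - NT_TO_UNIX_SECONDS;
}

// UNIX time (seconds) -> Windows NT time (100 ns units).
static long unixSecondsToNt(long unixSeconds) {
    return (unixSeconds + NT_TO_UNIX_SECONDS) * 10000000L;
}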


Regards, Lothar
--
Lothar Kimmeringer E-Mail: spam...@kimmeringer.de
PGP-encrypted mails preferred (Key-ID: 0x8BC3CD81)

Always remember: The answer is forty-two, there can only be wrong
questions!

Lew

Mar 2, 2010, 11:15:13 AM
Frank Langelage wrote:
> java.util.Date has a constructor public Date(long date) with date as the
> number of milliseconds since January 1, 1970, 00:00:00 GMT.
> So this should now be no problem for a programmer to solve.

Given that the Java type to match databases is java.sql.Timestamp, and that
has a resolution of one nanosecond, it's potentially a better choice.
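
For example, a tick value can be carried through java.sql.Timestamp without
losing the sub-millisecond part (a sketch for dates on or after 1970; the
epoch-offset constant is the tick count of 1970-01-01, and the names are
illustrative):

import java.sql.Timestamp;

static Timestamp ticksToTimestamp(long ticks) {
    long sinceEpoch = ticks - 621355968000000000L;    // 100 ns ticks since 1970-01-01
    long seconds = sinceEpoch / 10000000L;            // whole seconds
    int nanos = (int) (sinceEpoch % 10000000L) * 100; // 100 ns ticks -> nanoseconds
    Timestamp ts = new Timestamp(seconds * 1000L);    // whole seconds only
    ts.setNanos(nanos);                               // fractional second at 1 ns resolution
    return ts;
}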

--
Lew

Peter K

Mar 2, 2010, 4:28:07 PM

"Lothar Kimmeringer" <news2...@kimmeringer.de> wrote in message
news:wctooysr...@kimmeringer.de...


> Peter K wrote:
>
>> Now I am writing a java application, which collects data and writes it to
>> the database for the c# application to read. So how do I convert a java
>> "Date" to a value which the c# application can interpret as "ticks"?
>
> Assuming that "ticks" are using NT Time, GIYF giving you e.g.
> http://support.citrix.com/article/CTX109645
>
> [...]
> How to convert Windows NT Time to UNIX Time:
> Divide by 10,000,000 and subtract 11,644,473,600.
> How to convert UNIX Time to Windows NT Time:
> Add 11,644,473,600 and multiply by 10,000,000.
> [...]

Thanks to all for your input.

C# (.net) ticks are based on nanoseconds since 1/1/0001.
Actually I had managed to write a satisfactory conversion routine - but
during the testing I kept getting wrong answers, which turned out to be
because I overlooked that Java months are 0-based while .net months are
1-based. (And timezones also confused the picture - most of the data is
generated in a different timezone from where I am).
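
For instance, building a known instant with java.util.Calendar runs into both
pitfalls: months are 0-based, and the zone defaults to local time unless pinned
explicitly. A sketch:

import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

Calendar cal = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
cal.clear();                         // drop the "now" fields, including sub-second leftovers
cal.set(1970, Calendar.JANUARY, 1);  // Calendar.JANUARY == 0; set(1970, 1, 1) would mean February
long millis = cal.getTimeInMillis(); // 0L, as expected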

At the moment I will keep the .net ticks as the value in the database. I
understand it is not a generally accepted date/time representation, but the
original software used this representation, and it seems the easiest to
keep. Data can be input into the database from several sources (applications
written in both c# and now in java). Data is read by a .net (c#)
application.


/Peter

Wojtek

Mar 2, 2010, 7:40:27 PM
Peter K wrote :

> And timezones also confused the picture - most of the data is generated in a
> different timezone from where I am

And another good reason to always store dates in GMT

--
Wojtek :-)


Roedy Green

Mar 2, 2010, 8:56:20 PM
On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K" <pe...@parcelvej.dk>
wrote, quoted or indirectly quoted someone who said :

>
>Now I am writing a java application, which collects data and writes it to
>the database for the c# application to read. So how do I convert a java
>"Date" to a value which the c# application can interpret as "ticks"?

There are so many definitions of "ticks".

AT ticks were in the neighbourhood of 20 ms.

Dates are just a wrapper around a long ms since 1970-01-01

You might have a look at the code in FileTimes
http://mindprod.com/products.html#FILETIMES
which interconverts between Java-ticks and MS file timestamp ticks.

Java timestamps use 64-bit milliseconds since 1970 GMT. Windows
timestamps use a 64-bit value representing the
number of 100-nanosecond intervals since January 1, 1601, with ten
thousand times as much precision.
DIFF_IN_MILLIS is the difference between January 1 1601 and January 1
1970 in milliseconds. This magic number came from
com.mindprod.common11.TestDate. Done according to Gregorian Calendar,
no correction for 1752-09-02 Wednesday was followed immediately by
1752-09-14 Thursday dropping 12 days. Also according to
http://gcc.gnu.org/ml/java-patches/2003-q1/msg00565.html

// Milliseconds between 1601-01-01 and 1970-01-01 (Gregorian).
private static final long DIFF_IN_MILLIS = 11644473600000L;

// msTime: Windows file time in 100 ns ticks since 1601; /10000 -> ms, then rebase to 1970.
long javaTime = ( msTime / 10000 ) - DIFF_IN_MILLIS;


See http://mindprod.com/jgloss/time.html
--
Roedy Green Canadian Mind Products
http://mindprod.com

The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.
~ Douglas Adams (born: 1952-03-11 died: 2001-05-11 at age: 49)

Arne Vajhøj

Mar 2, 2010, 9:00:15 PM
On 02-03-2010 20:56, Roedy Green wrote:
> On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K"<pe...@parcelvej.dk>
> wrote, quoted or indirectly quoted someone who said :
>> Now I am writing a java application, which collects data and writes it to
>> the database for the c# application to read. So how do I convert a java
>> "Date" to a value which the c# application can interpret as "ticks"?
>
> There are so many definitions of "ticks".
>
> AT ticks were in the neighbourhood of 20 ms.
>
> Dates are just a wrapper around a long ms since 1970-01-01
>
> You might have a look at the code in FileTimes
> http://mindprod.com/products.html#FILETIMES
> which interconverts between Java-ticks and MS file timestamp ticks.
>
> Java timestamps use 64-bit milliseconds since 1970 GMT. Windows
> timestamps use a 64-bit value representing the
> number of 100-nanosecond intervals since January 1, 1601, with ten
> thousand times as much precision.

There are many definitions of ticks, but the original poster
did say C#, and in that case he must mean System.DateTime.Ticks,
which is year 0001-based, not 1601-based.

Arne

Dr J R Stockton

Mar 3, 2010, 5:56:52 PM
In comp.lang.java.programmer message
<qrfro5lqhcao7vsii7gmbbb6s8mrsjhogh@4ax.com>, Tue, 2 Mar 2010 17:56:20,
Roedy Green <see_website@mindprod.com.invalid> posted:

>On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K" <pe...@parcelvej.dk>
>wrote, quoted or indirectly quoted someone who said :
>
>>
>>Now I am writing a java application, which collects data and writes it to
>>the database for the c# application to read. So how do I convert a java
>>"Date" to a value which the c# application can interpret as "ticks"?
>
> There are so many definitions of "ticks".
>
>AT ticks were in the neighbourhood of 20 ms.

Exactly 0x1800B0 per 24-hour day; just over 0x10000 per hour; about 54.9
ms.

> This magic number came from
>com.mindprod.common11.TestDate. Done according to Gregorian Calendar,
>no correction for 1752-09-02 Wednesday was followed immediately by
>1752-09-14 Thursday dropping 12 days.

Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.

--
(c) John Stockton, nr London, UK. ?@merlyn.demon.co.uk Turnpike v6.05.
Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm
Dates - miscdate.htm estrdate.htm js-dates.htm pas-time.htm critdate.htm etc.

Eric Sosman

Mar 3, 2010, 8:45:50 PM
On 3/2/2010 4:28 PM, Peter K wrote:
> [...]

> C# (.net) ticks are based on nanoseconds since 1/1/0001.

Assertion: The low-order thirty-two bits of such a value
at any given moment (NOW!) are unknown -- and unknowable.

Y'know those "Star Trek" moments where Scotty looks at the
enormous alien space ship and says "It's huge! It must be half
a mile across!" and Spock says "Zero point five eight three two
two miles, to be precise?" Spock's folly of over-precision (who
measures the alien space ship to plus-or-minus six inches?) is as
nothing compared to that of a time standard that pretends to
measure two millennia's worth of itsy-bitsy wobbles in Earth's
rotation. My claim that thirty-two bits are unknowable says
nothing more than "We don't know the history of Earth's rotation
to plus-or-minus four seconds over the last two thousand years,"
and I'll stand by the claim.

Put it this way: Can you think of ANY physical quantity that
has been measured to (let's see: 1E9 nanoseconds in a second,
86,400 seconds in a day ignoring leap seconds, 365.25 days in a
year ignoring adjustments, 2010 years, 63,430,776,000,000,000,000
nanoseconds in all) TWENTY decimal places?

Add to this the fact that light travels only ~1 foot per
nanosecond. Every mile between you and the time standard amounts
to five *micro*seconds' worth of slop ...

--
Eric Sosman
eso...@ieee-dot-org.invalid

Arne Vajhøj

Mar 3, 2010, 8:57:28 PM
On 03-03-2010 20:45, Eric Sosman wrote:
> On 3/2/2010 4:28 PM, Peter K wrote:
>> [...]
>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>
> Assertion: The low-order thirty-two bits of such a value
> at any given moment (NOW!) are unknown -- and unknowable.

It is not 1 ns unit but 100 ns units. And the low 32 bits
is around 430 seconds.

We probably do not have any measurements at 430-second accuracy
for year 1. But we do have them today. And it would be rather inconvenient
to use different units for different periods.

Arne

Eric Sosman

Mar 3, 2010, 9:21:03 PM
On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
> On 03-03-2010 20:45, Eric Sosman wrote:
>> On 3/2/2010 4:28 PM, Peter K wrote:
>>> [...]
>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>
>> Assertion: The low-order thirty-two bits of such a value
>> at any given moment (NOW!) are unknown -- and unknowable.
>
> It is not 1 ns unit but 100 ns units. And the low 32 bits
> is around 430 seconds.

Thanks for the information. I'll revise my claim: "The
low-order twenty-five bits are unknown and unknowable."

> We probably do not have any measurements at 430-second accuracy
> for year 1. But we do have them today. And it would be rather inconvenient
> to use different units for different periods.

Intervals between contemporary events can (sometimes) be
measured to nanosecond precision. In the laboratory, femtosecond
precision may be attainable. But extending the scale to longer
periods is pure fiction! Claim: You cannot measure the time
between an event at lunchtime yesterday and one at lunchtime today
with nanosecond precision. You probably can't measure it with
millisecond precision, and even one-second precision would require
a good deal of care.

Even in one single lunch hour, you cannot measure the time
between the swallow and the belch with nanosecond precision.

--
Eric Sosman
eso...@ieee-dot-org.invalid

Roedy Green

Mar 3, 2010, 11:52:42 PM
On Wed, 3 Mar 2010 22:56:52 +0000, Dr J R Stockton
<repl...@merlyn.demon.co.uk> wrote, quoted or indirectly quoted
someone who said :

>


>Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.

The full story is quite complex. Different parts of the world accepted
the Gregorian calendar at different times. There are parts of the
world today still on the Julian calendar.

BigDate works off two different definitions, the papal and the British
adoption.

Thomas Pornin

Mar 4, 2010, 8:56:11 AM
According to Eric Sosman <eso...@ieee-dot-org.invalid>:

> On 3/2/2010 4:28 PM, Peter K wrote:
> > [...]
> > C# (.net) ticks are based on nanoseconds since 1/1/0001.
>
> Assertion: The low-order thirty-two bits of such a value
> at any given moment (NOW!) are unknown -- and unknowable.

Only if you do not use the "right" definition. Nobody on January 1st, 1
AD had any notion of what a second could be, let alone a nanosecond, and
neither did they imagine that their year was to be numbered "1". That
calendar was applied retroactively, several centuries (for the year count)
or millennia (for seconds and nanoseconds) later.

The upshot is that "1/1/0001" is defined to be a past date computed back
from now using the absolute definition of the second that we use
nowadays, which has nothing to do with the rotation of the Earth. What
is unknown (with 100 ns precision) is how well that synthetic
reconstructed instant matches the position of the stars in the sky of Rome
during that night, under the rule of Augustus.


Strangely enough, we have some measurements of the average variation of
the Earth's rotation over the last two millennia, thanks to some ancient
reports of solar eclipses. The zone of a total eclipse is a narrow band, a
few dozen kilometers in width; a report of an observation of a total
eclipse at a given place yields a measure of the Earth's orientation with
a precision equivalent to two or three minutes of Earth rotation. Chinese
astronomers, in particular, were quite meticulous in writing down the
particulars of an observed eclipse. Of course this is relative to how
well we can reconstruct the eclipse parameters themselves, but it seems
that the Moon's orbital parameters, while complex, are still much easier to
extrapolate than the messy artefacts of the Earth's rotational variations.

So we _can_ pinpoint the synthetic "1/1/0001" date within the Roman
calendar framework to within a few minutes, which is much better than what
the clocks the Romans used could do.

At that point, simply imposing our notions of nanoseconds on the Romans
seems hardly unfair. And they would not have noticed anyway.


--Thomas Pornin

Lothar Kimmeringer

Mar 4, 2010, 1:14:20 PM
Eric Sosman wrote:

> Intervals between contemporary events can (sometimes) be
> measured to nanosecond precision. In the laboratory, femtosecond
> precision may be attainable. But extending the scale to longer
> periods is pure fiction! Claim: You cannot measure the time
> between an event at lunchtime yesterday and one at lunchtime today
> with nanosecond precision.

With intervals of that size, nobody will anyway. The point is that
you don't want to change data-structures depending on the
size of the interval. As well, you want to keep some kind of
reserve for the future, to avoid the problem the runtime libraries
of Borland TurboPascal had, where a cycle-counter value became
larger than the maximum value that could be represented by a Word.

> You probably can't measure it with
> millisecond precision, and even one-second precision would require
> a good deal of care.

Like with all physical measures you have an error. Assuming it
to be constant (e.g. 0.01%), an interval of 10 µs can be expected
to lie in the range of 9,999 ns to 10,001 ns, while in terms of
a day the error alone is plus or minus 9 seconds.

> Even in one single lunch hour, you cannot measure the time
> between the swallow and the belch with nanosecond precision.

Most measurements in IT I'm aware of are about the time of a
method-call, the execution time of an SQL-query, the round-
trip-time of a network request, etc. Hopefully most of them
are in the range of micro- or milliseconds, so having a data-
structure with some kind of "reserve" for the future isn't the
worst thing to have.

Arne Vajhøj

Mar 4, 2010, 10:56:05 PM

All true.

But still it is a lot easier to use the same unit for
both long and short intervals.

Arne

Eric Sosman

Mar 5, 2010, 9:14:18 AM

I've no quarrel with measuring *intervals* in tiny units.
The thing that started me ranting and foaming at the mouth was
the statement that "C# (.net) ticks are based on nanoseconds
since 1/1/0001." *That's* the association I regard as fiction,
bordering on nonsense.

I seem to have mislaid those pills the court psychiatrist
ordered me to take. Anybody know where they are? ;-)

--
Eric Sosman
eso...@ieee-dot-org.invalid

Arne Vajhøj

Mar 5, 2010, 11:04:29 AM

Nanoseconds in year 1 is absurd.

But it is not absurd to measure nanoseconds (or at least milliseconds
today).

And it is not absurd to be able to store days many years back.

And it is not absurd to use the same unit for all times.

So we have now proven that:
3 x not absurd = absurd

Arne


Peter K

Mar 5, 2010, 4:02:22 PM

"Eric Sosman" <eso...@ieee-dot-org.invalid> wrote in message
news:hmr3jt$biv$1...@news.eternal-september.org...

Yes, sorry, I mis-wrote the definition from Microsoft.

The .net DateTime structure represents dates and times ranging from 1/1/0001
to 31/12/9999. The values are measured in 100ns units called ticks.

http://msdn.microsoft.com/en-us/library/system.datetime.aspx
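
As a cross-check, the tick count of the Unix epoch can be derived in Java
rather than hard-coded (a sketch; the setGregorianChange call forces the
proleptic Gregorian rules that System.DateTime assumes, and the names are
illustrative):

import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.TimeZone;

GregorianCalendar cal = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
cal.setGregorianChange(new Date(Long.MIN_VALUE)); // proleptic Gregorian: no Julian cut-over
cal.clear();
cal.set(1, Calendar.JANUARY, 1); // 0001-01-01T00:00:00 UTC
long ticksAtUnixEpoch = -cal.getTimeInMillis() * 10000L; // 621355968000000000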

But is your quarrel that if I actually went back the billions of nanoseconds
from today's nanosecond value, I wouldn't actually end up at
1/1/0001 - due to vagaries in the Earth's orbit, spin etc?

Martin Gregorie

Mar 5, 2010, 7:26:52 PM
On Sat, 06 Mar 2010 10:02:22 +1300, Peter K wrote:

> "Eric Sosman" <eso...@ieee-dot-org.invalid> wrote in message
> news:hmr3jt$biv$1...@news.eternal-september.org...

>> On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
>>> On 03-03-2010 21:21, Eric Sosman wrote:

To say nothing of the transitions between the various calendars, which,
over the mere 2009 years in that range are probably more significant than
spin rate and orbit deviations.

--
martin@ | Martin Gregorie
gregorie. | Essex, UK
org |

John Stockton

Mar 6, 2010, 8:42:46 AM
In comp.lang.java.programmer message
<v0fuo5911s1bvrsmq...@4ax.com>, Wed, 3 Mar 2010 20:52:42,
Roedy Green <see_w...@mindprod.com.invalid> posted

>On Wed, 3 Mar 2010 22:56:52 +0000, Dr J R Stockton
><repl...@merlyn.demon.co.uk> wrote, quoted or indirectly quoted
>someone who said :
>>
>>Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.
>
>The full story is quite complex. Different parts of the world accepted
>the Gregorian calendar at different times.

Indeed. But, knowing you to be in Canada, I gave it special treatment.

It is said that mainland Nova Scotia changed from Gregorian to Julian
(sic) previous to 1752. Can you give the last Gregorian and first
Julian date at that change (one expects it to have occurred at local
midnight), from trustworthy Canadian sources?

> There are parts of the
>world today still on the Julian calendar.

I rather doubt whether any parts of the world still use it in their
daily secular life. Russian (and other) Orthodox celebrate Easter by
the Julian Calendar and the pre-1752 (for us) rules. Mount Athos maybe
uses the Greek Orthodox version, but they don't have a secular life.

>BigDate works off two different definitions, the papal and the British
>adoption.

Those are different, but (when extrapolated as necessary in a reasonably
obvious manner) give the same answers. I have seen part of the Canadian
law on Easter, but not the most interesting part. Is it on line?

--
(c) John Stockton, Surrey, UK. ?@merlyn.demon.co.uk Turnpike v6.07 IE 8.


Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm

Dates - miscdate.htm moredate.htm js-dates.htm pas-time.htm critdate.htm etc.

Lew

Mar 7, 2010, 11:37:00 PM
Arne Vajhøj wrote:
> Nanoseconds in year 1 is absurd.
>
> But it is not absurd to measure nanoseconds (or at least milliseconds
> today).
>
> And it is not absurd to be able to store days many years back.
>
> And it is not absurd to use the same unit for all times.
>
> So we have now proven that:
> 3 x not absurd = absurd

Two wrongs don't make a right.

But three lefts do.

--
Lew

Lew

Mar 7, 2010, 11:42:14 PM
Peter K wrote:
>> But is your quarrel that if I actually went back the billions of
>> nanoseconds from today's nanosecond value, I wouldn't
>> actually end up at 1/1/0001 - due to vagaries in the Earth's orbit, spin
>> etc?

Martin Gregorie wrote:
> To say nothing of the transitions between the various calendars, which,
> over the mere 2009 years in that range are probably more significant than
> spin rate and orbit deviations.

I look at such a system (100 ns "ticks" since 0001-01-01T00:00:00.00...Z) as a
"normalized" calendar/time system. Arguing that you cannot precisely measure
0001-01-01T00:00:00.00...Z as a number of ticks ago since NOW is specious;
that datetime is *defined* by being that many ticks ago from NOW. So instead
of trying to measure how many ticks ago "time zero" is, you now have the
uncertainty of measuring how great was the wobble since then.

Heisenberg. Tomayto/tomahto. Frequency/duration. Position/velocity. How
many angels can dance on the head of a pin?

(A: Depends on the caterer.)

--
Lew

Dr J R Stockton

Mar 8, 2010, 6:10:11 AM
In comp.lang.java.programmer message
<hmr3jt$biv$1...@news.eternal-september.org>, Fri, 5 Mar 2010 09:14:18,
Eric Sosman <esosman@ieee-dot-org.invalid> posted:

> I've no quarrel with measuring *intervals* in tiny units.
>The thing that started me ranting and foaming at the mouth was
>the statement that "C# (.net) ticks are based on nanoseconds
>since 1/1/0001." *That's* the association I regard as fiction,
>bordering on nonsense.

That's just due to the customary imprecision of US nerds.

Those are GMT nanoseconds (no leap seconds); from Monday 0001-01-01
00:00:00 GMT = 0 ("since Jan 1" actually means "starting Jan 2"); and
the days are of the proleptic Gregorian Calendar.

IIRC, GMT there should actually be UT, but that will be taken as a typo
for UTC.

Januarius of that year actually started on the previous day.

The Gregorian Calendar is valid perpetually from 1582-10-14, by Papal
definition. If the civil calendar is ever changed, it will no longer be
Gregorian. The proleptic extension is obvious, and has the authority of
ISO 8601.

A bad choice of start, too; arithmetic is simpler if the count starts at
AD 0000 March 1, since that follows the rarest exceptional end-of-
February.

--
(c) John Stockton, nr London, UK. ?@merlyn.demon.co.uk Turnpike v6.05.

Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm

Eric Sosman

Mar 8, 2010, 10:32:51 AM
On 3/7/2010 11:42 PM, Lew wrote:
> Peter K wrote:
>>> But is your quarrel that if I actually went back the billions of
>>> nanoseconds from the value for today's nanasecond value, I wouldn't
>>> actually end up at 1/1/0001 - due to vagaries in the Earth's orbit, spin
>>> etc?
>
> Martin Gregorie wrote:
>> To say nothing of the transitions between the various calendars,
>> which, over the mere 2009 years in that range are probably more
>> significant than spin rate and orbit deviations.
>
> I look at such a system (100 ns "ticks" since
> 0001-01-01T00:00:00.00...Z) as a "normalized" calendar/time system.
> Arguing that you cannot precisely measure 0001-01-01T00:00:00.00...Z as
> a number of ticks ago since NOW is specious; that datetime is *defined*
> by being that many ticks ago from NOW.[...]

In short, the definition is useless, useless in the sense
that one cannot use it to say what the tick count should be at
any given NOW. If you holler NOW! and consult the clocks on
Systems A and B, and the clocks disagree by five minutes, say,
can the definition help you determine which (if either) is
correct? Since the definition is circular ("The current time
is defined as the number of ticks since a moment so-and-so many
ticks ago"), the operators of A and B can *both* claim their
clocks are correct. One might just as well define the time as
the number of ticks since the Ark hit Ararat.

In "The Devil's Dictionary," Ambrose Bierce defined a magnet
as an object exerting the force of magnetism, and magnetism as the
force exerted by a magnet (noting that the paired definitions were
the distillation of innumerable scientific treatises). Circular
definitions can still be funny ("Recursion: see Recursion"), but
it seems not everyone sees the joke.

--
Eric Sosman
eso...@ieee-dot-org.invalid
