On 2012-02-23 09:00:40 +0000, Alex Mizrahi said:
> How is this relevant? It is a function you've just written, and it has
> its own convention. CL's convention is described in the glossary entry
> I've quoted, and you can check it via decode-universal-time alone, no
> need to find a combination of three function calls.
I'm trying to make three points.
Firstly, the sign of a timezone is an arbitrary choice: you can treat it
as a mapping from local time to universal time, or the other way around,
and the only difference between the two is the sign. So, under the ISO
convention, where zones to the east of UTC are positive, the timezone
maps from UTC to local time, while under CL's or POSIX's convention,
where such zones are negative, it maps from local time to UTC. Neither
is obviously better. And it's *just a sign convention*: it is entirely
trivial to convert one to the other, or to write a wrapper which
expresses one in terms of the other.
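As a sketch of what I mean (this isn't from the thread, and the name
decode-universal-time-east is made up for illustration): a wrapper that
accepts an ISO-style offset, in hours east of UTC, only has to negate it
before handing it to decode-universal-time, which expects hours west of
UTC.

  (defun decode-universal-time-east (universal-time hours-east-of-utc)
    "Like DECODE-UNIVERSAL-TIME, but the zone is given as hours east of UTC."
    (decode-universal-time universal-time (- hours-east-of-utc)))

  ;; e.g. decode the current time in UTC+2:
  ;; (decode-universal-time-east (get-universal-time) 2)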
Secondly, standards do not define whether something is right or wrong:
they just define whether something is conforming to them. As other
people have pointed out, the timezone convention in CL predates ISO
8601 (the latest plausible date for it is 1984, and it may well go back
considerably earlier than that), but even if it didn't, CL does not
purport to conform to that standard. That's fine: there are plenty of
other standards it does not conform to either.
Thirdly, if you think that one of these conventions is somehow
inherently better than the other, and even more so if you think this
matters, then some guy called Swift wrote a story about eggs which you
should read.