On Wednesday, November 23, 2016 at 11:01:18 AM UTC-6, someone wrote:
> On Wednesday, November 23, 2016 at 8:31:33 AM UTC-8, supercat wrote:
> > If the C89 Standard Committee was trying to maintain backward compatibility
> > (I believe they were) that would imply that they did not expect that
> > describing formerly-defined actions as invoking "Undefined Behavior" would
> > break programs that relied upon them. Implementations would be free, after
> > all, to keep on handling those actions in useful fashion whether or not the
> > Standard mandated such treatment.
>
> In a very real way, _nothing_ had defined behaviour before the standard,
> so claiming that formerly defined behaviour changed to undefined doesn't
> make sense. If implementations want to maintain their previous behaviour
> even in the face of it now being officially "undefined", they're free to
> do so.
Standards committees are not the only entities in the universe that
"define" things. The 1974 document specifies how a couple of particular
implementations work. Authors of other implementations were expected to
make them behave in analogous fashion as appropriate for the target
platform when practical. For target platforms that were very much like
the PDP-11, almost all behaviors could be defined as "do what the PDP-11
would do". Only when targeting platforms which were different did things
become problematic (e.g. on a PDP-11 there's no question about what (-1)<<1
should mean, but on a ones'-complement machine there would be two plausible
meanings and on a sign-magnitude machine there would be no clear meaning).
> > I think the difficulty is that the Committee's actions have been
> > reinterpreted as a decision to break code which used those behaviors.
>
> No, the standard merely describes which of those behaviours are in fact
> defined, and how.
Which of those behaviors are defined *by the Standard*. Before the Standard
was ratified, any compiler for two's-complement silent-wraparound hardware
which didn't evaluate (-1)<<1 as -2 would have been considered "broken".
Whether or not anything specifically defined the behavior of <<, the meaning
of left-shifting two's-complement values was most likely established even
before Dennis Ritchie was born [I don't know exactly when during the design
of the Atanasoff-Berry Computer the behavior of two's-complement math was
established, or if it had been established for something earlier, but it
would presumably have been established by the time work was discontinued in
1942].
I see no evidence that the authors of the Standard intended that it cause
anything whose meaning had been well established on certain kinds of
machines prior to the Standard's ratification to be regarded as less well
defined than it had been previously.
> > The
> > 1974 document (which I think represents the best definition of what the C
> > language was in 1974) clearly and unambiguously defines semantics which are
> > sometimes useful, but are not available in the language defined by today's
> > Standard.
>
> And thank goodness. Someone else has already mentioned that you seem to want
> current C to be pretty much like your C74, but I'd wager a significant
> amount of beer that very few others here share that view.
I have said that I want a language whose available semantics are a superset
of those available in C74, at least when running on platforms which share
some basic architectural features with the PDP-11 (storage consists of a
plurality of linearly-addressable octets, each integer type larger than
char is stored as two half-size chunks without padding, integer math is
performed using two's-complement hardware without overflow traps, etc.).
I wouldn't mind having to explicitly request semantics which aren't always
needed *if* there were a concise way of doing so which could be expected
to work on platforms which would naturally support the behaviors in
question. The problem is that in many situations where when the easiest
way for compilers to support such operations on platforms where they would
be practical would be to simply treat them as defined whether the Standard
required it or not, nobody saw any need to specify any alternative which
would be unambiguously defined on all platforms that supported it.
> A shirt catalogue is not a standard. Standards _do_ tend to define things
> that are "obvious". What's interesting to me is how convoluted and wordy
> those definitions often are, probably because when one looks closely, it's
> really not so "obvious" after all.
The analogous "standard" would be the fact that shirts intended for use
by normal humans will have two sleeves, one on each side. Many standards
do go out of their way to state the obvious, but the C89 Standard went
out of its way to avoid suggesting that any particular implementation
should be considered inferior to others, even when that meant refraining
from recommending things that almost all implementations could support
when practical, but which a few implementations might not be able to.
Most modern standards make use of the terms MUST, SHOULD, etc., as defined
in RFC 2119. In many cases, a lot of the value of a Standard comes not from
the MUST items, but from the SHOULD items. A standard written by someone
who wants to avoid SHOULD items will often be inferior to one written by
someone who embraces them.
> > For the "Standard" to really qualify as a standard, it should define classes
> > of things such that knowing that x is in class X and knowing that y is in
> > class Y will allow one to predict something useful about the interaction of
> > objects x and y.
>
> Um, that's exactly what the standard _does_ do; it says "if you do /this/,
> you'll get /that/ result". (And, explicitly or by implication, if you do
> anything different you're on your own.)
The Standard requires that for every conforming implementation, there must
be at least one program meeting certain criteria that the implementation
will process as described by the Standard. Nothing in the Standard would
forbid an implementation from dying with a stack overflow (and behaving
in arbitrary fashion) when fed any other program.
> > By that definition, the so-called C "Standard" is only
> > applicable to programs with constraint violations (which generally aren't
> > very useful).
>
> I don't see how that follows.
See above.
> > Recognizing optional behaviors would greatly increase the value of the
> > Standard by making it possible and practical to define two very useful
> > categories of implementations and programs and a couple of useful guarantees
> > related to them:
>
> Taken to the extreme, _any_ implementation could be called conforming as long
> as every single way in which its behaviour differs from other implementations
> was documented as an option. How is this better than no standard at all?
Under my proposed rules, feeding any Selectively Conforming program to an
implementation that was at least Minimally Conforming would result in one of
two things happening:
1. The program would function successfully.
2. The program would fail in Implementation-Defined fashion (depending
upon the directives it uses, such failure might be guaranteed to occur
at compile time, or could be deferred until run time).
A program which relies upon features which some implementations support and
others do not would be less portable than one which does not rely upon such
features, but depending upon how common the feature was it could be, for
all practical purposes, "almost" as portable. An implementation which does
not support features that are needed by many programs could be viewed as
inferior to one which does support them, but could still be very useful for
running programs that don't need such features.