
Addition of nocode keyword


Rick C. Hodgin

Oct 25, 2019, 8:26:36 AM
I'd like to propose that the nocode keyword be added to standard
C. It would convey that a block was intentionally left blank,
and was added only to allow program flow to move as it should in
certain cases.

switch (value)
{
    case 1:
    case 5:
    case 7:
    case 9:
        // All valid
        nocode;
        break;

    default:
        // Invalid
        report_error_message();
        return;
}

It would be a way for the compiler to know that empty code
blocks were intentional, and a new diagnostic could be emitted
when the compiler detects an empty code block where nocode
could have been used.

It would also aid in documentation and code validation to know
that an empty block was empty intentionally, and not simply left
un-coded by accident / mistake.

--
Rick C. Hodgin

Philipp Klaus Krause

Oct 25, 2019, 9:53:05 AM
On 25 Oct 2019 at 14:26, Rick C. Hodgin wrote:
> I'd like to propose that the nocode keyword be added to standard
> C. […]

I don't think adding a keyword to silence warnings has a chance of being
accepted. Try with an attribute instead.
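For illustration, an attribute-based spelling might look like this
(the [[nocode]] name is purely hypothetical here, shown in C23-style
attribute syntax; no such attribute exists):

```c
switch (value)
{
    case 1:
    case 5:
        [[nocode]];   /* hypothetical attribute marking an intentionally empty case */
        break;

    default:
        report_error_message();
        return;
}
```

An attribute has the advantage that implementations which do not
recognize it can ignore it, rather than rejecting a new keyword.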

Philipp

Rick C. Hodgin

Oct 25, 2019, 10:26:12 AM
It introduces a new opportunity to issue diagnostics by giving
the developer positive control over empty code blocks, whereas
today it's an unknown relegated to parsing comments in a best
case, or examining the logic in a worst case to see if it is
actually doing what it should.

With nocode, that uncertainty is replaced with certainty.

It's a good addition.

--
Rick C. Hodgin

Rick C. Hodgin

Oct 25, 2019, 2:04:08 PM
I need some help please.

Could someone give me some advice on where else to look in the C
standard, and how I should edit / extend / completely redo what
I've begun here?

----[ nocode_proposal.txt ]-----
C ????: nocode keyword

Submitter: Rick C. Hodgin
Submission Date: October 25, 2019


Summary:

A nocode keyword to be added, for use in code blocks which exist
to direct program flow but intentionally require no other code.


Justification:

In source code today, C syntax gives no indication of whether an
empty code block is intentional. A developer must seek out
comments, if present, and try to understand the author's notes.
When comments are absent, the more complex task of examining
program logic flow is required to determine whether the block
should be empty.

Adding the nocode keyword would enable the developer to know with
certainty that an empty block was purposefully left empty, and
was not a mistake or omission of intended code that was simply
forgotten / overlooked. It would also give the compiler the new
ability to generate diagnostics on empty code blocks when nocode
is missing.


Proposed changes (vs. the standard draft N2310):

Section 6.4.1 Keywords -- Add "nocode" into the list.

Section 6.8 Statements and blocks -- Add 6.8.7 nocode
statements.

    1   An empty statement placeholder may exist within any block,
        and must include the nocode keyword; otherwise a diagnostic
        will be generated (see 7.2.1.2).

    2   Use of nocode can coexist with the flow control directives
        goto, continue, and break.

    3   Example:

        if (x) {
            nocode;
        } else {
            // Other code here
        }

        switch (x)
        {
            case 1:
            case 5:
            case 7:
            case 9:
                nocode;
                break;
            default:
                // Other code here
                break;
        }

Section 6.5 Expression -- Add:

    9   Use of the nocode keyword in any expression will evaluate
        as true.

    Example:

        for (x = 0; nocode; ++x)
        {
            // Code here
        }

        do {

        } while (nocode);

Section 7.2.1 Program Diagnostics -- Add section 7.2.1.2 "The
nocode keyword" with the following explanation:

1 Synopsis:

        switch (x)
        {
            case 1:
            case 5:
                nocode;
                break;

            default:
                // Other code here
                break;
        }

    2   The nocode keyword signifies formally that the surrounding
        code block does not require any source code.

        If any non-flow-control statements are present in a block
        where nocode is used, a diagnostic will be generated.

        If any empty blocks are found without the nocode keyword
        being present, a diagnostic will be generated.

    3   The nocode keyword has no impact on generated code.  It is
        a mechanism for the developer and compiler used to identify
        cases where no other code should exist in the encapsulating
        block.
-----[ End ]-----

--
Rick C. Hodgin

Ben Bacarisse

Oct 25, 2019, 6:43:07 PM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:

> I'd like to propose that the nocode keyword be added to standard
> C. It would convey that a block was intentionally left blank,
> and was added only to allow program flow to move as it should in
> certain cases.
>
> switch (value)
> {
> case 1:
> case 5:
> case 7:
> case 9:
> // All valid
> nocode;
> break;
>
> default:
> // Invalid
> report_error_message();
> return;
> }
>
> It would be a way for the compiler to know that empty code blocks
> were intentional, and a new diagnostic could be created to emit
> when the compiler detect empty code blocks when use of nocode is
> available.

C already has several ways to write an empty code block. It also has an
empty statement. All of these are human-language agnostic ways to do
the same thing as "nocode".

> It would also aid in documentation and code validation to know
> that an empty block was empty intentionally, and not simply left
> un-coded by accident / mistake.

--
Ben.

Richard Damon

Oct 25, 2019, 9:10:13 PM
If the idea is that an empty block without nocode now requires a
diagnostic, then this is a non-starter, as it will break too much
existing code (any code that currently has an empty block).

If this is to define that the compiler will now generate a new class of
'diagnostic' (at least by the standard) that is a warning, I think again
that is a non-starter, as the Standard has no concept of required
warnings. ALL required diagnostics (as far as I know) allow the
implementation to fail to translate the code, and if it does translate
it, the results are undefined; so in effect they are errors, or
notifications that an extension has been used.

A fall-through attribute was defined (though that might have been in
C++) to suppress a common warning, which I suppose is what you might be
modeling the idea on. But while a LOT of implementations warn on
possibly unintended switch fall-through, I am not sure if any warn about
empty statements by default, so you have a higher bar to pass to get
something like this in. It is also, in my opinion, a much rarer form of
error. Since many other languages do 'automatic breaks' in their
equivalent of switch statements, a missing break isn't that obvious.
Forgetting to put in the actual code to do what you want in a case seems
much harder (unless you are just slapping together some crude framework,
but even then you should be adding TODO comments to point to what is
missing).

Rick C. Hodgin

Oct 25, 2019, 10:02:43 PM
On Friday, October 25, 2019 at 9:10:13 PM UTC-4, Richard Damon wrote:
> If the idea is that now an empty block without nocode requires a
> diagnostic, then this is a non-starter as it will break too much
> existing code (any code that currently has an empty block)

I would assume it would roll out in stages, but I think it should
be defined that way in the standard. It would not be an issued
diagnostic in the first stage, unless an option was turned on to
enable it. It would be a warning-level diagnostic in a later
release, and then finally an error in a future one. It would give
people time to update their code.

> If this is to define that the compiler will now generate a new class of
> 'diagnostic' (at least by the standard) that is a warning, I think again
> that is a non-started as the Standard has no concept of required
> warnings, ALL required diagnostics (as far as I know) allow the
> implementation to fail to translate the code, and if it does, the
> results are undefined, so in effect, they are errors or notifications
> that an extension has been used.

The diagnostic severity would be up to the compiler authors and
implementers.

The root purpose is two-fold, actually: documentation, and the
removal of code ambiguity.

> A fall throught attribute was defined (though that might have been in
> C++) to suppress a common warning, which I suppose is what you might be
> modling the idea on, but a LOT of implementations warn on possibly
> unintended switch fall through, but I am not sure if any warn about
> empty statements by default, so you have a higher bar to pass to get
> something like this in. It is also, in my opinion a much rarer form of
> error. Since many other languages do 'automatic breaks' in their
> equivalent of switch statements, it missing isn't that obvious.
> Forgetting to put the actual code to do what you want in that case seems
> much harder (unless you are just slapping together some crude framework,
> but even then you should be adding the TODO comments to point to what is
> missing.

My position is that the language needs the deterministic ability
to distinguish intentionally empty code blocks from unintentionally
empty ones. Without that ability, a big question is left open
whenever an empty block is encountered, which, as you state,
is likely in a lot of code.

--
Rick C. Hodgin

Rick C. Hodgin

Oct 25, 2019, 10:20:19 PM
On 10/25/2019 6:42 PM, Ben Bacarisse wrote:
> C already ... has an empty statement. All of these are human-
> language agnostic ways to do the same thing as "nocode".

I'm not aware of them. Can you teach me?

In C today, how would you deterministically identify or signify
that this block of code's logic test was intentionally left empty?

for (x = 0; ; ++x)
{
    // Code here
}

Or this one:

for (i = 0; ; )
{
}

How do you know the empty portions weren't left out by mistake?
Without nocode or something like it, there's no way for the
compiler to know. There's no way for another developer to know
either without something like this:

for (x = 0; /*no test here*/; ++x)
{
    // Code here
}

But what if in this online world we get some code from some
foreign language? Rely on Google Translate to convey what it
means?

for (x = 0; /*لا يوجد اختبار هنا*/; ++x)
{
    // Code here
}

There's no way to know with certainty, and that's the issue. It
is a simple addition. It generates no new code. It provides
potentially very useful information. And it solidifies and
clarifies code, something modern compiler efforts have sought to
do with great vehemence, preventing "const char assignments" from
being passed to char * functions, for example.

So much belt tightening in our code bases moving forward. The
time for nocode is here.

>> It would also aid in documentation and code validation to know
>> that an empty block was empty intentionally, and not simply left
>> un-coded by accident / mistake.


--
Rick C. Hodgin

Philipp Klaus Krause

Oct 26, 2019, 12:44:25 PM
On 26 Oct 2019 at 03:10, Richard Damon wrote:
> A fall throught attribute was defined (though that might have been in
> C++)

It has been proposed to C, too. It was to be voted on yesterday
(http://www.open-std.org/jtc1/sc22/wg14/www/docs/n2437.htm). I don't
know the result yet.

Philipp

Ben Bacarisse

Oct 26, 2019, 5:52:09 PM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:

> On 10/25/2019 6:42 PM, Ben Bacarisse wrote:
>> C already ... has an empty statement. All of these are human-
>> language agnostic ways to do the same thing as "nocode".
>
> I'm not aware of them. Can you teach me?

I am sure you know them; I must simply not have been clear.

The expression in an expression statement is optional, as are the
contents of a compound statement, so

    ;
    {}
    {;}
    {;;}

and so on (with optional comments) are all increasingly insistent ways
of saying that no code was really, really what was intended.

> In C today, how would you deterministically identify or signify
> that this block of code's logic test was intentionally left empty?
>
> for (x = 0; ; ++x)
> {
> // Code here
> }

I don't understand the question. The best way to show intent to a
reader (when it really is in doubt) is with a comment, and the best was
to show intent to a compiler is by writing code with the intended
meaning. Maybe I don't know what "deterministically identify" means.

If you are talking about giving permission for a compiler to issue a
warning for the above then it already has it. It could, perfectly
legally, say something like "Did you intend to leave the condition
empty? To silence this warning, write an explicitly always true
expression.".

> Or this one:
>
> for (i = 0; ; )
> {
> }
>
> How do you know the empty portions weren't left out by mistake?

The surest way for me to know is to understand the code. Have you come
across an example where such a thing is an accident and yet it's not
obviously wrong? I feel (without evidence) that any code like that will
be bad code for all sorts of other, more significant, reasons so I'd
like to see an example.

> Without nocode or something like it, there's no way for the com-
> piler to know.

See above. The compiler can be suspicious about anything it likes and
can warn the programmer along with offering advice about what will make
it (the compiler) shut up. You could even document a special meaning
for an option: -Waccidentally-empty that turns on these warnings while at
the same time behaving like -Dnocode. You could then write

for (i = 0; nocode; nocode) nocode;

and get no warnings even when compiled with -Waccidentally-empty.

> There's no way for another developer to know
> either without something like this:
>
> for (x = 0; /*no test here*/; ++x)
> {
> // Code here
> }

I would hope there is! Do you have an actual example where it is not
obvious that the empty test was intended? If you do, at the very least,
the code needs a comment which probably should explain more than your
example comment does.

> But what if in this online world we get some code from some
> foreign language? Rely on Google Translate to convey what it
> means?
>
> for (x = 0; /*لا يوجد اختبار هنا*/; ++x)
> {
> // Code here
> }
>
> There's no way to know with certainty, and that's the issue.

Unless you are proposing a breaking change by /insisting/ on something
compiler-checkable, then I don't see how your suggestion helps for code
written by anyone else. Other people will always be able to write the
form you object to, and they can currently write explicit always-true
expressions and explicitly empty code (of any degree of complexity) if
they feel the intent is not clear. And if they don't think they need to
make themselves extra clear, they probably won't use any new keywords
either.

--
Ben.

Ben Bacarisse

Oct 26, 2019, 6:27:02 PM
Ben Bacarisse <ben.u...@bsb.me.uk> writes:

> ... The best way to show intent to a
> reader (when it really is in doubt) is with a comment, and the best was
s/was/way/
> to show intent to a compiler is by writing code with the intended
> meaning.

I don't usually correct my already-posted typos (when I see them), but
this one is open to more misinterpretation than many.

--
Ben.

Rick C. Hodgin

Oct 26, 2019, 8:19:17 PM
On Saturday, October 26, 2019 at 5:52:09 PM UTC-4, Ben Bacarisse wrote:
> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
>
> > On 10/25/2019 6:42 PM, Ben Bacarisse wrote:
> >> C already ... has an empty statement. All of these are human-
> >> language agnostic ways to do the same thing as "nocode".
> >
> > I'm not aware of them. Can you teach me?
>
> I am sure you know them; I must have simply not have been clear.
>
> The expression in an expression statement is optional, as are the
> contents of a compound statement, so
>
> ;
> {}
> {;}
> {;;}
>
> and so on (with optional comments) are all increasing insistent ways of
> saying that no code was really, really what was intended.


Understood. Thank you.

--
Rick C. Hodgin

Jakob Bohm

Oct 28, 2019, 5:42:19 AM
For clarity, a number of existing compilers have the following known
warning behavior:

In places where empty statements are likely to be typos etc., they warn
about an actual empty statement (such as a lone semicolon), but accept
an empty block as an explicit request that nothing be done, without that
warning.

Typical examples include:

    if (x > 3) ;    // Warns about possible typo
        puts("x is greater than 3");

    if (x > 3) {}   // No warning
    puts("Hello world");

In general, it would be useful to supplement the C standard with some
statements of best practice for implementations, such as:

* Warn about empty controlled statements
* Diagnostics should point to the exact code location and mention
the items involved.
* Diagnostics involving include files should state both the include
file and where it was included.
* Diagnostics involving macros should state both the location of the
invocation and the specific location within the macro definition.
* Diagnostics involving continuation lines should point to actual
source file lines, not abstract decoded lines.
* Unless otherwise stated, basic types should expose whatever behavior
above and beyond the language standard that is provided by the actual
machine definition. (For example, unless otherwise stated, if the
machine is one that is defined to handle integer overflow in a specific
way, that behavior should be implementation defined to be that machine
behavior, not "anything goes" undefined).
* With very few clearly implementation defined cases, "undefined
behavior" is not a license to introduce semantics that bear no
resemblance to the source code. (A typical implementation defined case
would be a function-like keyword that explicitly tells the compiler to
ignore certain cases in its interpretation of the program). It can
however be a license to arbitrarily choose between different meaningful
interpretations, such as performing memory accesses in either of
multiple permitted orders or optimizing an expression in ways that are
not correct for invalid inputs, provided this does not trigger mistaken
optimization assumptions outside the expression.



Enjoy

Jakob
--
Jakob Bohm, CIO, Partner, WiseMo A/S. https://www.wisemo.com
Transformervej 29, 2860 Søborg, Denmark. Direct +45 31 13 16 10
This public discussion message is non-binding and may contain errors.
WiseMo - Remote Service Management for PCs, Phones and Embedded

Rick C. Hodgin

Oct 28, 2019, 9:13:23 PM
I agree on all points.

I believe nocode is a significant and necessary feature that
should be added to C, but I'm done trying to work within the
C Standard oversight body. I've tried and tried and tried
and it's not worth it.

It's easier to write your own language that is backwards compatible
with C and be done with it.

--
Rick C. Hodgin

David Spencer

Oct 28, 2019, 10:55:20 PM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:

>I believe nocode is a significant and necessary feature that
>should be added to C, but I'm done trying to work within the
>C Standard oversight body. I've tried and tried and tried
>and it's not worth it.

>It's easier to write your own language that is backwards com-
>patible with C and be done with it.

This would not be backward compatible with C, which is the main reason
that it would never get consideration past reading the first line of
the submission.

--
dhs spe...@panix.com

Richard Damon

Oct 28, 2019, 11:45:48 PM
On 10/25/19 10:02 PM, Rick C. Hodgin wrote:
> On Friday, October 25, 2019 at 9:10:13 PM UTC-4, Richard Damon wrote:
>> If the idea is that now an empty block without nocode requires a
>> diagnostic, then this is a non-starter as it will break too much
>> existing code (any code that currently has an empty block)
>
> I would assume it would roll out in stages, but I think it should
> be defined that way in the standard. It would not be an issued
> diagnostic in the first stage, unless an option was turned on to
> enable it. It would be a warning-level diagnostic in a later re-
> lease, and then finally an error in a future one. It would give
> people time to update their code.

To the Standard, there is no such thing as a 'Warning Level Diagnostic'.
All required diagnostics result in the program's behavior being
undefined (if a program is created at all), so they are effectively
'errors', but an implementation is able to define the behavior and thus
provide an extension. To add the concept of a Warning to the standard
would be a major change and significant work.

Because it breaks valid programs using what can be a useful feature, it
also has a very high hurdle to pass: it must be shown to add something
of great value, which I don't see. I will say that personally, this
isn't the sort of error I see being made often.

>
>> If this is to define that the compiler will now generate a new class of
>> 'diagnostic' (at least by the standard) that is a warning, I think again
>> that is a non-started as the Standard has no concept of required
>> warnings, ALL required diagnostics (as far as I know) allow the
>> implementation to fail to translate the code, and if it does, the
>> results are undefined, so in effect, they are errors or notifications
>> that an extension has been used.
>
> The diagnostic severity would be up to the compiler authors and
> implementers.
>
> The root purpose is two-fold actually: documentation, and the re-
> moval of code ambiguity.

This sounds like an ideal sort of thing for a compiler to implement as
an extension (it almost doesn't need to be a real extension if you spell
the keyword as something like // nocode or /* nocode */, like gcc does
with // nobreak, since a comment is perfectly fine to have with an
empty statement).
>
>> A fall throught attribute was defined (though that might have been in
>> C++) to suppress a common warning, which I suppose is what you might be
>> modling the idea on, but a LOT of implementations warn on possibly
>> unintended switch fall through, but I am not sure if any warn about
>> empty statements by default, so you have a higher bar to pass to get
>> something like this in. It is also, in my opinion a much rarer form of
>> error. Since many other languages do 'automatic breaks' in their
>> equivalent of switch statements, it missing isn't that obvious.
>> Forgetting to put the actual code to do what you want in that case seems
>> much harder (unless you are just slapping together some crude framework,
>> but even then you should be adding the TODO comments to point to what is
>> missing.
>
> My position is the language needs the deterministic ability to
> identify intentionally empty code blocks, and non-intentionally
> empty code blocks. Without that ability, it leaves a big question
> in cases where an empty block is encountered, which, as you state,
> is likely in a lot of code.
>

WHY is this an ambiguity any different from any other case of whether
the programmer wrote what they meant or not?

There is nothing in the language that makes an empty code block
'ambiguous'; its meaning is very clear.

If anything like this would make sense to add, it would be making
spacing significant, to catch something like:

    if (x < y)
        x = x + 1;
        y = y + 1;
    /* some other code */

Here, the line y = y + 1 is indented like it is supposed to be
controlled by the if, but since it isn't inside a compound statement, it
isn't. This is something that some implementations will warn about, and
it could be argued that it is bad enough style that it is worth making
it illegal, but I really doubt that will happen.


As an aside, I will remind you that you have commented that you find
the Standard impossible to read and understand. That condition probably
makes you a poor choice

Rick C. Hodgin

Oct 29, 2019, 7:57:21 AM
Hi David. Consider:

(x)
+----->>> New language
|
C ------+----->>> C

At point (x), they would still be the same language.

The new language would be similar to having a C compiler with some
custom extensions.

--
Rick C. Hodgin

Rick C. Hodgin

Oct 29, 2019, 8:14:16 AM
On 10/28/2019 11:45 PM, Richard Damon wrote:
> On 10/25/19 10:02 PM, Rick C. Hodgin wrote:
>> On Friday, October 25, 2019 at 9:10:13 PM UTC-4, Richard Damon wrote:
>>> If the idea is that now an empty block without nocode requires a
>>> diagnostic, then this is a non-starter as it will break too much
>>> existing code (any code that currently has an empty block)
>>
>> I would assume it would roll out in stages, but I think it should
>> be defined that way in the standard. It would not be an issued
>> diagnostic in the first stage, unless an option was turned on to
>> enable it. It would be a warning-level diagnostic in a later re-
>> lease, and then finally an error in a future one. It would give
>> people time to update their code.
>
> To the Standard, there is no such thing as a 'Warning Level Diagnostic'.

Section 5.1.1.3 of N2310, page 23 in the draft I have (page 10 in
the numbers printed on the page), acknowledges that real-world
implementations of the C Standard do have warnings:

    9) An implementation is encouraged to identify the nature of,
    and where possible localize, each violation. Of course, an
    implementation is free to produce any number of diagnostic
    messages, often referred to as warnings, as long as a valid
    program is still correctly translated. It can also successfully
    translate an invalid program. Annex I lists a few of the more
    common warnings.

And then in Annex I, page 418 in the draft I have (page 405 in the
numbers printed on the page), it states:

    1) An implementation may generate warnings in many situations,
    none of which are specified as part of this document. The
    following are a few of the more common situations.

So there is an acknowledgement that warnings exist, and are even to
be expected. They're just not formally defined.

> All required diagnostics result in the program's behavior being
> undefined, (if a program is created) so are effectively 'errors', but an
> implementation is able to define the behavior and thus provide an
> extension. To add the concept of a Warning to the standard would be a
> major change and significant work.
>
> Because it breaks valid programs using what can be a useful feature, it
> also has a very high hurdle to pass to show it add something of great
> value, which I don't see. I will say that personally, this isn't the
> sort of error I see being done often.

We have:

    void function_name(void);

Why not just this:

    function_name;

There's a reason to include an explicit conveyance of the
non-existence of something. Zero is one of the most powerful
number concepts we have.

>>> If this is to define that the compiler will now generate a new class of
>>> 'diagnostic' (at least by the standard) that is a warning, I think again
>>> that is a non-started as the Standard has no concept of required
>>> warnings, ALL required diagnostics (as far as I know) allow the
>>> implementation to fail to translate the code, and if it does, the
>>> results are undefined, so in effect, they are errors or notifications
>>> that an extension has been used.
>>
>> The diagnostic severity would be up to the compiler authors and
>> implementers.
>>
>> The root purpose is two-fold actually: documentation, and the re-
>> moval of code ambiguity.
>
> This sounds like an ideal sort of thing for a compiler to implement as
> an extension (almost doesn't need to be an real extension if you spell
> the keyword as something like // nocode or /* nocode */ like gcc does
> with // nobreak since a comment is a perfectly fine to have with an
> empty statement.

It can be implemented today:

    #define nocode

And then used wherever. But it won't generate diagnostics, and that's
the biggest part of the equation that's missing without having direct
compiler-specified support.

>>> A fall throught attribute was defined (though that might have been in
>>> C++) to suppress a common warning, which I suppose is what you might be
>>> modling the idea on, but a LOT of implementations warn on possibly
>>> unintended switch fall through, but I am not sure if any warn about
>>> empty statements by default, so you have a higher bar to pass to get
>>> something like this in. It is also, in my opinion a much rarer form of
>>> error. Since many other languages do 'automatic breaks' in their
>>> equivalent of switch statements, it missing isn't that obvious.
>>> Forgetting to put the actual code to do what you want in that case seems
>>> much harder (unless you are just slapping together some crude framework,
>>> but even then you should be adding the TODO comments to point to what is
>>> missing.
>>
>> My position is the language needs the deterministic ability to
>> identify intentionally empty code blocks, and non-intentionally
>> empty code blocks. Without that ability, it leaves a big question
>> in cases where an empty block is encountered, which, as you state,
>> is likely in a lot of code.
>
> WHY is this an ambiguity that is any different than any other case of
> did the programmer right what the meant or not?

I think all cases of ambiguity should be addressed.

> THere is nothing in the language that makes an empty code block
> 'ambiguous', its meaning is very clear.

It is clear to the compiler, but it leaves the open question: was
the intention to not have code there? Or was it simply forgotten?

We need the ability to know explicitly.

> If anything like this would make sense adding, it would be making
> spacing significant to make something like:
>
> if(x < y)
> x = x + 1;
> y = y + 1;
> some other code

I agree that there should be an option to enforce a defined type of
syntax spacing, but we would not agree on what it is. IIRC, ANSI
uses 4-wide spacing for tabs. GNU uses 8. Some have braces
indented, others don't. I block off sections of code with meaning:

------
    code_above();       // Followed by a triple-space


    //////////
    // Explanation here
    //////
        // Comments for each related sub-block
        indented_code_here();
        relating_to_the_explanation();

        // Double-spaced comments go here
        and_more_code();
        goes_here();    // Followed by a triple-space


    // New comments for the next sub-block that doesn't need an
    // outer encapsulating block
    code_continues_here();
------

I do that primarily to address my dyslexia, so that a different
part of my brain categorizes code so I'm not constantly reading
only to understand things.

> Here, the line y = y + 1 is indented like it is supposed to be
> controlled by the if, but since it isn't inside a compond statement, it
> isn't. This is something that some implementatios will warn about, and
> it could be argued that it is bad enough style that it is worth making
> it illegal, but I really doubt that will happen.

If the spacing parameters could all be programmable, as many IDEs add
such features to allow custom formatting of source code, then I would
agree. The compiler could use those rules and ensure that the code
was not only syntactically correct, but also contained proper spacing
by its own standard.

> As an aside, I will remind you that you have commented that you find
> Standard impossible to read and understand. That condition probably
> makes you a poor choice

Only because of my dyslexia. If someone would sit down with me and
explain it to me in speech, where I could interact a bit and ask
questions, it would not be an issue. It's just worse than reading
stereo instructions in its present form. I do not comprehend things
well written like that. It's like a jumbled mass of confusion to me
when I read it. It's actually very difficult for me to read normal
text as well, like these posts here on Usenet. I have to really
really concentrate, and it wears me out if I do too much of it.

--
Rick C. Hodgin

Rick C. Hodgin

Oct 29, 2019, 8:41:15 AM
On 10/29/2019 8:15 AM, Rick C. Hodgin wrote:
> I agree that there should be an option to enforce a defined type of
> syntax spacing, but we would not agree on what it is.  IIRC, ANSI
> uses 4-wide spacing for tabs.  GNU uses 8.  Some have braces indent-
> ed, others don't.  I block off sections of code with meaning:
>
> ------
>     code_above();       // Followed by a triple-space
>
>
>     //////////
>     // Explanation here
>     //////
>         // Comments for each related sub-block
>         indented_code_here();
>         relating_to_the_explanation();
>
>         // Double-spaced comments go here
>         and_more_code();
>         goes_here();    // Followed by a triple-space
>
>
>     // New comments for the next sub-block that doesn't need an
>     // outer encapsulating block
>     code_continues_here();
> ------
>
> I do that primarily to address my dyslexia, so that a different
> part of my brain categorizes code so I'm not constantly reading
> only to understand things.

This need to indent is so central to my ability to easily read and
maintain even my own code that I added a new language feature to
CAlive: the ||| and |||| operators.

||| serves as an "ignore me" operator. It has no effect on the code;
it exists only as a visual marker designed to catch the eye.

|||| is a line comment. They're used together in code like this:

------
code_above(); // Followed by a double-space

||||||||||
|||| Explanation here
|||| Maybe even a multi-line explanation
||||||
||||
||| // Comments for each related sub-block
||| indented_code_here();
||| relating_to_the_explanation();
|||
||| // Double-spaced comments go here
||| and_more_code();
||| goes_here(); // Followed by a double-space
||||
||||||

// New comments for the next sub-block that doesn't need an
// outer encapsulating block
code_continues_here();
------

It allows the block to be visually identified easily in source
code. All code within the block relates to it. It can be
collapsed with the explanation portion remaining so you know
what that block of code actually did as by description. It
also introduces a new ability within the debugger, to single-
step through code by those blocks. And with the use of an IDE,
starts and stops for each block can be flagged, and it does the
manual labor of typing in the |||| and ||| for you as needed.
You just type code like normal.

And in the IDE, the presentation can be different. Rather
than showing ||| characters everywhere which can be cluttered
to some people, it can be made into a vertical line, or enclosed
in a rectangle, or have the background color changed, or a fade-
in color from the left to the right to mark it off subtly, but
still visually, or in other ways.

In any event ... I've tried with C. There are several features
I desired to see added. I'm relegated to adding them to my own
language. And I'm okay with that. It's the only way to have
real control over a thing anyway.

CAlive. It will provide desirable advancements, and be THE way
to keep the C language alive in moving forward, unless they do
succumb and start implementing various of CAlive's features.

--
Rick C. Hodgin

David Brown

Oct 29, 2019, 9:32:12 AM
On 28/10/2019 10:42, Jakob Bohm wrote:

>
> For clarity, a number of existing compilers have the following known
> warning behavior:
>
> In places where empty statements are likely to be typos etc., they warn
> about an actual empty statement (such as a lone semicolon), but accept
> an empty block as an explicit request that nothing be done, without that
> warning.
>
> Typical examples include:
>
>    if (x > 3) ; // Warns about possible typo
>       puts("x is greater than 3");
>    if (x > 3) {} // No warning
>    puts("Hello world");
>

Good tools try to warn about potentially unintentional code or
behaviour. It is not always easy to avoid false positives, however.
That is why more advanced tools (linters) often use stylised comments as
additional information.
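The lone-semicolon pitfall in the quoted snippet can be made concrete. This is a minimal sketch (the helper names are hypothetical, introduced only for illustration): with a stray `;`, the `if` controls an empty statement, so the indented line beneath it runs unconditionally.

```c
#include <assert.h>

/* Sketch of the lone-semicolon bug: the `;` is the entire controlled
   statement, so the indented line below runs unconditionally. */
static int buggy_check(int x)
{
    int printed = 0;
    if (x > 3) ;        /* lone semicolon: the if controls nothing */
        printed = 1;    /* always executes, despite the indentation */
    return printed;
}

/* The intended logic, written with an explicit block. */
static int fixed_check(int x)
{
    int printed = 0;
    if (x > 3) {
        printed = 1;
    }
    return printed;
}
```

Compilers that implement the warning described above (e.g. via options like GCC's `-Wempty-body`) flag the first form while accepting the second.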

> In general, it would be useful to supplement the C standard with some
> statements of best practice for implementations, such as:

When you say "supplement the C standard", do you mean you think these
should be added to the C standards (in which case I disagree entirely),
or that they should be in a supplemental document that can be used
along with the C standards (in which case I mostly disagree) ?

Pretty much everything you suggest below is in the category of "quality
of implementation" - they are features that a compiler can support in
order to be a more useful development tool for users. They are not
about C coding, or the functional behaviour of the code - and thus not
something that should be part of the C standards. They are merely
useful warnings and diagnostic information - and can be provided by
non-compiler tools (linters, or even IDE's), and could be a significant
implementation burden for small compilers.

>
> * Warn about empty controlled statements
> * Diagnostics should point to the exact code location and mention
> the items involved.
> * Diagnostics involving include files should state both the include
> file and where it was included.
> * Diagnostics involving macros should state both the location of the
> invocation and the location of the specific location within the macro
> definition.
> * Diagnostics involving continuation lines should point to actual
> source file lines, not abstract decoded lines.

These are all just compiler warnings.


> * Unless otherwise stated, basic types should expose whatever behavior
> above and beyond the language standard that is provided by the actual
> machine definition. (For example, unless otherwise stated, if the
> machine is one that is defined to handle integer overflow in a specific
> way, that behavior should be implementation defined to be that machine
> behavior, not "anything goes" undefined).

This is a terrible idea in terms of portability of C code, and
optimisation of C code.

If a compiler wants to provide additional documented semantics for
things that the C standards do not define, such as integer overflow
behaviour, then it can do so. Even better, compilers can provide
choices to control this behaviour (I would strongly prefer such choices
to be made in the code, such as by pragmas, rather than command-line
switches).

But there are many disadvantages in having such extra behaviour - it
reduces optimisation opportunities, it reduces static error checking,
and it reduces run-time diagnostics and checks. And of course it
encourages code to be unnecessarily tied to specific platforms.

I can appreciate that some people want specific behaviour for integer
overflow - I am convinced that many of these do not appreciate the
negative consequences of that behaviour. And many people specifically
do not want such behaviour. So it certainly should not be mandated by
the standards.


> * With very few clearly implementation defined cases, "undefined
> behavior" is not a license to introduce semantics that bear no
> resemblance to the source code. (A typical implementation defined case
> would be a function-like keyword that explicitly tells the compiler to
> ignore certain cases in its interpretation of the program). It can
> however be a license to arbitrarily choose between different meaningful
> interpretations, such as performing memory accesses in either of
> multiple permitted orders or optimizing an expression in ways that are
> not correct for invalid inputs, provided this does not trigger mistaken
> optimization assumptions outside the expression.
>

Undefined behaviour has, by its definition, no meaningful behaviour. In
some cases it might be possible to guess what the programmer intended,
though this can be very difficult to do in a deterministic way in a
compiler - what is "obvious" to a human reader in a simple sample case
may be very much harder to spot in general cases in a tool.

It is reasonable to want a compiler that does not knowingly exacerbate
problems due to undefined behaviour - that is a quality of
implementation issue. But it is unreasonable (indeed, impossible) to
attempt to legislate this in general.

There may be scope for changing certain /specific/ cases of undefined
behaviour in the standards into fully defined or implementation specific
behaviour. But it would be for specific cases only, not as a
generalisation.

Jakob Bohm

Oct 29, 2019, 12:41:07 PM
On 29/10/2019 14:32, David Brown wrote:
> On 28/10/2019 10:42, Jakob Bohm wrote:
>
>>
>> For clarity, a number of existing compilers have the following known
>> warning behavior:
>>
>> In places where empty statements are likely to be typos etc., they warn
>> about an actual empty statement (such as a lone semicolon), but accept
>> an empty block as an explicit request that nothing be done, without that
>> warning.
>>
>> Typical examples include:
>>
>>    if (x > 3) ; // Warns about possible typo
>>       puts("x is greater than 3");
>>    if (x > 3) {} // No warning
>>    puts("Hello world");
>>
>

Note: The above was an example of actual good behavior by some actual
current compilers, given as a counterexample to the proposal for a
"nocode" keyword.

> Good tools try to warn about potentially unintentional code or
> behaviour. It is not always easy to avoid false positives, however.
> That is why more advanced tools (linters) often use stylised comments as
> additional information.
>
>> In general, it would be useful to supplement the C standard with some
>> statements of best practice for implementations, such as:
>
> When you say "supplement the C standard", do you mean you think these
> should be added to the C standards (in which case I disagree entirely),
> or that they should be in a supplemental document that can be used
> along with the C standards (in which case I mostly disagree) ?
>
> Pretty much everything you suggest below is in the category of "quality
> of implementation" - they are features that a compiler can support in
> order to be a more useful development tool for users. They are not
> about C coding, or the functional behaviour of the code - and thus not
> something that should be part of the C standards. They are merely
> useful warnings and diagnostic information - and can be provided by
> non-compiler tools (linters, or even IDE's), and could be a significant
> implementation burden for small compilers.
>

Indeed this would be a separate "quality of implementation" document,
covering many aspects ignored by the formal requirements of the C
standard. Such a document, with a reasonable level of independence and
authority behind it, could allow purchasing managers and formal tenders
for public contracts to state that they want a compiler that is at least
"quality level 2", or for portable C libraries to state that they are
portable to all "quality level 2" compilers.


>>
>> * Warn about empty controlled statements
>> * Diagnostics should point to the exact code location and mention
>> the items involved.
>> * Diagnostics involving include files should state both the include
>> file and where it was included.
>> * Diagnostics involving macros should state both the location of the
>> invocation and the location of the specific location within the macro
>> definition.
>> * Diagnostics involving continuation lines should point to actual
>> source file lines, not abstract decoded lines.
>
> These are all just compiler warnings.

The last 4 also apply to error diagnostics. Over the years I have
encountered a number of otherwise good compilers failing each of these
rules, making it very difficult to identify what exactly triggered that
"syntax error" diagnostic.


>
>
>> * Unless otherwise stated, basic types should expose whatever behavior
>> above and beyond the language standard that is provided by the actual
>> machine definition. (For example, unless otherwise stated, if the
>> machine is one that is defined to handle integer overflow in a specific
>> way, that behavior should be implementation defined to be that machine
>> behavior, not "anything goes" undefined).
>
> This is a terrible idea in terms of portability of C code, and
> optimisation of C code.
>
> If a compiler wants to provide additional documented semantics for
> things that the C standards do not define, such as integer overflow
> behaviour, then it can do so. Even better, compilers can provide
> choices to control this behaviour (I would strongly prefer such choices
> to be made in the code, such as by pragmas, rather than command-line
> switches).

The ability to choose different semantics for compiler-specific reasons
is why I wrote "unless otherwise stated" twice.

>
> But there are many disadvantages in having such extra behaviour - it
> reduces optimisation opportunities, it reduces static error checking,
> and it reduces run-time diagnostics and checks. And of course it
> encourages code to be unnecessarily tied to specific platforms.
>

In practice, the vast majority of machines share some common semantics
such as "2s complement ints" or "ints stored as straight little or big
endian, nothing weird".

> I can appreciate that some people want specific behaviour for integer
> overflow - I am convinced that many of these do not appreciate the
> negative consequences of that behaviour. And many people specifically
> do not want such behaviour. So it certainly should not be mandated by
> the standards.
>

I have yet to see anyone outside academia and compiler vendors actively
wanting the bizarro optimizations found e.g. in some versions of gcc.

I have seen those that want explicit overflow semantics such as
saturation to MAXINT/MININT (for the actual type), or program
termination with a diagnostic, as either ensures that out-of-range
values do not cause unexpected malfunctions (think Ariane rocket
incident).
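The saturation policy mentioned above can be sketched portably by doing the arithmetic in a wider type first and then clamping. The helper name below is hypothetical, not a standard function:

```c
#include <stdint.h>

/* Saturating 32-bit add: compute exactly in 64 bits, then clamp to
   INT32_MAX / INT32_MIN instead of overflowing. */
static int32_t add_sat32(int32_t a, int32_t b)
{
    int64_t r = (int64_t)a + (int64_t)b;   /* cannot overflow int64_t */
    if (r > INT32_MAX) return INT32_MAX;
    if (r < INT32_MIN) return INT32_MIN;
    return (int32_t)r;
}
```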

The point is to state that it is a horribly bad (even if compliant)
compiler that does what some gcc versions reportedly did (optimizing
away a loop condition because the loop would assign an out-of-range
value to a variable which would not be read after assigning that invalid
value).


>
>> * With very few clearly implementation defined cases, "undefined
>> behavior" is not a license to introduce semantics that bear no
>> resemblance to the source code. (A typical implementation defined case
>> would be a function-like keyword that explicitly tells the compiler to
>> ignore certain cases in its interpretation of the program). It can
>> however be a license to arbitrarily choose between different meaningful
>> interpretations, such as performing memory accesses in either of
>> multiple permitted orders or optimizing an expression in ways that are
>> not correct for invalid inputs, provided this does not trigger mistaken
>> optimization assumptions outside the expression.
>>
>
> Undefined behaviour has, by its definition, no meaningful behaviour. In
> some cases it might be possible to guess what the programmer intended,
> though this can be very difficult to do in a deterministic way in a
> compiler - what is "obvious" to a human reader in a simple sample case
> may be very much harder to spot in general cases in a tool.
>

Indeed, that is how the compiler community has chosen to read it, while
historic C compilers (from around the time of the C89 definition) would
interpret it as simply "not defined by the standard, do something sane
and preferably documented".

> It is reasonable to want a compiler that does not knowingly exacerbate
> problems due to undefined behaviour - that is a quality of
> implementation issue. But it is unreasonable (indeed, impossible) to
> attempt to legislate this in general.
>

The somewhat obtuse misreadings of the standard by all major compiler
vendors in recent years are at least a reason to explicitly make such
misreadings invalid.

> There may be scope for changing certain /specific/ cases of undefined
> behaviour in the standards into fully defined or implementation specific
> behaviour. But it would be for specific cases only, not as a
> generalisation.
>

The proposal is to change the default meaning of "undefined behavior" to
"implementation specific" in all but a few clearly delineated cases.
Of course an implementation could define an insane implementation-specific
behavior, but it would stick out like a sore thumb to anyone
reading the "implementation specific behavior" section of the
documentation, unlike a mostly undocumented obtuse interpretation.

David Spencer

Oct 29, 2019, 4:21:48 PM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:

>On 10/28/2019 10:55 PM, David Spencer wrote:
>Hi David. Consider:

> (x)
> +----->>> New language
> |
> C ------+----->>> C

>At point (x), they would still be the same language.

The new language you propose is not the same language at x, or at any
other point.

It would not be compatible with

int nocode = 0;

which has been C since it was first conceived.

--
dhs spe...@panix.com

Rick C. Hodgin

Oct 29, 2019, 4:31:52 PM
New keywords have been added to C with each revision. At some
point the keyword "auto" did not exist. int auto = 0; would've
been valid. It's not any more.

As with all new extensions, things in the past which use the
new keywords / syntax will break.

--
Rick C. Hodgin

james...@alumni.caltech.edu

Oct 29, 2019, 5:47:30 PM
On Tuesday, October 29, 2019 at 4:31:52 PM UTC-4, Rick C. Hodgin wrote:
> On Tuesday, October 29, 2019 at 4:21:48 PM UTC-4, David Spencer wrote:
...
> > It would not be compatible with
> > int nocode = 0;
> >
> > which has been C since it was first conceived.
>
> New keywords have been added to C with each revision.

Yes, and for quite some time now, the committee has been careful to
choose all new keywords from the name space reserved for
implementations, which therefore cannot cause problems for code that
strictly conformed to previous versions of the standard. I don't think
your suggestion has any significant chance of getting accepted into the
standard, but it would greatly increase that chance to use such a name,
such as _Nocode.

> As with all new extensions, things in the past which use the
> new keywords / syntax will break.

If you choose a reserved name as a keyword, all code which previously
used that keyword was already broken.

David Brown

Oct 29, 2019, 6:24:39 PM
On 29/10/2019 17:41, Jakob Bohm wrote:
> On 29/10/2019 14:32, David Brown wrote:
>> On 28/10/2019 10:42, Jakob Bohm wrote:
>>
>>>
>>> For clarity, a number of existing compilers have the following known
>>> warning behavior:
>>>
>>> In places where empty statements are likely to be typos etc., they warn
>>> about an actual empty statement (such as a lone semicolon), but accept
>>> an empty block as an explicit request that nothing be done, without that
>>> warning.
>>>
>>> Typical examples include:
>>>
>>>    if (x > 3) ; // Warns about possible typo
>>>       puts("x is greater than 3");
>>>    if (x > 3) {} // No warning
>>>    puts("Hello world");
>>>
>>
>
> Note: The above was an example of actual good behavior by some actual
> current compilers, given as a counterexample to the proposal for a
> "nocode" keyword.
>

Yes, I agree - both that such compiler warnings are useful, and that
they show a "nocode" keyword is unnecessary.

Ah, okay. I think it would be difficult to specify these things
accurately enough for a document like that, but I now understand more
about what you mean. Perhaps it would be better expressed as a kind of
test suite - a set of example snippets that you expect "quality"
compilers to emit warnings on, plus other snippets that you /don't/ want
warnings on. People could rate compilers (or, more specifically,
compiler + flag combinations) on the percentage of matches on the tests.

>
>>>
>>>    * Warn about empty controlled statements
>>>    * Diagnostics should point to the exact code location and mention
>>> the items involved.
>>>    * Diagnostics involving include files should state both the include
>>> file and where it was included.
>>>    * Diagnostics involving macros should state both the location of the
>>> invocation and the location of the specific location within the macro
>>> definition.
>>>    * Diagnostics involving continuation lines should point to actual
>>> source file lines, not abstract decoded lines.
>>
>> These are all just compiler warnings.
>
> The last 4 also apply to error diagnostics.  Over the years I have
> encountered a number of otherwise good compilers failing each of these
> rules, making it very difficult to identify what exactly triggered that
> "syntax error" diagnostic.
>

I am afraid you will continue to be disappointed, because you are asking
about an unsolvable problem. Compilers can get better at this - and in
my experience, they /are/ getting better - but they can never get it
fully right. If you write "int colour;" in one part of a file, and
later write "color = 2;" in another part, which line is the mistake?

Compilers - especially combined with a good IDE - can make it easier to
find the source of diagnostics, but they won't ever get it perfect.


>
>>
>>
>>>    * Unless otherwise stated, basic types should expose whatever
>>> behavior
>>> above and beyond the language standard that is provided by the actual
>>> machine definition.  (For example, unless otherwise stated, if the
>>> machine is one that is defined to handle integer overflow in a specific
>>> way, that behavior should be implementation defined to be that machine
>>> behavior, not "anything goes" undefined).
>>
>> This is a terrible idea in terms of portability of C code, and
>> optimisation of C code.
>>
>> If a compiler wants to provide additional documented semantics for
>> things that the C standards do not define, such as integer overflow
>> behaviour, then it can do so.  Even better, compilers can provide
>> choices to control this behaviour (I would strongly prefer such choices
>> to be made in the code, such as by pragmas, rather than command-line
>> switches).
>
> The ability to choose different semantics for compiler specific reason
> is why I wrote "unless otherwise stated" twice.
>

Yes - but I strongly disagree with your wish to make these extra
semantics the default. Change it to be "compilers should have a way to
expose stronger semantics of integer arithmetic convenient to the
underlying machine", and I would be a lot happier. Even better would be
for compilers to provide types like "int_mod32_t" which would be a
32-bit signed integer with modular overflow behaviour. "int32_t" (and
"int") would continue to have undefined overflow behaviour. Then the
programmer would be stating exactly what they want.
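No such `int_mod32_t` type exists in standard C; the following is a sketch of the semantics such a type would carry, using unsigned arithmetic (where overflow is defined as modulo 2^32) and converting back. Note the conversion of an out-of-range unsigned value to a signed type is implementation-defined before C23, though mainstream compilers perform the modular conversion shown:

```c
#include <stdint.h>

/* Addition with the modular semantics a hypothetical int_mod32_t
   would provide: wrap modulo 2^32 instead of undefined behaviour. */
static int32_t add_mod32(int32_t a, int32_t b)
{
    return (int32_t)((uint32_t)a + (uint32_t)b);
}
```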

>>
>> But there are many disadvantages in having such extra behaviour - it
>> reduces optimisation opportunities, it reduces static error checking,
>> and it reduces run-time diagnostics and checks.  And of course it
>> encourages code to be unnecessarily tied to specific platforms.
>>
>
> In practice, the vast majority of machines share some common semantics
> such as "2s complement ints" or "ints stored as straight little or big
> endian, nothing weird".

You are mixing representation and operational semantics.

That is a matter of representation, and it is certainly the case that
for modern processors, two's complement with big or little endian
ordering and no padding is universal. (There are still systems in use
for which this does not apply, but they are unlikely to have compilers
that support future C standards.) Sizes are not as universal as some
people think - there are devices in use with 16-bit or 32-bit char, or
with 24-bit int.

I have nothing against simplifying the C standards to require two's
complement representation and ban padding bits (except in _Bool). C++20
has this, and it is a proposed addition to the next C standard.

However, representation is not overflow behaviour. There are very good
reasons for /not/ wanting two's complement overflow behaviour even when
the underlying cpu supports it.

(There are some operations, especially with shifts, where it would be
reasonable to tighten the behaviour definitions if the representation is
fixed with two's complement. And things like conversions from unsigned
to signed types could be given fully defined behaviour rather than
implementation defined behaviour.)

>
>> I can appreciate that some people want specific behaviour for integer
>> overflow - I am convinced that many of these do not appreciate the
>> negative consequences of that behaviour.  And many people specifically
>> do not want such behaviour.  So it certainly should not be mandated by
>> the standards.
>>
>
> I have yet to see anyone outside academia and compiler vendors actively
> wanting the bizarro optimizations found e.g. in some versions of gcc.
>

Well, that's changed now. I write code for small embedded systems,
mainly in C (with some C++), and I want signed overflow to be undefined
behaviour, and I want the compiler to optimise based on that knowledge.
I want my compiler to transform "x + 1 > 0" into "x >= 0". I want it to
transform "x * 20 / 5" into "x * 4". I want it to inform me if I've
written "int x = 1000000 * 1000000;". I want it to be able to run with
"-fsanitize=integer-overflow" and tell me when it sees a mistake in my code.
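The `x * 20 / 5` example can be made concrete: rewriting it to `x * 4` is only sound because signed overflow is undefined. If the multiply were instead defined to wrap, the two expressions would disagree for large `x`. A sketch modelling wrapping with unsigned arithmetic (helper names are hypothetical; the unsigned-to-signed conversion is implementation-defined before C23, modular on mainstream compilers):

```c
#include <stdint.h>

/* x * 20 / 5 with the multiply forced to wrap modulo 2^32. */
static int32_t mul20_div5_wrapping(int32_t x)
{
    return (int32_t)((uint32_t)x * 20u) / 5;
}

/* The strength-reduced form the compiler may emit under UB rules. */
static int32_t mul4(int32_t x)
{
    return x * 4;
}
```

For small `x` both agree; for `x = 200000000` the wrapped multiply overflows 32 bits and the results diverge, which is why the transformation requires the no-overflow assumption.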

Overflowing your types is almost always an error in the code. (Not
always, but almost always. I am happy to go out of my way to deal with
cases where I actually want wrapping behaviour.) I want my compiler to
be able to tell me if it sees these errors, and I want it to be able to
optimise on the assumption that those errors don't occur.

> I have seen those that want explicit overflow semantics such as
> saturation to MAXINT/MININT (for the actual type),

Yes, sometimes you want saturation behaviour.

> or program
> termination with a diagnostic, as either ensures that out-of-range
> values do not cause unexpected malfunctions (think Ariane rocket
> incident).

Making integer overflow defined behaviour means errors like this are
more likely - because they are still logical errors in the code, but now
they are allowed by the language and the tools have no way to distinguish.

>
> The point is to state that it is a horribly bad (even if compliant)
> compiler that does what some gcc versions reportedly did (optimizing
> away a loop condition because the loop would assign an out-of-range
> value to a variable which would not be read after assigning that invalid
> value).
>

Let's be clear here - the famous case of gcc optimising away a loop in
a SPEC benchmark was due to a bug in the SPEC code, and
was with a pre-release test version of gcc. Part of the testing before
releasing new compiler versions involves such tests, and the whole thing
was only publicised because it was interesting that a bug had been found
in the old SPEC code.

When you write code with undefined behaviour, you cannot expect the
compiler to guess what you meant to write. If you think that is the
case, C is not the language for you - it is that simple.

Having said that, compilers can always be better at informing the user
about such cases. They cannot tell you whenever they optimise based on
the assumption that undefined behaviour does not occur - then perfectly
normal code would cause a flood of warnings. But they can always be better.

>
>>
>>>    * With very few clearly implementation defined cases, "undefined
>>> behavior" is not a license to introduce semantics that bear no
>>> resemblance to the source code.  (A typical implementation defined case
>>> would be a function-like keyword that explicitly tells the compiler to
>>> ignore certain cases in its interpretation of the program).  It can
>>> however be a license to arbitrarily choose between different meaningful
>>> interpretations, such as performing memory accesses in either of
>>> multiple permitted orders or optimizing an expression in ways that are
>>> not correct for invalid inputs, provided this does not trigger mistaken
>>> optimization assumptions outside the expression.
>>>
>>
>> Undefined behaviour has, by its definition, no meaningful behaviour.  In
>> some cases it might be possible to guess what the programmer intended,
>> though this can be very difficult to do in a deterministic way in a
>> compiler - what is "obvious" to a human reader in a simple sample case
>> may be very much harder to spot in general cases in a tool.
>>
>
> Indeed, that is how the compiler community has chosen to read it, while
> historic C compilers (from around the time of the C89 definition) would
> interpret it as simply "not defined by the standard, do something sane
> and preferably documented".

No, compilers have never done that - it's a common myth and
misunderstanding. Lack of optimisations in earlier compilers was due to
more limited knowledge of compilers and optimisation techniques, and
more limited resources on hosts. And even 30 years ago, compilers /did/
optimise on the assumption that integer overflow did not occur - such as
strength-reduction of "x * 20 / 5" to "x * 4". And there are also many
cases where compilers have said "this is not defined by the standard,
but we choose to define it ourselves".

>
>> It is reasonable to want a compiler that does not knowingly exacerbate
>> problems due to undefined behaviour - that is a quality of
>> implementation issue.  But it is unreasonable (indeed, impossible) to
>> attempt to legislate this in general.
>>
>
> The somewhat obtuse misreadings of the standard by all major compiler
> vendors in recent years are at least a reason to explicitly make such
> misreadings invalid.

Every compiler vendor I know of would want to hear about any misreadings
they have made. Have you considered that they in fact /have/ read the
standards correctly, instead of making up additional ideas that are not
there in the standards? If you can point to the wording in the
published C standards (C90, C99, C11 or C18) showing where integer
overflow behaviour is defined to be wrapping two's complement behaviour
on two's complement cpus, then you would have a strong point.


There are many challenges involved in writing good, efficient, and
portable C code. One of these is that there are bits of code that can
be written in a way that is undefined behaviour and fails on stronger
compilers but works efficiently on weaker compilers, or it can be
written in another way that is defined behaviour and works efficiently
on strong compilers but very inefficiently on weaker compilers. Which
way should you write such code? The answer is, of course, "it depends".
What you cannot reasonably do, however, is place artificial limits on
good, modern compilers in order to support older code with undefined
behaviours - older code that is not, in fact, correct C code at all.

The way modern compilers handle this is to provide switches to give
specific additional semantics that match certain behaviours of weaker
tools that might be assumed by code - classic examples in gcc being
"-fwrapv" to get wrapping integer overflow, and "-fno-strict-aliasing"
to disable type-based alias analysis. In addition, these take effect by
default when the compiler is run, as optimisations are disabled unless
explicitly enabled.

In other words, these compilers go out of their way to support incorrect
(but reasonable at the time of writing) C code, and to support people
who want the C language to be defined in a different way. And yet
people like you /still/ complain!

>
>> There may be scope for changing certain /specific/ cases of undefined
>> behaviour in the standards into fully defined or implementation specific
>> behaviour.  But it would be for specific cases only, not as a
>> generalisation.
>>
>
> The proposal is to change the default meaning of "undefined behavior" to
> "implementation specific" in all but a few clearly delineated cases.

That won't work.

First, /everything/ is undefined behaviour except in the cases where the
C standards explicitly define it. You could turn some specific
undefined behaviours into implementation defined behaviour, but you
cannot possibly expect implementations to specify and document
everything that is not written in the C standards. If your cat sits on
your keyboard, the behaviour is undefined by the C standards - do you
really expect compiler vendors to pick a specific deterministic and
documented behaviour for that event?

Secondly, undefined behaviour is a /good/ thing. It is /better/ in many
cases that things are undefined behaviour, rather than implementation
defined behaviour. I have explained above why I actively prefer integer
overflow to be undefined behaviour.

Thirdly, in a great many cases of undefined behaviour as listed in Annex
J.2 of the standard, there is no sensible choice of what implementation
defined behaviour should be.

And even in cases where you apparently think there is an "obvious"
choice for defining the behaviour, it often is not the case or not the
only option. For signed integer overflow, people might want two's
complement wrapping. But they might want saturation, or trapping, or
optimisations based on ignoring the possibility of the overflow.


I invite you to read through J.2 of C11 or C18 (you have read the C
standards that you criticise, and criticise compiler vendors for
misreading, haven't you?) and list the implementation-defined behaviour
you would like to see for your favourite cpu target.

Rick C. Hodgin
Oct 29, 2019, 7:36:49 PM
On 10/29/2019 5:47 PM, james...@alumni.caltech.edu wrote:
> On Tuesday, October 29, 2019 at 4:31:52 PM UTC-4, Rick C. Hodgin wrote:
>> On Tuesday, October 29, 2019 at 4:21:48 PM UTC-4, David Spencer wrote:
> ...
>>> It would not be compatible with
>>> int nocode = 0;
>>>
>>> which has been C since it was first conceived.
>>
>> New keywords have been added to C with each revision.
>
> Yes, and for quite some time now, the committee has been careful to
> choose all new keywords from the name space reserved for
> implementations, which therefore cannot cause problems for code that
> strictly conformed to previous versions of the standard. I don't think
> your suggestion has any significant chance of getting accepted into the
> standard, but it would greatly increase that chance to use such a name,
> such as _Nocode.

Never happen. I think the use of features like that in a language,
unless they explicitly go out of their way not to be invasive in
their name usage, is ridiculous.

int8_t
int16_t
int32_t
int64_t

Why _t? In CAlive I defined signed and unsigned:

s8, s16, s32, s64
u8, u16, u32, u64

And can you guess what these are?

f32, f64, f80

>> As with all new extensions, things in the past which use the
>> new keywords / syntax will break.
>
> If you choose a reserved name as a keyword, all code which previously
> used that keyword was already broken.

Exactly when was "auto" reserved by the C committee? That C code
I wrote that's been compiling all these years since the 1980s now
breaks because they reserved the word auto in the 1990s? 2000s?
2010s?

It doesn't matter. I keep saying this, and it's hard for me to
flatly and coldly turn my back on something or someone (it takes
a real chain of events to make it happen), but I'm done with any
attempt to improve C via the C Standard or such related.

CAlive is my future. End of story.

--
Rick C. Hodgin

David Spencer
Oct 29, 2019, 7:46:28 PM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:

>As with all new extensions, things in the past which use the
>new keywords / syntax will break.

So it's not backward compatible. Stop claiming that it is.

--
dhs spe...@panix.com

Rick C. Hodgin
Oct 29, 2019, 9:41:11 PM
On 10/29/2019 7:46 PM, David Spencer wrote:
> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
>> As with all new extensions, things in the past which use the
>> new keywords / syntax will break.
>
> So it's not backward compatible. Stop claiming that it is.


Really, David?

--
Rick C. Hodgin

Richard Damon
Oct 29, 2019, 10:13:09 PM
To my memory, auto wasn't reserved by the C committee because it was a
keyword decades before the committee touched the language. I am very
sure that auto was a keyword in the first publication of K&R back in
the 70s.

The _t endings for the standard integer types were chosen specifically
BECAUSE they were just ugly enough that those names were very unlikely to
be used in existing code, yet descriptive enough that even if they were,
they were likely being used for something close enough to what the
standard was defining that the rare conflict was easily fixed.
The problem with type names like s8 was that they WERE fairly commonly
being used, so even when they were used for similar purposes there was
still a lot of code that needed to be edited to fix it. Also, some code
used some of them for something very different: for instance, a program
might define a set of strings as s1, s2, s3, s4, s5, s6, s7, s8 (and hit
the name conflict here).

Richard Damon
Oct 29, 2019, 10:37:55 PM
Yes, the Standard, in its non-normative footnotes and annexes,
acknowledges that warnings exist. None of them are defined by the
Standard, and the Standard has no framework for officially defining a
required warning. The Standard doesn't even formally define an 'Error',
as almost all 'required diagnostics' may still result in the generation
of a program (whose behavior is undefined by the Standard, though it
might be defined by the implementation). The only thing that MUST not
generate a program is the processing of a #error directive that isn't
bypassed by a #if or similar conditional.

The fallthrough attribute in C++ (which may come to Standard C) is the
closest example to what you are proposing, but it relates to a common
warning in many implementations, generated in response to a common
programming error, for a construct that is also commonly used correctly,
and so it needs a standard way of being suppressed.

This nocode condition doesn't seem to be a common programming error, and
I don't hear many people complaining about getting warnings when
legitimately using that construct, so it doesn't meet the other parts of
the condition. To me, it seems to be a cute solution looking for a
problem to solve, but not finding one.
>
>> All required diagnostics result in the program's behavior being
>> undefined, (if a program is created) so are effectively 'errors', but an
>> implementation is able to define the behavior and thus provide an
>> extension. To add the concept of a Warning to the standard would be a
>> major change and significant work.
>>
>> Because it breaks valid programs using what can be a useful feature, it
>> also has a very high hurdle to pass to show it add something of great
>> value, which I don't see. I will say that personally, this isn't the
>> sort of error I see being done often.
>
> We have:
>     void function_name(void);
>
> Why not just this:
>     function_name;
>

Because that doesn't parse to the same meaning. Without the voids it
means something different. Note that originally, the void return type
was not required, and the lack of an explicit return type caused the
type to be assumed to be int. That assumed default caused enough
problems that intentionally breaking the old programs which used it was
judged worthwhile.

The second void also has meaning. The declaration
void function_name();
has a different meaning (mostly a throwback to older versions of the
language).

Omitting the () in the definition makes it a very different sort of
declaration. While "void x;" doesn't make much sense, for other types
like "int x;" it has a very definite meaning: it declares a variable.
So it would require some form of mind-reading to decide whether such a
line defines a function with no parameters or declares a variable.

David Brown
Oct 30, 2019, 4:48:31 AM
On 30/10/2019 00:36, Rick C. Hodgin wrote:
> On 10/29/2019 5:47 PM, james...@alumni.caltech.edu wrote:
>> On Tuesday, October 29, 2019 at 4:31:52 PM UTC-4, Rick C. Hodgin wrote:
>>> On Tuesday, October 29, 2019 at 4:21:48 PM UTC-4, David Spencer wrote:
>> ...
>>>> It would not be compatible with
>>>>    int nocode = 0;
>>>>
>>>> which has been C since it was first conceived.
>>>
>>> New keywords have been added to C with each revision.
>>
>> Yes, and for quite some time now, the committee has been careful to
>> choose all new keywords from the name space reserved for
>> implementations, which therefore cannot cause problems for code that
>> strictly conformed to previous versions of the standard. I don't think
>> your suggestion has any significant chance of getting accepted into the
>> standard, but it would greatly increase that chance to use such a name,
>> such as _Nocode.
>
> Never happen.  I think the use of features like that in a language,
> unless they are explicitly going out of their way to not be invasive
> in their name usage, are ridiculous.

The naming style _Nocode is picked /precisely/ not to be invasive - and
is thus an excellent choice.

Look at when C gained a boolean type. They could not add a keyword
"bool" without conflict with a great deal of existing code. So the name
of the type, and the new keyword, was _Bool. But because people often
want a nicer name like "bool", they also made a <stdbool.h> file
consisting basically of :

#define bool _Bool
#define true 1
#define false 0

It gives you everything. There is a keyword that does not conflict with
existing code, and people writing new code can easily use the nicer type
name "bool" after #include <stdbool.h>

If your proposal were accepted, the new keyword would be _Nocode. And
there would be a new standard header, <stdnocode.h>, consisting of the
line :

#define nocode _Nocode



>
>     int8_t
>     int16_t
>     int32_t
>     int64_t
>
> Why _t?

There are a number of reasons. It makes it obvious to programmers and
readers that these are types, not variables. It reduces the risk of
conflict with existing code. It makes them stand out nicely. They are
long enough that they won't be used for other things, short enough that
they are not onerous to type. They follow the long tradition in C and
POSIX of using _t suffixes on types.

No choice would be considered "perfect" by everyone, but these names are
good enough for most people.

>  In CAlive I defined signed and unsigned:

Your language, your choice.

>
>     s8, s16, s32, s64
>     u8, u16, u32, u64

There is such a thing as being /too/ short. These don't carry the
information I would want in type names - and they risk conflict with
other uses.

>
> And can you guess what these are?
>
>     f32, f64, f80

Knowing the context, yes. Without context, no, it would not be clear.
But "float32_t" and "float64_t" would be perfectly clear in all cases.
(Standard C does not have these types, but I think it should do.)

>
>>> As with all new extensions, things in the past which use the
>>> new keywords / syntax will break.
>>
>> If you choose a reserved name as a keyword, all code which previously
>> used that keyword was already broken.
>
> Exactly when was "auto" reserved by the C committee?  That C code
> I wrote that's been compiling all these years since the 1980s now
> breaks because they reserved the word auto in the 1990s?  2000s?
> 2010s?

"auto" has been around since the beginning of C (and perhaps from its
predecessors). It has been pointless in code since at least C89/C90
(the first standardised C), but still exists for compatibility. (C++
completely redefined its meaning with C++11. I would like to see it
deprecated in C too, so that it can be re-used later.)

>
> It doesn't matter.  I keep saying this, and it's hard for me to
> flatly and coldly turn my back on something or someone (it takes
> a real chain of events to make it happen), but I'm done with any
> attempt to improve C via the C Standard or such related.
>

I am sure the C standards committee see this as a great loss.

Less sarcastically, the crux of your problem is cooperation. C is a
language standardised by a committee. Before changes are made, there
are discussions - opinions are swapped, ideas exchanged. People
/listen/ far more than they talk, and they think about other people more
than themselves. The standards committee make changes that they think
will be helpful for /others/ - they are not concerned about their own
coding.

You, on the other hand, are never willing to enter a proper discussion.
You get an idea, and you fixate on it - convinced that it is something
miraculous and world-changing, and that you alone know the answer. You
have no consideration for other people - you are concerned only with
your own coding. You are unable to listen to other people's opinions,
thoughts or suggestions - you assume that anyone who does not agree with
you entirely is doing so through malice or, if you are feeling kind,
incompetence.

/That/ is why you cannot work with the C standards committee.

I hope - though it is highly unlikely - that you will read that and it
will give you the insight needed to change your attitude. It is always
a good idea to try to work /with/ other people, rather than against
everyone.

> CAlive is my future.  End of story.
>

Sadly for you, that may be the case.

guinne...@gmail.com
Oct 30, 2019, 7:36:19 AM
On Tuesday, 29 October 2019 20:31:52 UTC, Rick C. Hodgin wrote:
> On Tuesday, October 29, 2019 at 4:21:48 PM UTC-4, David Spencer wrote:
> > "Rick C. Hodgin" <rick.c...@gmail.com> writes:
> >
> > >On 10/28/2019 10:55 PM, David Spencer wrote:
> > >Hi David. Consider:
> >
> > > (x)
> > > +----->>> New language
> > > |
> > > C ------+----->>> C
> >
> > >At point (x), they would still be the same language.
> >
> > The new language you propose is not the same language at x, or at any
> > other point.
> >
> > It would not be compatible with
> > int nocode = 0;
> >
> > which has been C since it was first conceived.
>
> New keywords have been added to C with each revision. At some
> point the keyword "auto" did not exist. int auto = 0; would've
> been valid. It's not any more.

At what point did C not have the keyword 'auto'?

It's present even in the earliest C manual I can find:
https://www.bell-labs.com/usr/dmr/www/cman.pdf

Rick C. Hodgin
Oct 30, 2019, 8:03:20 AM
On Tuesday, October 29, 2019 at 10:13:09 PM UTC-4, Richard Damon wrote:
> On 10/29/19 7:36 PM, Rick C. Hodgin wrote:
> > Exactly when was "auto" reserved by the C committee?  That C code
> > I wrote that's been compiling all these years since the 1980s now
> > breaks because they reserved the word auto in the 1990s?  2000s?
> > 2010s?
>
> To my memory auto wasn't reserved by the C committee because it was a
> key word decades before the C committee touched the language. I am very
> sure that auto was a key word in the first publication of K&R back in
> the 70s.

My mistake. Take some other keyword that wasn't there in the
beginning, that was added in later, and use that as a reference for
my point.

--
Rick C. Hodgin

Rick C. Hodgin
Oct 30, 2019, 8:10:29 AM
I'm well beyond the point of caring on this issue, and indeed anything
related to the betterment of C.

> The fallthrough attribute in C++ (which may come to Standard C) is the
> closest example to what you are proposing, but that relates to a common
> warning in many implementations, generated in response to a common
> programming error, for a construct that actually is commonly used
> properly too, so has a need for a standard way to being suppressed.

C seems to be reactionary to what exists, not proactive by a
guiding philosophy. I think that's why I take exception to its
various facets and implementation, because there are some things
that need to be added IMO. Why C hasn't introduced the basic and
simple class, for example, is beyond me. That one facet alone
would take so much code away from C++, because many people do
not want C++'s complexity, but use it because it supports classes.

Other features are like that too ... but who cares, right?
Certainly not I.

> This nocode condition doesn't seem to be a common programming error, and
> I don't hear many people complaining about getting warnings when
> legitimately using that construct, so it doesn't meet the other parts of
> the condition. To me, it seems to be a cute solution looking for a
> problem to solve, but not finding one.
> >
> >> All required diagnostics result in the program's behavior being
> >> undefined, (if a program is created) so are effectively 'errors', but an
> >> implementation is able to define the behavior and thus provide an
> >> extension. To add the concept of a Warning to the standard would be a
> >> major change and significant work.
> >>
> >> Because it breaks valid programs using what can be a useful feature, it
> >> also has a very high hurdle to pass to show it add something of great
> >> value, which I don't see. I will say that personally, this isn't the
> >> sort of error I see being done often.
> >
> > We have:
> >     void function_name(void);
> >
> > Why not just this:
> >     function_name;
>
> Because that doesn't parse to the same meaning.

If you re-designed your parser it could.

> Omitting the () in the definition make it a very different sort of
> declaration.

That's a typo. I intended to write:

function_name();

When function_name had not yet been defined, referencing it
like that could assume void function_name(void). And I do
recognize that there's a default int type assumed, which I
also think is ridiculous. If you're going to have a strongly
typed language, you should also have a strongly declared
language.

This is my last post on the matter. I don't care about C any
longer. There's only so much a man can take before he steps
away and leaves the thing he's in pursuit of for something
more under his control / ability to influence.

I've spent the last year taking steps to move my life in a
direction that will give me that control. We'll see what
fruit it bears.

--
Rick C. Hodgin

Rick C. Hodgin
Oct 30, 2019, 8:13:22 AM
On Wednesday, October 30, 2019 at 4:48:31 AM UTC-4, David Brown wrote:
> [SNIP SNIP SNIP SNIP SNIP]

DAVID!!!

FOR THE 98TH TIME ... I DO NOT READ YOUR POSTS. I WILL NEVER
READ YOUR POSTS AGAIN. I DO NOT WANT YOUR INFLUENCE IN MY LIFE
AT ALL EVER.

STOP POSTING REPLIES TO ME.

IF YOU WANT TO POST A REPLY TO SOMETHING I WRITE, TAKE ALL OF
MY REFERENCES AND ATTRIBUTION OUT OF IT AND QUOTE ONLY THE WORDS
I WROTE WITHOUT ANY ASSOCIATION TO ME, AND THEN REPLY TO THOSE
BLANKENED, GENERIC, FACELESS POINTS.

GOOD-BYE FOREVER!!!

AND LEAVE ME ALONE!!!

--
Rick C. Hodgin

james...@alumni.caltech.edu
Oct 30, 2019, 11:12:26 AM
On Tuesday, October 29, 2019 at 7:36:49 PM UTC-4, Rick C. Hodgin wrote:
> On 10/29/2019 5:47 PM, james...@alumni.caltech.edu wrote:
> > On Tuesday, October 29, 2019 at 4:31:52 PM UTC-4, Rick C. Hodgin wrote:
> >> On Tuesday, October 29, 2019 at 4:21:48 PM UTC-4, David Spencer wrote:
> > ...
> >>> It would not be compatible with
> >>> int nocode = 0;
> >>>
> >>> which has been C since it was first conceived.
> >>
> >> New keywords have been added to C with each revision.
> >
> > Yes, and for quite some time now, the committee has been careful to
> > choose all new keywords from the name space reserved for
> > implementations, which therefore cannot cause problems for code that
> > strictly conformed to previous versions of the standard. I don't think
> > your suggestion has any significant chance of getting accepted into the
> > standard, but it would greatly increase that chance to use such a name,
> > such as _Nocode.
>
> Never happen. I think the use of features like that in a language,
> unless they are explicitly going out of their way to not be invasive
> in their name usage, are ridiculous.

The following identifiers were given no meaning by C90, but were reserved to the implementation. They were all given specific meanings by later versions of the standard:
_Alignas, _Alignof, _Atomic, _Bool, _Complex, _Generic, _Imaginary,
_Noreturn, _Static_assert, _Thread_local, __func__, __STDC_HOSTED__,
__STDC_VERSION__, __STDC_ISO_10646__, __STDC_MB_MIGHT_NEQ_WC__,
__STDC_UTF_16__, __STDC_UTF_32__, __STDC_ANALYZABLE__,
__STDC_IEC_559_COMPLEX__, __STDC_LIB_EXT1__, __STDC_NO_ATOMICS__,
__STDC_NO_COMPLEX__, __STDC_NO_THREADS__, __STDC_NO_VLA__, _Pragma,
_Imaginary_I, _Complex_I, __STDC_WANT_LIB_EXT1__, __alignas_is_defined,
__bool_true_false_are_defined, _Exit.

Because those identifiers were reserved in C90, they could not be used
in strictly conforming C90 code. As a result, the features implemented
by giving standard-defined meanings to those identifiers could be added
to later versions of the language without any danger of breaking
existing strictly conforming code. If you consider that ridiculous -
well, that's just another example of your poor judgment on matters of
language design.

> int8_t
> int16_t
> int32_t
> int64_t
>
> Why _t?

POSIX reserves identifiers that end with _t for use in identifying
types. C can be implemented on systems that don't conform to POSIX, but
C was developed in parallel with POSIX, and often follows POSIX
conventions where appropriate.

> In CAlive I defined signed and unsigned:
>
> s8, s16, s32, s64
> u8, u16, u32, u64
>
> And can you guess what these are?

No. Without the _t, I would suspect them of being variable names, not
types - which is the purpose of that convention. I have in fact used
precisely those identifiers to identify variables, and I've seen the
same use in other people's code. It's particularly common in members
of unions.

> >> As with all new extensions, things in the past which use the
> >> new keywords / syntax will break.
> >
> > If you choose a reserved name as a keyword, all code which previously
> > used that keyword was already broken.
>
> Exactly when was "auto" reserved by the C committee? That C code
> I wrote that's been compiling all these years since the 1980s now
> breaks because they reserved the word auto in the 1990s? 2000s?
> 2010s?

What are you talking about? By any chance are you still compiling using
a C++ compiler, and then once again complaining in a newsgroup devoted
to the C standard about things that went wrong because C++ is a
different language than C?

"auto" has been a keyword for longer than C has been a language. It is,
in fact, a carry over from one of C's predecessor languages (I'm not
sure whether it was introduced in B or in BCPL). It actually made more
sense in the predecessor language. In C, the only places where you're
allowed to use "auto" are places where automatic storage duration is the
default, so leaving out "auto" would give the same result as using it.
As a result, C code almost never uses "auto". Something about its
predecessor language was different (I'm not sure what) that made "auto"
a reasonable thing to use in that language.

That's why C++ was able to give "auto" a new meaning, one that probably
broke your code when you compiled it in C++. But that's not the fault of
either committee. There's no good reason to use "auto" in C code, and if
you insist on using a C++ compiler, the new meaning of that keyword is a
good reason why you should not be using it with its old meaning.

> It doesn't matter. I keep saying this, and it's hard for me to
> flatly and coldly turn my back on something or someone (it takes
> a real chain of events to make it happen), but I'm done with any
> attempt to improve C via the C Standard or such related.

If you're done with it, why are you posting new suggestions to
comp.std.c?

Keith Thompson
Oct 30, 2019, 11:41:13 AM
"Rick C. Hodgin" <rick.c...@gmail.com> writes:
Rick, that's not how this works. David can post whatever he likes.
By posting to Usenet, you give everyone implicit permission to read
and quote what you write. If you don't like what he posts, ignore
it. If you use a real newsreader, use a killfile. (If you use
Google Groups, I don't know whether that's an option, but a Google
search for "google groups killfile" might turn up something useful.)

It's probably useful to let him know (once) that you won't be
reading his posts so he doesn't waste his time, but you don't get
to tell him not to reply to yours. (Well, you can tell him, but
he can ignore you.)

In a similar vein, my filters are newsgroup-specific. I was
already filtering out your posts in comp.lang.c. I will now do so
in comp.std.c.

--
Keith Thompson (The_Other_Keith) ks...@mib.org <http://www.ghoti.net/~kst>
Working, but not speaking, for Philips Healthcare
void Void(void) { Void(); } /* The recursive call of the void */

Rick C. Hodgin
Oct 30, 2019, 12:46:41 PM
On Wednesday, October 30, 2019 at 11:41:13 AM UTC-4, Keith Thompson wrote:
> Rick, that's not how this works. David can post whatever he likes.

Of course he can. My request remains. It's on David now to
honor that request or not.

> [obvious explanation snipped]

--
Rick C. Hodgin

Nick Bowler
Oct 30, 2019, 12:47:09 PM
On Wed, 30 Oct 2019 08:12:24 -0700, jameskuyper wrote:
> "auto" has been a keyword for longer than C has been a language. It is,
> in fact, a carry over from one of C's predecessor languages (I'm not
> sure whether it was introduced in B or in BCPL). It actually made more
> sense in the predecessor language. In C, the only places where you're
> allowed to use "auto" are places where automatic storage duration is the
> default, so leaving out "auto" would give the same result as using it.
> As a result, C code almost never uses "auto". Something about it's
> predecessor language was different (I'm not sure what) that made "auto"
> a reasonable thing to use in that language.

In B, local variables are declared like this:

auto x;

This syntax was permitted and works basically the same way in the original
C standards. In this specific case, removing 'auto' in fact changes the
meaning of the declaration (so in C90 it was not the case, in general,
that removing "auto" would give the same result as using it).

C99 and later standards completely removed this particular declaration
style from the language; all remaining uses of the auto keyword are truly
redundant. However most modern C compilers still support this syntax
(with the required diagnostic message) even when they implement recent
standards.

David Brown
Oct 30, 2019, 1:30:08 PM
On 30/10/2019 17:44, Rick C. Hodgin wrote:
> On Wednesday, October 30, 2019 at 11:41:13 AM UTC-4, Keith Thompson wrote:
>> Rick, that's not how this works. David can post whatever he likes.
>
> Of course he can. My request remains. It's on David now to
> honor that request or not.
>

I don't want to get more repetitively off-topic, but the point - as
Keith explained - is that this is a public newsgroup. Anything a person
posts here may be read and quoted by anyone else, and anyone can reply
to those posts. Replies in a newsgroup are not personal - they are for
everyone who accesses the group or its archives, now and in the future.
Posts may be more or less targeted at a particular poster, but they are
always for the whole group.

If you want to take the advice or use the information in posts I make,
great. If you want to ignore them, that's your choice. When people
make informative posts to newsgroups, it is usually in the hope that it
will be of use or interest to others as well. If I had wanted to make a
personal reply to you, Rick, I'd have sent an email.

No, I will not honour any request to avoid following up to Usenet posts
just because you made the post. That is not how Usenet works. I will
happily honour a request not to make direct contact or email you -
because that is about personal contact. But Usenet is public, and you
don't get to impose gag orders on anyone.

And I really hope that is the end of this particular silliness.

James Kuyper
Oct 30, 2019, 1:30:20 PM
I knew that, and forgot about it, and should have modified my
explanation to accommodate those facts. However, even in C90, that usage
of "auto" could equally well have been handled by typing "int x;", which
involves (very marginally) less typing.

David Brown
Oct 30, 2019, 1:37:35 PM
On 30/10/2019 17:47, Nick Bowler wrote:
> On Wed, 30 Oct 2019 08:12:24 -0700, jameskuyper wrote:
>> "auto" has been a keyword for longer than C has been a language. It is,
>> in fact, a carry over from one of C's predecessor languages (I'm not
>> sure whether it was introduced in B or in BCPL). It actually made more
>> sense in the predecessor language. In C, the only places where you're
>> allowed to use "auto" are places where automatic storage duration is the
>> default, so leaving out "auto" would give the same result as using it.
>> As a result, C code almost never uses "auto". Something about it's
>> predecessor language was different (I'm not sure what) that made "auto"
>> a reasonable thing to use in that language.
>
> In B, local variables are declared like this:
>
> auto x;
>
> This syntax was permitted and works basically the same way in the original
> C standards. In this specific case, removing 'auto' in fact changes the
> meaning of the declaration (so in C90 it was not the case, in general,
> that removing "auto" would give the same result as using it).

I think that was only because of the "implicit int". In older C,
"implicit int" meant that "auto x;" had the same meaning as "auto int
x;". Removing the "auto" from that, you'd get "int x;", which would
have exactly the same meaning (in contexts where "auto int x;" is allowed).

I haven't used B (or BCPL) - could it be that they did not have
different types, and thus no equivalent of "int x;" ?

>
> C99 and later standards completely removed this particular declaration
> style from the language; all remaining uses of the auto keyword are truly
> redundant. However most modern C compilers still support this syntax
> (with the required diagnostic message) even when they implement recent
> standards.
>

Once "implicit int" was removed from the language, you could no longer
write "auto x;" - you had to write "auto int x;", "register int x;" or
just "int x;" (with the auto implied) for a non-static local variable.
As there is no benefit in including the "auto", it is rarely seen. But
compilers /must/ still support "auto int x;" if they try to be
conforming, as it is still allowed even in modern standards. And many
still allow "auto x;" (with implicit int) when compiling in lax modes,
to support older code.

Nick Bowler
Oct 30, 2019, 2:34:41 PM
On Wed, 30 Oct 2019 18:37:32 +0100, David Brown wrote:
> On 30/10/2019 17:47, Nick Bowler wrote:
>> In B, local variables are declared like this:
>>
>> auto x;
>>
>> This syntax was permitted and works basically the same way in the
>> original C standards. In this specific case, removing 'auto' in fact
>> changes the meaning of the declaration (so in C90 it was not the case,
>> in general, that removing "auto" would give the same result as using
>> it).
>
> I think that was only because of the "implicit int". In older C,
> "implicit int" meant that "auto x;" had the same meaning as "auto int
> x;". Removing the "auto" from that, you'd get "int x;", which would
> have exactly the same meaning (in contexts where "auto int x;" is
> allowed).

This is all true, but as soon as you add 'int' to the declaration the
syntax is no longer compatible with B. This is probably not a problem
because I doubt there were many programmers who cared about maintaining
B-like syntax in 1999.

> I haven't used B (or BCPL) - could it be that they did not have
> different types, and thus no equivalent of "int x;" ?

B has no data types (or depending how you look at it, only one type).

Imagine a variant of C where the only object type you are allowed to use
is 'int', and that you can use an 'int' value in any context where you
might want to use a pointer to an int. Require 'auto' when declaring
local variables, then delete all the 'int' keywords and you'll have a
pretty good idea of what B programs look like.

And if you write a program under this variant, it is quite likely that
it will be successfully translated by a modern C compiler (with warnings),
provided that conversions from int * to int do not lose information (this
is a requirement since B has no pointer types).

Ben Bacarisse

unread,
Oct 30, 2019, 8:19:04 PM10/30/19
to
guinne...@gmail.com writes:

> On Tuesday, 29 October 2019 20:31:52 UTC, Rick C. Hodgin wrote:
<cut>
>> New keywords have been added to C with each revision. At some
>> point the keyword "auto" did not exist. int auto = 0; would've
>> been valid. It's not any more.
>
> At what point did C not have the keyword 'auto'?
>
> It's present even in the earliest C manual I can find:
> https://www.bell-labs.com/usr/dmr/www/cman.pdf

It's also in C's predecessor B. I think it's safe to say it has always
been in C, even before C was publicly described.

--
Ben.

Rick C. Hodgin

unread,
Oct 30, 2019, 9:29:28 PM10/30/19
to
On 10/30/2019 11:12 AM, james...@alumni.caltech.edu wrote:
> On Tuesday, October 29, 2019 at 7:36:49 PM UTC-4, Rick C. Hodgin wrote:
>> Exactly when was "auto" reserved by the C committee? That C code
>> I wrote that's been compiling all these years since the 1980s now
>> breaks because they reserved the word auto in the 1990s? 2000s?
>> 2010s?
>
> What are you talking about? ...
>
> "auto" has been a keyword for longer than C has been a language. It is,
> in fact, a carry over from one of C's predecessor languages (I'm not
> sure whether it was introduced in B or in BCPL). It actually made more
> sense in the predecessor language. In C, the only places where you're
> allowed to use "auto" are places where automatic storage duration is the
> default, so leaving out "auto" would give the same result as using it.
> As a result, C code almost never uses "auto". Something about its
> predecessor language was different (I'm not sure what) that made "auto"
> a reasonable thing to use in that language.

I've never used it. I didn't even know it was a C language keyword
until I was reading the C Standard the other day. After reading what
it does I still don't see why it exists. I came across a website
referencing it that reads:

"In the language C auto is a keyword for specifying a storage
duration. When you create an auto variable it has an 'automatic
storage duration'. We call these objects 'local variables'. In
C, all variables in functions are local by default. That’s why
the keyword auto is hardly ever used."

A wasted use of an otherwise desirable functionality as is seen in
C++.

This is what I'm talking about with the C committees being
reactionary rather than visionary. They don't move in ways that make sense.
They move trailing others who are moving in ways they think makes
sense, effectively codifying what's seen in the real world.

It should be more than that. The C committee should be steering.

> That's why C++ was able to give "auto" a new meaning, one that probably
> broke your code when you compiled it in C++.

There was no code. It was a fictitious example conjured up to
prove a point. It would work with any keyword that was later
added to the language.

>> It doesn't matter. I keep saying this, and it's hard for me to
>> flatly and coldly turn my back on something or someone (it takes
>> a real chain of events to make it happen), but I'm done with any
>> attempt to improve C via the C Standard or such related.
>
> If you're done with it, why are you posting new suggestions to
> comp.std.c?

I am done with it now after this go around. I had said previously
I was done with it, but it's very difficult for me to hold rigid
lines on things. I tend to soften over time. It takes a very
special set of circumstances to push me to that point and keep me
there.

--
Rick C. Hodgin

james...@alumni.caltech.edu

unread,
Oct 31, 2019, 11:37:08 AM10/31/19
to
It was needed by C's predecessor, B. It remained somewhat useful in C
up until C99, when removal of the implicit int rule also removed what
little remaining usefulness it had. That's the kind of thing that
happens whenever a language lives long enough - if CAlive ever becomes
anything other than vaporware, I'm sure that it will eventually start
accumulating historical artifacts, too.

> This is what I'm talking about with the C committees being
> reactionary rather than visionary. They don't move in ways that make sense.
> They move trailing others who are moving in ways they think makes
> sense, effectively codifying what's seen in the real world.
>
> It should be more than that. The C committee should be steering.

Once something has been standardized, it's extremely difficult,
bordering on the impossible, to remove it if it later turns out to have
been a mistake. "auto" is a prime example of this. Therefore, the
committee generally prefers not to standardize a new feature until after
it has been successfully implemented and put to use as an extension, so
they can get a better idea of whether or not it can work. As a result, C
is generally steered by the implementors, who in turn are often pushed
by their users, rather than being steered by the committee. I don't see
this as a particularly bad thing.

I think your fundamental problem with this process is not who's doing
the steering, but rather the fact that the people who are doing the
steering are, in general, too competent, sane, and wise to steer the
language in the directions you want it to go (which is not to say that
they're unusually competent, sane, or wise).


> > That's why C++ was able to give "auto" a new meaning, one that probably
> > broke your code when you compiled it in C++.
>
> There was no code. It was a fictitious example conjured up to
> prove a point. It would work with any keyword that was later
> added to the language.

With the exception of "restrict" all new keywords since C99 have been
chosen from the name space reserved for implementations. It looks like
"restrict" is not only the best example you could use, but also probably
the only one. The addition of the other keywords has not broken any
existing strictly conforming code, belying your claim that such breakage
is inevitable.

Rick C. Hodgin

unread,
Oct 31, 2019, 12:38:58 PM10/31/19
to
On Thursday, October 31, 2019 at 11:37:08 AM UTC-4, james...@alumni.caltech.edu wrote:
> On Wednesday, October 30, 2019 at 9:29:28 PM UTC-4, Rick C. Hodgin wrote:
> > This is what I'm talking about with the C committees being
> > reactionary rather than visionary. They don't move in ways that make sense.
> > They move trailing others who are moving in ways they think makes
> > sense, effectively codifying what's seen in the real world.
> >
> > It should be more than that. The C committee should be steering.
>
> Once something has been standardized, it's extremely difficult,
> bordering on the impossible, to remove it if it later turns out to have
> been a mistake.

Deprecation is widespread in every language I've ever used. Even
in Java used for mobile Android development.

It is expected that things will migrate / morph over time as newer
and better and different technologies, abilities, features, or what-
ever the cue is, change, that the tools used previously can still
be used if you want those older features, but if you want to move
forward some features need to be faded out.

> "auto" is a prime example of this. Therefore, the
> committee generally prefers not to standardize a new feature until after
> it has been successfully implemented and put to use as an extension, so
> they can get a better idea of whether or not it can work. As a result, C
> is generally steered by the implementors, who in turn are often pushed
> by their users, rather than being steered by the committee. I don't see
> this as a particularly bad thing.

That's what I disagree with. C should be a steering committee,
as well as a codifying body.

> I think your fundamental problem with this process is not who's doing
> the steering, but rather the fact that the people who are doing the
> steering are, in general, too competent, sane, and wise to steer the
> language in the directions you want it to go (which is not to say that
> they're unusually competent, sane, or wise).

It's that there is a direction C needs to go. It needs the class.
It needs to have an option to tighten up type checking. It needs
some relaxations like not requiring the keyword struct on
references, single-line comments, anonymous unions, and I think the
2nd biggest feature: it should not REQUIRE forward declarations. The
compiler should be allowed to morph into an n-pass compiler where
unresolved forward declarations can be set aside on pass 0, and
then become resolved for pass 1.

To me, these are absolutely fundamental aspects of the nature of
the language. It calls out for it by its own voice, and not even
by my direction. When I sit back and look at what a C-like
programming language needs, this is what IT tells me.

> > > That's why C++ was able to give "auto" a new meaning, one that probably
> > > broke your code when you compiled it in C++.
> >
> > There was no code. It was a fictitious example conjured up to
> > prove a point. It would work with any keyword that was later
> > added to the language.
>
> With the exception of "restrict" all new keywords since C99 have been
> chosen from the name space reserved for implementations. It looks like
> "restrict" is not only the best example you could use, but also probably
> the only one. The addition of the other keywords has not broken any
> existing strictly conforming code, belying your claim that such breakage
> is inevitable.

Then use restrict as the example to prove my point.

Deprecation exists for a reason. You cycle it out 2 or 4 or n
generations, allowing it to be used all that time with the flag
that it's been deprecated, and then remove it. That gives all
actively developed software projects / developers plenty of time to make
the changes. And those that aren't actively developed still run
their old binary code just like always.

It's a different philosophy of use, a different vision I have
for C and C++. It's why CAlive will be successful. Many
developers out there have the same wishes I do, they just don't
realize it yet because all they have is what they've been given.
But you can look at several new programming languages lately
and the features they incorporate and you can see I'm not alone
in this vision. My goals are just to keep it closer to C than
other goals. I want CAlive to compile existing C code with zero
changes and no errors when using a compatibility mode. It will
just do so with the CAlive LiveCode (edit-and-continue) features,
as well as new ABI features like the inquiry, and several others.

C will be relegated to niche areas increasingly as time goes on.
Other languages running on today's incredibly fast hardware are
sufficient for 99% of apps. And in those cases where developers
want more, they won't want to stay at C's low level. And they
won't want to learn C++'s endless features and nuances to write
clean, efficient C++ code. They want something simple, useful,
powerful, flexible, easy to develop in/for, easy to update and
maintain, etc. Enter CAlive's direct target / goals / focus,
coupled also to a faith in Christ applied to the product and its
license, bringing the reality of God back into our daily lives,
yes even our programming lives. *GASP!* Yes, we are supposed
to have God active in ALL areas of our lives, not just at a
church on Sundays and Tuesdays for Bible Study. All the time He
is to be maintained AHEAD of all of our efforts, including
everything we do day-in / day-out.

CAlive is coming into existence for a reason. And oh how I wish
those associated with C would've been more accommodating to new
ideas, having a vision, pushing forward rather than following be-
hind. It would've saved me years of work.

--
Rick C. Hodgin

Keith Thompson

unread,
Oct 31, 2019, 1:14:25 PM10/31/19
to
David Brown <david...@hesbynett.no> writes:
[...]
> Once "implicit int" was removed from the language, you could no longer
> write "auto x;" - you had to write "auto int x;", "register int x;" or
> just "int x;" (with the auto implied) for a non-static local variable.
> As there is no benefit in including the "auto", it is rarely seen. But
> compilers /must/ still support "auto int x;" if they try to be
> conforming, as it is still allowed even in modern standards. And many
> still allow "auto x;" (with implicit int) when compiling in lax modes,
> for support for older code.

Yes, many compilers allow old forms for compatibility with earlier
editions of the standard. But as far as the standard is concerned,
allowing "auto x;" is no different from allowing any other non-C
syntax. Backward compatibility extensions have no special status
in the standard; the old forms are simply no longer C.

(Expanding on what you wrote, not disagreeing.)

David Brown

unread,
Oct 31, 2019, 2:50:22 PM10/31/19
to
On 31/10/2019 17:38, Rick C. Hodgin wrote:
> On Thursday, October 31, 2019 at 11:37:08 AM UTC-4, james...@alumni.caltech.edu wrote:
>> On Wednesday, October 30, 2019 at 9:29:28 PM UTC-4, Rick C. Hodgin wrote:
>>> This is what I'm talking about with the C committees being
>>> reactionary rather than visionary. They don't move in ways that make
>>> sense. They move trailing others who are moving in ways they think
>>> makes sense, effectively codifying what's seen in the real world.
>>>
>>> It should be more than that. The C committee should be steering.
>>
>> Once something has been standardized, it's extremely difficult,
>> bordering on the impossible, to remove it if it later turns out to have
>> been a mistake.
>
> Deprecation is wide-spread in every language I've ever used. Even
> in Java used for mobile Android development.
>

A very important feature of C as a language is that it does /not/
deprecate anything without very good reason. C does not gain new
features easily - and it loses features even more reluctantly. This is
both a benefit and a limitation - but the C community (compiler vendors,
programmers, and standards committee) have taken the view that C is to
be a stable language. This means - unlike many other languages - you
can take old C code, compile it with a new C compiler (on a platform
that didn't exist when the code was written), and have a good chance of
everything working correctly. The disadvantage is that the language
does not gain much in the way of new features - C is what it is, and
your choice is to accept that or find a different language.

I have personally made updated versions of C programs I wrote over
twenty years ago, and the language still works the same on modern tools.

From the C99 Rationale:

"""
The Committee is content to let C++ be the big and ambitious language.
While some features of C++ may well be embraced, it is not the
Committee’s intention that C become C++.
"""

Indeed, I'd recommend reading the entire introduction to the C99
Rationale to get an idea of how C changes.

> It is expected that things will migrate / morph over time as newer
> and better and different technologies, abilities, features, or
> whatever the cue is, change, that the tools used previously can still
> be used if you want those older features, but if you want to move
> forward some features need to be faded out.

That is common, but not necessary - and while it is often not a problem,
it is not always desirable. That is why there is a place for a stable
language like C, as well as more innovative and faster-changing languages.

>
>> "auto" is a prime example of this. Therefore, the
>> committee generally prefers not to standardize a new feature until after
>> it has been successfully implemented and put to use as an extension, so
>> they can get a better idea of whether or not it can work. As a result, C
>> is generally steered by the implementors, who in turn are often pushed
>> by their users, rather than being steered by the committee. I don't see
>> this as a particularly bad thing.
>
> That's what I disagree with. C should be a steering committee,
> as well as a codifying body.

The C community as a whole disagrees with you. You may want a more
top-down approach for your own languages (and top-down has its benefits too), but
that is not the way C works.

>
>> I think your fundamental problem with this process is not who's doing
>> the steering, but rather the fact that the people who are doing the
>> steering are, in general, too competent, sane, and wise to steer the
>> language in the directions you want it to go (which is not to say that
>> they're unusually competent, sane, or wise).
>
> It's that there is a direction C needs to go.

Why? C is fine as it is, receiving mostly just small fixes or improvements.

> It needs the class.

No, it does not. That would give you a different language. There are
already plenty of languages that are a bit like C, but have classes -
there is no need for C to become one of them.

> It needs to have an option to tighten up type checking. It needs
> some relaxations like not requiring the keyword struct on
> references, single-line comments, anonymous unions, and I think the 2nd
> biggest feature: should not REQUIRE forward declarations. The
> compiler should be allowed to morph into an n-pass compiler where
> unresolved forward declarations can be set aside on pass 0, and
> then become resolved for pass 1.

Single-line comments were added in C99, and anonymous unions moved from
a very common extension to standard in C11. Surely it is a good idea to
learn what the C language and the C standards say, before criticising it?

>
> To me, these are absolutely fundamental aspects of the nature of
> the language.

Then you are not talking about C. They may well be features you like to
see in programming languages - everyone has their own preferences and
opinions. But you cannot call them "absolutely fundamental aspects of
the nature of the language", given that the C language has been in heavy
use for about 50 years and does not have these features!

> It calls out for it by its own voice, and not even
> by my direction. When I sit back and look at what a C-like pro-
> gramming language needs, this is what IT tells me.
>

This is all your personal opinion and preferences, nothing more. If you
feel that strongly, then use a different programming language that has
the features you want, not C. You can't blame C for not being the
language you want to use - that's just silly. It's like blaming an
apple pie for not tasting chocolaty when chocolate cake is your favourite.

> C will be relegated to niche areas increasingly as time goes on.

Odd - C has existed for 50 years or so, and is still hugely popular and
important. Most other languages either remain niche, or are popular for
a few years before fading away. Your claims are completely disconnected
from reality.

> Other languages running on today's incredibly fast hardware, are
> sufficient for 99% of apps. And in those cases where developers
> want more, they won't want to stay at C's low level.

No one has ever suggested that C is the best language choice for all
use. Most people use C for coding problems that suit C, and other
languages for coding problems best solved in other languages.

> And they
> won't want to learn C++'s endless features and nuances to write
> clean, efficient C++ code.

One of the few challengers to C for a long-lived and endlessly popular
language, is C++.

> They want something simple, useful,
> powerful, flexible, easy to develop in/for, easy to update and
> maintain, etc.

People who want that, use other languages. There are lots to choose from.

> Enter CAlive's direct target / goals / focus,
> coupled also to a faith in Christ applied to the product and its
> license, bringing the reality of God back into our daily lives,
> yes even our programming lives. *GASP!* Yes, we are supposed
> to have God active in ALL areas of our lives, not just at a
> church on Sundays and Tuesdays for Bible Study. All the time He
> is to be maintained AHEAD of all of our efforts, including every-
> thing we do day-in / day-out.
>

If that's what you want, that's fine for /you/ - and anyone who shares
your opinions and preferences. But it does not mean there is something
wrong with C (or any other language).

> CAlive is coming into existence for a reason. And oh how I wish
> those associated with C would've been more accommodating to new
> ideas, having a vision, pushing forward rather than following
> behind. It would've saved me years of work.
>

Did it never occur to you that no one who works with C was interested in
any of your ideas? An idea being "new" does not automatically make it a
good idea - and even an idea that is good for some programming languages
is not automatically good for other languages. I have heard countless
ideas from you over the years, and there were barely two or three that I
ever thought had useful potential for any programming language - none
that made sense for C. (There were more that could have been turned
into good ideas, had you been open to discussion.) If you think your
ideas will let you build a wonderful new language "CAlive", then great -
go for it. But please cut out the delusion that you alone know what is
good for C.

Rick C. Hodgin

unread,
Nov 1, 2019, 10:20:29 AM11/1/19
to
On Wednesday, October 30, 2019 at 11:12:26 AM UTC-4, james...@alumni.caltech.edu wrote:
> On Tuesday, October 29, 2019 at 7:36:49 PM UTC-4, Rick C. Hodgin wrote:
> > int8_t
> > int16_t
> > int32_t
> > int64_t
> >
> > Why _t?
> >
> > In CAlive I defined signed and unsigned:
> >
> > s8, s16, s32, s64
> > u8, u16, u32, u64
> >
> > And can you guess what these are?
>
> No. Without the _t, I would suspect them of being variable names, not
> types - which is the purpose of that convention.

What is "int"? I come from an assembly background. INT is an
interrupt. Okay, so in C int is an integer. Got it.

You learn what key / core features of a language are and then
use them as foundations. The same holds true for s8, s16, s32,
s64, and u8, u16, u32, u64. I also define other forms which
auto-upsize in use, including s24, s40, s48, s54, and their u
counterparts. They allow for packed data storage of unusual
sizes.

When I first heard the website Yahoo, or the website YouTube,
I thought those were ridiculous names. It would be like me
coming up with MeTalkie for a communications website. It may
sound silly today, but if it were popular it would just be a
name.

I once was talking to my computer illiterate neighbors who
needed help getting on the Internet and getting setup with
Windows 98 at that time.

The mother and the daughter were there, and I asked them this
literal question: "Is your web browser home page set to
launch into Yahoo.com?" And they both burst out laughing so
hard they started crying. They had no idea what I was talking
about, and only knew the words from everyday experience. We
still talk about that day. And when you look at that question
from that point of view, it really is quite a series of words
strung together.

Some of C's syntax is quite a thing. C++ even more so. But
once you get used to it, become fluent in it, it's all as it
should be.

We learn by doing, James. It's true of all things, including
programming languages. But I'll still argue all day long that
tagging things that will be used in every function with an _t
is beyond ridiculous, into a whole new realm of even-more-than-
ridiculous. It forces you to create your own typedefs or
#define statements to make them something more desirable.

--
Rick C. Hodgin

Philipp Klaus Krause

unread,
Aug 4, 2020, 5:00:27 AM8/4/20
to
Am 25.10.19 um 16:26 schrieb Rick C. Hodgin:
> On Friday, October 25, 2019 at 9:53:05 AM UTC-4, Philipp Klaus Krause wrote:
>> Am 25.10.19 um 14:26 schrieb Rick C. Hodgin:
>>> I'd like to propose that the nocode keyword be added to standard
>>> C. […]
>>
>> I don't think adding a keyword to silence warnings has a chance of being
>> accepted. Try with an attribute instead.
>
>
> It introduces a new opportunity to issue diagnostics by giving
> the developer positive control over empty code blocks, whereas
> today it's an unknown relegated to parsing comments in a best
> case, or examining the logic in a worst case to see if it is
> actually doing what it should.
>
> With nocode, that uncertainty is replaced with certainty.
>
> It's a good addition.
>

No, it is not. It doesn't change the semantics. Such stuff is exactly
what attributes were introduced for (so adding keywords would not be
necessary). Compare e.g. the fallthrough attribute.

Philipp Klaus Krause

unread,
Aug 4, 2020, 5:18:45 AM8/4/20
to
Am 31.10.19 um 02:29 schrieb Rick C. Hodgin:
>
> This is what I'm talking about with the C committees being
> reactionary rather than visionary. They don't move in ways that make
> sense. They move trailing others who are moving in ways they think
> makes sense, effectively codifying what's seen in the real world.
>
> It should be more than that. The C committee should be steering.

While you might disagree with how the comitee works, the current way has
even been stated explicitly. From the C2X charter N2086:

"8. Codify existing practice to address evident deficiencies. Only those
concepts that have some prior art should be accepted. (Prior art may
come from implementations of languages other than C.) Unless some
proposed new feature addresses an evident deficiency that is actually
felt by more than a few C programmers, no new inventions should be
entertained."

"13. Unlike for C99, the consensus at the London meeting was that there
should be no invention, without exception. Only those features that have
a history and are in common use by a commercial implementation should be
considered. Also there must be care to standardize these features in a
way that would make the Standard and the commercial implementation
compatible."