
Most important aspect of any compiler (C or otherwise)


Rick C. Hodgin, Oct 4, 2016, 2:03 PM

For me? Barring the obvious things like functionality and error-free binary
code, it would have to be edit-and-continue. And without equal.

I make so many typing mistakes, completely non-reflective of my actual
thought processes, that it's absolutely amazing. I think one thing and
something completely different comes out of my fingers. This is such a
continuous, ongoing aspect of my life, and there are so many times I
not only don't catch it, but actually read it back as correct when
proofreading. It reminds me of the Monty Python sketch, "Problem with
words, Thripshaw's Disease":

Doctor: "But recently you have been having this problem with your
word order."
Patient: "Absolutely. What makes it worse is that sometimes at the
end of a sentence I'll come out with the wrong fusebox."
Doctor: "Fusebox?"
Patient: "And the thing about saying the wrong word is that a) I
don't notice it, and b) sometimes orange water given
bucket of plaster."

-----
As such, I view edit-and-continue as the single most desirable asset
for any software developer. It is a primary area of focus for my CAlive
compiler and IDE.

Best regards,
Rick C. Hodgin

John Gordon, Oct 4, 2016, 3:00 PM

In <20aae408-4b77-483a...@googlegroups.com> "Rick C. Hodgin" <rick.c...@gmail.com> writes:

> For me? Barring the obvious things like functionality and error-free binary
> code, it would have to be edit-and-continue. And without equal.

So when the compiler encounters an error, it retains the compilation
artifacts (parse tree, symbol table, etc.) up to that point, and when the
user corrects the error it continues compiling where it left off?

What if the correction changed something that has already been compiled?

How much time can it possibly save? Compilation is pretty fast, no?

--
John Gordon A is for Amy, who fell down the stairs
gor...@panix.com B is for Basil, assaulted by bears
-- Edward Gorey, "The Gashlycrumb Tinies"

Rick C. Hodgin, Oct 4, 2016, 3:09 PM

On Tuesday, October 4, 2016 at 3:00:45 PM UTC-4, John Gordon wrote:
> In <20aae408-4b77-483a...@googlegroups.com> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
>
> > For me? Barring the obvious things like functionality and error-free binary
> > code, it would have to be edit-and-continue. And without equal.
>
> So when the compiler encounters an error, it retains the compilation
> artifacts (parse tree, symbol table, etc.) up to that point, and when the
> user corrects the error it continues compiling where it left off?
>
> What if the correction changed something that has already been compiled?
>
> How much time can it possibly save? Compilation is pretty fast, no?

There are two approaches to the design. Microsoft uses top-down
recompilation, which looks for the deltas or diffs between the
generated binary images and then updates things accordingly. It often
breaks data or functions on the stack, or fails to compile completely
if you stray too far.

The model I'm using for CAlive takes the approach of saving the full
compilation state. Compilation is carried out continuously as things
are changed. In general, dependency trees are maintained, and when
something changes, everything that depended upon it is stricken at the
source-line level and recompiled.

The last three passes of the compiler are static, and these apply the
deltas to the binary image and data, even auto-injecting stale members
during compilation to deal with things that have been actively deleted,
but remain in data or on the stack.

And a full recompile can be signaled at any time with a single command
or button click in the IDE.

And to satiate Ben's oft-made comment at this point: these are all
planned operations. They are not coded yet, but they are almost
completely designed. A few aspects still need solutions.

jacobnavia, Oct 4, 2016, 3:56 PM

Le 04/10/2016 à 21:08, Rick C. Hodgin a écrit :
> The model I'm using for CAlive takes the approach of saving the full
> compilation state. Compilation is carried out continuously as things
> are changed. In general, dependency trees are maintained, and when
> something changes, everything that depended upon it is stricken at the
> source-line level and recompiled.

On paper, everything runs, everything is very clear, etc. Since CAlive
is vaporware and you do not seem to know what you are talking about, I
have some doubts about even the feasibility of what you say you will do.

I have the feeling that you start these discussions, vaguely on topic,
to mask the fact that you post religious crap all over the place.

Rick C. Hodgin, Oct 4, 2016, 4:06 PM

CAlive already has a lot of its code developed for VFrP and VJr, two
xbase languages that are also incomplete, but possess a large base of
development (over 100K lines between the two of them, plus their other
supporting apps). And I've spent a great deal of time offline going
over how to handle the processing of all of the features I've added to
CAlive. I have nearly every one of them sorted out, but there are a
few where I'm still trying to figure out the best way to incorporate
them into the overall RDC (Rapid Development Compiler) framework I'm
creating. RDC is not just the CAlive compiler, but the vehicle that
will make CAlive, and prayerfully many other languages, exist.

My goals in software design are comprehensive. I am targeting a 25-year
plan for my life. I'm 47 today. I began on the project 4 years ago. I
have 21 years left to complete all of the things I want to complete. I
will work on it as much as I'm able, and whatever remains I'll leave for
my posterity to continue.

As for the "religious" posts, I have no hidden agenda, Jacob. When I
post something about Jesus it's because it's appropriate to do so. There
are people in this forum (and the other forums I post into) who will never
go to a church, never visit a religious forum. They are valuable people
too, and I will make sure that when I stand before the Lord someday I will
be able to look Him in the eye and say, "Yes, Lord, I told them as best as
I was able."

This particular thread was started today because I have spent at least an
hour correcting my many typing mistakes on source code I've developed.
Some days are worse than others, and today's been pretty bad. It causes
me frustration when it keeps happening over and over on the most simple
things. I long for an edit-and-continue compiler in the language I'm
currently working in (Microsoft Visual FoxPro), which is why I began
writing VFrP and VJr, both open source alternatives for FoxPro and other
xbase languages.

BartC, Oct 4, 2016, 4:29 PM

On 04/10/2016 20:08, Rick C. Hodgin wrote:
> On Tuesday, October 4, 2016 at 3:00:45 PM UTC-4, John Gordon wrote:
>> In <20aae408-4b77-483a...@googlegroups.com> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
>>
>>> For me? Barring the obvious things like functionality and error-free binary
>>> code, it would have to be edit-and-continue. And without equal.
>>
>> So when the compiler encounters an error, it retains the compilation
>> artifacts (parse tree, symbol table, etc.) up to that point, and when the
>> user corrects the error it continues compiling where it left off?
>>
>> What if the correction changed something that has already been compiled?
>>
>> How much time can it possibly save? Compilation is pretty fast, no?
>
> There are two approaches to the design. Microsoft uses top-down
> recompilation, which looks for the deltas or diffs between the
> generated binary images and then updates things accordingly. It often
> breaks data or functions on the stack, or fails to compile completely
> if you stray too far.

Microsoft Visual Studio, if that's what you're using, is a massive 1.5
million *file* application. Files, not lines!

Maybe you need something along those lines for an application that could
otherwise take *hours* to rebuild.

It might well however have taken them considerably more than 21 man years.

> The model I'm using for CAlive takes the approach of saving the full
> compilation state. Compilation is carried out continuously as things
> are changed. In general, dependency trees are maintained, and when
> something changes, everything that depended upon it is stricken at the
> source-line level and recompiled.

I used a completely different approach decades ago when developing
applications. Most user-level commands and features were implemented as
independent scripts. Scripts could be edited, compiled, run and re-run
from within the application, without needing to restart the app or go
through the same series of steps necessary to set up the data. If it
went wrong, I was still in the application.

But this is much, much simpler than what I think you're trying to do,
which I believe is some restart at the statement level rather than module.

--
Bartc

Rick C. Hodgin, Oct 4, 2016, 4:46 PM

On Tuesday, October 4, 2016 at 4:29:12 PM UTC-4, Bart wrote:
> On 04/10/2016 20:08, Rick C. Hodgin wrote:
> > On Tuesday, October 4, 2016 at 3:00:45 PM UTC-4, John Gordon wrote:
> >> In <20aae408-4b77-483a...@googlegroups.com> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
> >>
> >>> For me? Barring the obvious things like functionality and error-free binary
> >>> code, it would have to be edit-and-continue. And without equal.
> >>
> >> So when the compiler encounters an error, it retains the compilation
> >> artifacts (parse tree, symbol table, etc.) up to that point, and when the
> >> user corrects the error it continues compiling where it left off?
> >>
> >> What if the correction changed something that has already been compiled?
> >>
> >> How much time can it possibly save? Compilation is pretty fast, no?
> >
> > There are two approaches to the design. Microsoft uses top-down
> > recompilation, which looks for the deltas or diffs between the
> > generated binary images and then updates things accordingly. It often
> > breaks data or functions on the stack, or fails to compile completely
> > if you stray too far.
>
> Microsoft Visual Studio, if that's what you're using, is a massive 1.5
> million *file* application. Files, not lines!

Visual Studio is a full IDE with a lot more features than I intend for my
IDE. It has SQL query abilities, editors for GUI components, etc. I
will offer those types of features via extensions that people can add.

> Maybe you need something along those lines for an application that could
> otherwise take *hours* to rebuild.

The largest application I have compiles in about 10 seconds in Visual
Studio, and about the same in GCC.

I don't need all of the functionality of Visual Studio. The main features
I use are the general IDE ability to redefine the workspace as required,
the Code Definition window (invaluable), and the refactoring abilities it
has, and those it can feed into add-on programs like Visual Assist X by
Whole Tomato (worth its weight in gold).

> It might well however have taken them considerably more than 21 man years.

No doubt.

> > The model I'm using for CAlive takes the approach of saving the full
> > compilation state. Compilation is carried out continuously as things
> > are changed. In general, dependency trees are maintained, and when
> > something changes, everything that depended upon it is stricken at the
> > source-line level and recompiled.
>
> I used a completely different approach decades ago when developing
> applications. Most user-level commands and features were implemented as
> independent scripts. Scripts could be edited, compiled, run and re-run
> from within the application, without needing to restart the app or go
> through the same series of steps necessary to set up the data. If it
> went wrong, I was still in the application.

I've thought a lot about how to implement what I call LiveCode (edit-and-
continue). I concluded that I want the ability to connect to a running
binary image that's been deployed on a user machine, to be able to remote
into that binary image, apply the appropriate program database, and allow
full editing as it is. But the goals of an end-user application
are speed and efficiency. As such, support for the binary image must
exist through the entire compiler stack, including optimizations.

It's a tall order. I'm giving it my best, though it's undoubtedly less
than what I could do with others helping me. It has always been my
intention to have others helping me, but so far no one's come on board.
A few inquiries, a few promises, but nothing's come to fruition yet.

> But this is much, much simpler than what I think you're trying to do,
> which I believe is some restart at the statement level rather than module.

CAlive allows something I call Horizontal Debugging, which allows you to
even step into an expression, through individual components. So, CAlive
has to maintain context information about where to restart for the inner
parts of an expression, as well as lines.

It's a comprehensive project. Everyone I've shown the details to in
person so far tells me I'm crazy. But it's the goal I have and it's
what's upon my heart. It's been there since the early 90s. I've worked
on various
parts at various times since then, but as I matured and began to look at
the larger "life's meaning" questions, I began to look into the philosophy
of what I was trying to do. I stepped back and considered the project in
the overall scheme of things, and not just as a tool, but as an offering
unto the Lord, unto mankind, one given over to helping other people and
making their lives better, rather than doing it for money. It's an outward
expression of the love I feel inside from the relationship I have with my
Lord and Savior, which is also why I haven't found anyone to come on board
and help me yet ... because I note that point in my source code, mention it
in emails, and wear it outwardly in the things I do. People recognize that
for what it is and stay away, always to my pain.

I don't imagine it will always be like this. Either the end-time events
will make it impossible to continue to develop due to life's hardships at
that time, or the rapture will take place, or other people will ultimately
at some point see the value in helping me and then come on board to do just
that, also helping themselves and other people as well.

I keep longing for that day when developer #2 comes on board to help out
on an ongoing basis. So far, I've had a couple nibbles, but other than
that it's just me and the crickets. And let me tell you, four years of
that kind of isolation in development, with people telling you over and
over what a ludicrous idea it is, or how you'll never be able to
complete it, or how you don't have the skills to complete it, is really
quite a continuous pounding to endure when all you possess within is
the desire to give people the absolute best of what you have, coupled
to the absolute best of what they have, as a collaborative effort given
over to making other people's lives better (and also my own, of course,
as I think these tools will be wonderful when completed).

There's only one reason I am able to keep going. I've tried many times to
quit, and literally every time I've gotten to the point where I'm ready to
walk away, that very day something would come which encouraged me anew.
I've actually stood there at times with my mouth hanging open in disbelief
at how it's all happened.

I'm resolved now to continue on so long as I can. But, I can also tell
that as I'm getting older I'm not able to think as clearly as I used to,
or go as deep as I used to, or work for as long on a problem as I used to.
I'm losing my edge and things just take longer now. But, it's all part
of it. Whereas things used to come to me so easily, now they're harder.
I have to really want it now, whereas before it was just given to me and
I ran with it.

BartC, Oct 4, 2016, 7:31 PM

On 04/10/2016 21:46, Rick C. Hodgin wrote:
> On Tuesday, October 4, 2016 at 4:29:12 PM UTC-4, Bart wrote:

>> But this is much, much simpler than what I think you're trying to do,
>> which I believe is some restart at the statement level rather than module.
>
> CAlive allows something I call Horizontal Debugging, which allows you to
> even step into an expression, through individual components. So, CAlive
> has to maintain context information about where to restart for the inner
> parts of an expression, as well as lines.

In that case you'd probably find it a lot easier to run an interpreter
on your code while debugging, rather than trying to do it on binary
native code.

> It's a comprehensive project.

> I keep longing for that day when developer #2 comes on board to help out
> on an ongoing basis. So far, I've had a couple nibbles, but other than that
> it's just me and the crickets ongoing, and let me tell you that four years
> of that kind of not only isolation in development

I've worked in isolation since around 1980, other than contacts with
clients. I found it a novelty recently to see youtube videos with people
actually discussing programming stuff out loud!

> There's only one reason I am able to keep going. I've tried many times to
> quit, and literally every time I've gotten to the point where I'm ready to
> walk away,

I've had that too - why not let someone else take over the headache of
creating these languages, supporting the implementations, and providing
large numbers of libraries. And yet...

But I just think of the stuff I do as a diversion. Like other people do
sudoku or build things in their sheds. (It's also cool when people see
me doing stuff on a laptop when travelling, yet there is no WiFi. What
can I possibly be doing!)

Just don't let it take over your life.

--
Bartc

Rick C. Hodgin, Oct 4, 2016, 7:58 PM

Bartc wrote:
> Rick C. Hodgin wrote:
> > It's a comprehensive project.
> >
> > I keep longing for that day when developer #2 comes on board to help out
> > on an ongoing basis. So far, I've had a couple nibbles, but other than that
> > it's just me and the crickets ongoing, and let me tell you that four years
> > of that kind of not only isolation in development
>
> I've worked in isolation since around 1980, other than contacts
> with clients.

I'm trying to create a God-fearing GNU / Linux alternative with hardware
support as well, including CPUs, motherboards, peripheral hardware, and
even manufacturing at some later point. It is a full hardware and
software stack created in direct response to the teachings of Jesus
Christ, in giving unto others rather than hoarding behind copyrights
and patents. It is a labor of love applied in this industry, something
I would like to see happen in all industries that are of benefit. It's
part of the Village Freedom Project: freeing us from traditional models
of "seek gain for self," to instead apply the teachings of Jesus Christ
of "seek gain for others, and in so doing also gain for yourself,
through their gain."

It is not something I can do alone, nor do I want to. I have good ideas, a direct
philosophical vision, a strong drive, and a solid and growing knowledge and
understanding of what's required in all disciplines for my project to
succeed.

People can benefit from what I have to offer, and the One from / for whom
I'm offering it. By following the teachings of Jesus Christ, everybody gains
maximally. That is my goal, and it is not mine alone. Many people just don't
realize yet that they also have that goal.

Ian Collins, Oct 4, 2016, 8:42 PM

On 10/5/16 08:08 AM, Rick C. Hodgin wrote:
> On Tuesday, October 4, 2016 at 3:00:45 PM UTC-4, John Gordon wrote:
>> In <20aae408-4b77-483a...@googlegroups.com> "Rick C. Hodgin" <rick.c...@gmail.com> writes:
>>
>>> For me? Barring the obvious things like functionality and error-free binary
>>> code, it would have to be edit-and-continue. And without equal.
>>
>> So when the compiler encounters an error, it retains the compilation
>> artifacts (parse tree, symbol table, etc.) up to that point, and when the
>> user corrects the error it continues compiling where it left off?
>>
>> What if the correction changed something that has already been compiled?
>>
>> How much time can it possibly save? Compilation is pretty fast, no?
>
> There are two approaches to the design. Microsoft uses top-down
> recompilation, which looks for the deltas or diffs between the
> generated binary images and then updates things accordingly. It often
> breaks data or functions on the stack, or fails to compile completely
> if you stray too far.
>
> The model I'm using for CAlive takes the approach of saving the full
> compilation state. Compilation is carried out continuously as things
> are changed. In general, dependency trees are maintained, and when
> something changes, everything that depended upon it is stricken at the
> source-line level and recompiled.

All well and good for a single file or trivial application, but things
would get horribly complex (and memory consuming) if you were working on
one part of a large and complex application.

Most popular IDEs use some form of background compilation which
highlights errors as you type. This will take care of most of your
brain-to-finger problems. Unit tests will catch the rest!

--
Ian

Kenny McCormack, Oct 4, 2016, 9:07 PM

In article <8a439a6c-5a61-43dd...@googlegroups.com>,
Rick C. Hodgin <rick.c...@gmail.com> wrote:
...
>I'm trying to create a God-fearing GNU / Linux alternative with
>hardware support as well, including CPUs, motherboards, peripheral
>hardware, and even manufacturing at some late point. It is a full
>hardware and software stack created in direct response to the teachings
>of Jesus Christ, in giving unto others rather than hoarding behind
>copyrights and patents. It is a labor of love applied in this industry,
>something I would like to see happen in all industries that are of
>benefit. It's part of the Village Freedom Project, of freeing us from
>traditional models of "seek gain for self," to instead apply the
>teachings of Jesus Christ of "seek gain for others, and in so doing
>also gain for yourself, through their gain."

Somebody needs to get back on their meds....

>It is not something I can do alone, nor do I want to. I have good
>ideas, a direct philosophical vision, a strong drive, and a solid
>and growing knowledge and understanding of what's required in all
>disciplines for my project to succeed.

Screwball.

>People can benefit from what I have to offer, and the One from / for
>whom I'm offering it. By following the teachings of Jesus Christ,
>everybody gains maximally. That is my goal, and it is not mine alone.
>Many people just don't realize yet that they also have that goal.

Off Topic.

>Best regards,
>Rick C. Hodgin

More than a few screws loose.


--
The randomly chosen signature file that would have appeared here is more than 4
lines long. As such, it violates one or more Usenet RFCs. In order to remain in
compliance with said RFCs, the actual sig can be found at the following web address:
http://www.xmission.com/~gazelle/Sigs/RoyDeLoon

Malcolm McLean, Oct 4, 2016, 10:12 PM

On Tuesday, October 4, 2016 at 9:46:21 PM UTC+1, Rick C. Hodgin wrote:
>
> It's a comprehensive project. Everyone I've shown the details to in
> person so far tells me I'm crazy. But it's the goal I have and it's
> what's upon my heart. It's been there since the early 90s.
>
You need micro-goals. Plan something which gives a real benefit to a
user, and which you know you can achieve within a few weeks. Then
implement that and release it.

Don't underestimate the time things take. I made about 20 commits
to Baby X, and all I did was essentially move code about in folders
and set paths in IDEs. Then documenting my binary image library has
taken a long time. I've been floating B64 for quite some time now,
and whilst it's a far simpler project than CAlive, I've yet to
write the first line of the first prototype interpreter.

Rick C. Hodgin, Oct 5, 2016, 7:53 AM

Ian Collins wrote:
> All well and good for a single file or trivial application, but
> things would get horribly complex (and memory consuming)
> if you were working on one part of a large and complex application.

CAlive still supports traditional stand-alone compile + link for
all apps. LiveCode is a build option, a feature that must be turned
on by a switch; otherwise the compiler generates an object file, or
auto-links to the final output.

Memory is cheap. Storage is cheap. My language goals today look
to the future, not the past. Certain features should exist, and I
target them.

Malcolm McLean, Oct 5, 2016, 8:04 AM

However, if a resource like memory becomes cheap, that often means it
is more, rather than less, important to use it efficiently. A very
constrained machine can't hold much in memory anyway, so it has to
parse files line by line. With a machine with a decent memory, you
naturally load up the entire code base. But that might include
automatically generated files such as those created by the Baby X
resource compiler. As programs get bigger and display more images, the
rgba dumps get larger and larger, and memory can start to be an issue
in a way that it wasn't with the line-based parser.

Rick C. Hodgin, Oct 5, 2016, 8:15 AM

Exactly. My goals are for the hardware that will exist in the 2020s and
beyond. I do not imagine constrained general purpose machines in that
timeframe, except for explicit purposes (wristwatches may use inexpensive
general purpose hardware, but with fixed resources because they don't
need more).

In addition, the build machine does not need to be the target machine.
Build and test it on the large desktop, then deploy it on the smaller
device.

supe...@casperkitty.com, Oct 5, 2016, 11:09 AM

On Tuesday, October 4, 2016 at 6:31:08 PM UTC-5, Bart wrote:
> > CAlive allows something I call Horizontal Debugging, which allows you to
> > even step into an expression, through individual components. So, CAlive
> > has to maintain context information about where to restart for the inner
> > parts of an expression, as well as lines.
>
> In that case you'd probably find it a lot easier to run an interpreter
> on your code while debugging, rather than trying to do it on binary
> native code.

While Visual Studio can do amazing things with edit-and-continue even in C
code that makes heavy use of macros, there are many cases where code cannot
be edited without abandoning execution. In languages like Forth where a
programmer manipulates code objects rather than merely editing a source
file, the system can maintain a clear relationship between the code and the
execution state, but in languages which are designed on an "edit source and
then compile" model things can get murky. No matter how brilliant an IDE
may be at handling edit-and-continue features, there will be cases where it
is not possible to make a necessary change without restarting the program.

Rick C. Hodgin, Oct 5, 2016, 2:52 PM

Outside of express hardware state changes, or express external API interface
connections, that's not true, Mr. Supercat. All software changes can be made
to update everything related to its ABI appropriately.

My literal goal with RDC and LiveCode is to be able to create a blank
project, create a single myapp.ca file, begin with the code:

int main(int argc, char* argv[])
{
return 0;
}

Compile it, single-step into it, and then continue typing to build some
equivalent of something like the entire LibreOffice suite without ever
stopping execution. So long as it's just software, and you have the
source code and are running a CAlive-compiled version of every loaded
DLL, it should all work.

Now, that's the goal and the plan. We'll soon see if I can actually do
it.
supe...@casperkitty.com, Oct 5, 2016, 5:08 PM

On Wednesday, October 5, 2016 at 1:52:19 PM UTC-5, Rick C. Hodgin wrote:
> On Wednesday, October 5, 2016 at 11:09:38 AM UTC-4, supercat wrote:
> > While Visual Studio can do amazing things with edit-and-continue even in C
> > code that makes heavy use of macros, there are many cases where code cannot
> > be edited without abandoning execution. In languages like Forth where a
> > programmer manipulates code objects rather than merely editing a source
> > file, the system can maintain a clear relationship between the code and the
> > execution state, but in languages which are designed on an "edit source and
> > then compile" model things can get murky. No matter how brilliant an IDE
> > may be at handling edit-and-continue features, there will be cases where it
> > is not possible to make a necessary change without restarting the program.
>
> Outside of express hardware state changes, or express external API interface
> connections, that's not true, Mr. Supercat. All software changes can be made
> to update everything related to its ABI appropriately.

That's only true if an implementation can track everything having to do
with where objects and pointers come from. Given:

struct foo { int arr[4]; int y; };
struct foo bar;

What should happen if a pointer to bar.y is passed to a function, code is
stopped within that function, and array "arr" is changed to 5 elements?

Rick C. Hodgin, Oct 5, 2016, 5:25 PM

Supercat wrote:
> What should happen if a pointer to bar.y is passed to a function,
> code is stopped within that function, and array "arr" is changed to 5 elements?

The location of all data is known. Pointers can be examined to see
what they point into in ABI_stale, to see how things are translated
into ABI_new, and a series of data change ops are created, including
updating pointer values.

Everything in the ABI is known. It's a natural and resolvable projection
from the logical image into the physical.

Chris M. Thomasson, Oct 5, 2016, 9:03 PM

When do you think your compiler might be in a state that allows for a
beta test?

Rick C. Hodgin, Oct 5, 2016, 9:15 PM

Chris M. Thomasson wrote:
> When do you think your compiler might be in a state that allows
> for a beta test?

I intend to be using it (James 4:15) around the 2018/2019 switchover,
so I would say early/mid-2018 for beta versions. I'd like to have Supercat
add fully compliant C90 and C99 support in 2019/2020.

Malcolm McLean, Oct 5, 2016, 10:08 PM

We moved to Agile recently. Whilst it's still too early to say how it
will work, it seems to be much better. What used to happen was that
we'd agree on a deadline "comfortably off", which turned out to be
pure fantasy. Now we have micro-deadlines, and add a bit of
functionality in each week-long sprint.

Rick C. Hodgin, Oct 5, 2016, 10:20 PM

Malcolm McLean wrote:
> We moved to Agile recently. Whilst it's still too early to say how
> it will work, it seems to be much better. What used to happen
> was that we'd agree on a deadline "comfortably off",
> which turned out to be pure fantasy. Now we have micro-deadlines,
> and add a bit of functionality in each week-long sprint.

I do not expect to be working alone forever. This effort I'm in
pursuit of has value, and at some point other Christians will come
on board and we together will complete the entirety of the project,
not just the compiler.

For now it's just me. I have to maintain the big-picture vision and
pursue it. I'm only able to work on the code outside of other
commitments, so it's slow going, and it tends to come in spurts over
time.

Jerry Stuckle

Oct 5, 2016, 10:22:31 PM
In the year twenty-five twenty-five, if man is still alive...

--
==================
Remove the "x" from my email address
Jerry Stuckle
jstu...@attglobal.net
==================

Malcolm McLean

Oct 5, 2016, 10:32:06 PM
I know.
If you have a big picture goal, on a timescale of over a year, you
may achieve it or you may not. If you've got a little goal, on a
timescale of a week or two weeks, again you may achieve it or you
may not. But the consequences of failure are different. If you
can't achieve a week's goal in a week, maybe you can achieve it in
three weeks. That's something and nothing in itself, but if it's
always happening then that tells you that you are over-ambitious
and the overall one-year deadline is actually going to be about three
years (which might well make the whole project unrealistic). If
you can't achieve a week's deadline even in three weeks, then maybe
you don't have the skills to implement that particular feature at
all, so it must be dropped or you must find someone who can tell
you how to set about it.

(I found this with the binary image library. I wanted to put it into
a better state for distribution. In fact documenting each component
took far longer than I expected. But it's being done bit by bit,
incrementally.)

Rick C. Hodgin

Oct 5, 2016, 10:43:29 PM
Ron Reedy, co-founder of Peregrine Semiconductor, said that
he's discovered that almost every project will take 3x as long as
expected. He actually thinks it's pi times as long. :-)

My goals include other people helping me. And I'll continue on in
my big picture goals until I complete the project alone, or I get help
from others, or I'm called out of this world.

My current goal is to have my assembler completed by the end
of this year, and my kernel assembling in it correctly. I'm currently
about a month behind, but I had work that needed done around
the house in the summer / fall months.

jacobnavia

Oct 6, 2016, 3:57:06 AM
Le 06/10/2016 à 03:03, Chris M. Thomasson a écrit :
> When do you think your compiler might be in a state that can allow for a
> beta test?

There isn't even a parser ready. We are discussing his vaporware because
if somebody behaves badly and spams this group with religious nonsense
he can only be given full attention: he is a religious nut, and should
not be misclassified like navia, that horrible guy that tried to promote
a compiler that wasn't vaporware at all.

I will do this and will do that. CAlive will do everything better than
msvc/gcc with their teams of hundreds of people.

In this newsgroup C programmers are scarce. The containers library is
ignored.

Well, *obviously* since it is not vaporware

Rick C. Hodgin

Oct 6, 2016, 7:54:26 AM
jacobnavia wrote:
> I will do this and will do that. CAlive will do everything better
> than msvc/gcc with their teams of hundreds of people.

Two things: (1) I think lcc-win is awesome. You have justifiable
bragging rights. (2) I pray to have thousands of developers contributing
to the Village Freedom Project, not just on the compiler, but in all
apps.

I want LibSF to be a God-fearing hardware and software base for
a completely new top-down, bottom-up computer-related
offering unto God and mankind.

mark.b...@gmail.com

Oct 6, 2016, 8:07:07 AM
On Thursday, 6 October 2016 12:54:26 UTC+1, Rick C. Hodgin wrote:

> I want LibSF to be a God-fearing hardware and software base for
> a completely new top-down, bottom-up computer-related
> offering unto God and mankind.

I look forward to hearing of the first user to host porn on a site powered by it...

Rick C. Hodgin

Oct 6, 2016, 8:28:45 AM
Other people have said similar things in the past. Here is my reply to
that line of thinking (updated to reflect the broader form I am now in
pursuit of):

https://groups.google.com/d/msg/publicesvfoxpro/ZVzZIcjQrY0/cKOlL1opcRwJ3

"I will do my best with [the Liberty Software Foundation]. [As our
software products are] completed, you will be free to use [all of] it.
And since I am offering the [products] in source code form under a
[type of Public Domain] license, you will also be free to take the
source code and modify it to suit [personal or] proprietary needs.
But it will not be [the] gift that is [being] used in that way, only
the perversion of the gift I am offering.

"And this is exactly what has happened here on this Earth with the gift
God gave unto mankind, and the perversion of that gift through sin by
the enemy [such that the great gifts of God given unto mankind, even
of the whole Earth and everything in and on it, are no longer being
used as they were created to be used or intended, but are now being
used for sinful purposes against God. God lets man continue on for a
time in that way, all the while warning about the dangers of doing so.

"He is patient such that He gives each person time to realize for
themselves that what they're doing is wrong. But the day is coming,
which is then THE LAST day. And on that day, all continuation of the
former sinful ways will no longer be tolerated, and the accounting
required by God for living that way under sin will be demanded]."

-----
It is right to pursue God and God's goals in one's life ... even if nobody
else will. God's ways will exist for eternity. And He calls for us to
follow Him even here upon the Earth ("Thy will be done, on Earth as it is
in Heaven").

Rick C. Hodgin

Oct 6, 2016, 8:32:16 AM
On Thursday, October 6, 2016 at 8:28:45 AM UTC-4, Rick C. Hodgin wrote:
> Other people have said similar things in the past. Here is my reply to
> that line of thinking (updated to reflect the broader form I am now in
> pursuit of):
>
> https://groups.google.com/d/msg/publicesvfoxpro/ZVzZIcjQrY0/cKOlL1opcRwJ3

Corrected link:

https://groups.google.com/d/msg/publicesvfoxpro/ZVzZIcjQrY0/cKOlL1opcRwJ

Anand Hariharan

Oct 7, 2016, 6:02:34 PM
On Tuesday, October 4, 2016 at 1:03:00 PM UTC-5, Rick C. Hodgin wrote:
> For me? Barring the obvious things like functionality and error-free binary
> code, it would have to be edit-and-continue. And without equal.
>
(...)
>
> -----
> As such, I view edit-and-continue as the single most desirable asset of
> any software developer. It is a fully primary area of focus for my CAlive
> compiler and IDE.
>

The thread has veered off into CAlive and other matters, but I think the subject of this thread viz., "Most important aspect of any compiler (C or otherwise)" is interesting.

What do folks consider as *Most* important feature of a compiler? (my emphasis)

I suppose standard conformance would come out top in this group?

Others might look at compilation times, link times (for C++, this is significant), performance of generated binaries, interop with binaries from other compilers, ...

To me, arguably the most important feature would be quality of diagnostics. Mundane errors such as an unmatched paren/squiggly, or a missing semi-colon in a header file make for the most interesting error messages. There are folks I admire (e.g., Leor Zolman) who write tools to parse and decrypt error messages from C++ compilers!

I consider features such as Edit-and-Continue to be among the feature-set of an IDE / debugger rather than a compiler. I suppose 'tool-support' would make it a compiler feature?

Thoughts?

- Anand

supe...@casperkitty.com

Oct 7, 2016, 7:17:41 PM
On Friday, October 7, 2016 at 5:02:34 PM UTC-5, Anand Hariharan wrote:
> What do folks consider as *Most* important feature of a compiler? (my emphasis)
>
> I suppose standard conformance would come out top in this group?

Predictable and logical support for features of the target platform.

Since it's possible for an implementation to be conforming and yet
incapable of running any useful programs, and since limited platforms
may be better served by a compiler that deviates from the Standard than
one which adheres to it perfectly, I don't see rigid standards conformance
as being particularly valuable in and of itself.

luser droog

Oct 7, 2016, 9:13:56 PM
It is possible, but that'd be stupid. So I doubt
anyone would set out to write a useless[1] compiler.

[1] not to be confused with the esolang entitled "useless". Here, the normal English word.

Jerry Stuckle

Oct 7, 2016, 9:15:09 PM
Your fault is in thinking it is impossible for an implementation to be
conforming yet incapable of running any useful programs.

That is a false conjecture, so the rest of your post is immaterial. It
is perfectly possible for an implementation to be conforming and produce
useful programs. In fact, if the implementation is strictly conforming,
it will be required to be able to produce useful programs.

Rick C. Hodgin

Oct 7, 2016, 9:31:28 PM
Anand Hariharan wrote:
> I suppose standard conformance would come out top in this group?

Not for me. But, I can see a need for standards compliance options.

> To me, arguably the most important feature would be quality
> of diagnostics.

Very important.

> I consider features such as Edit-and-Continue to be among the
> feature-set of an IDE / debugger rather than a compiler. I suppose
> 'tool-support' would make it a compiler feature?

EaC is definitely an IDE feature, but it is the compiler which
does the work.

James Kuyper

Oct 7, 2016, 11:53:53 PM
I've sometimes considered the idea of writing a fully conforming but
completely useless implementation of C, primarily for making the point
that the requirements for full conformance are a lot weaker than many
people think (including some of the people who had a role in creating
those requirements).
Creating a useless implementation would be a lot simpler than writing a
useful implementation, but (because #error directives have to be handled
correctly) it requires correctly implementing all of translation phases
1-4, which is sufficiently complicated to make the project interesting.

supe...@casperkitty.com

Oct 8, 2016, 4:03:01 AM
On Friday, October 7, 2016 at 8:15:09 PM UTC-5, Jerry Stuckle wrote:
> On 10/7/2016 7:17 PM, supercat wrote:
> > Since it's possible for an implementation to be conforming and yet
> > incapable of running any useful programs, and since limited platforms
> > may be better served by a compiler that deviates from the Standard than
> > one which adheres to it perfectly, I don't see rigid standards conformance
> > as being particularly valuable in and of itself.
>
> Your fault is in thinking it is impossible for an implementation to be
> conforming yet incapable of running any useful programs.

The Implementation Limits section of the Standard requires that for every
implementation there must exist at least one program which exercises all
of the implementation limits described in the Standard, and which that
implementation will process without UB. The rationale makes very clear
that the Standard does not require that an implementation be capable of
running all such programs, since a program which exercised all limitations
simultaneously would require over a megabyte of storage, and many computers
in 1989 weren't that big. The stated intention in the rationale is that
someone who is trying to make a useful implementation which doesn't get
tripped up by any of the limits in isolation will likely produce an
implementation that can accommodate them in useful combinations, but the
Standard doesn't actually mandate that an implementation be capable of
running more than one possibly-contrived program.

supe...@casperkitty.com

Oct 8, 2016, 4:12:45 AM
On Friday, October 7, 2016 at 10:53:53 PM UTC-5, James Kuyper wrote:
> Creating a useless implementation would be a lot simpler than writing a
> useful implementation, but (because #error directives have to be handled
> correctly) it requires correctly implementing all of translation phases
> 1-4, which is sufficiently complicated to make the project interesting.

Probably the simplest way to produce a useful implementation is to find a
program whose behavior could be achieved by a strictly-conforming program
that exercised all implementation limits, write such a strictly-conforming
program that behaves likewise, and then hack a compiler so that no matter
what code it's given it always outputs the first program mentioned above.

Feed the One Program into the compiler and it will produce an executable
which behaves as the Standard requires. Feed anything else into the
compiler and--oops--it exceeded an implementation limit in a fashion over
which the Standard imposes no requirements, so anything the program might
do (including behaving just like the first program) would be conforming.

Malcolm McLean

Oct 8, 2016, 5:18:08 AM
Very geeky thing to do, in the bad sense of the term.

Whilst the project isn't entirely trivial, it's not really "interesting",
the standard is just a human construct, it's subject to fairly regular
revision, and it's not designed for the purpose of preventing useless
but conforming implementations.

Jerry Stuckle

Oct 8, 2016, 6:05:14 AM
The Standard requires a conforming implementation must process all
conforming programs. If it does not, it is not a conforming implementation.

Hardware limitations such as memory limits are not part of the Standard.

Ben Bacarisse

Oct 8, 2016, 6:45:14 AM
Either you have that backwards or you are confusing a conforming program
with a /strictly/ conforming program. Section 4 defines the term
"conforming program":

7. A conforming program is one that is acceptable to a conforming
implementation.

The same section defines a conforming (hosted) implementation as one that
must accept any strictly conforming program.

So here's a counter-example -- a conforming program that a conforming
implementation is not obliged to accept:

int main(void) { return -1 >> 1, 0; }

> Hardware limitations such as memory limits are not part of the Standard.

--
Ben.

Kenny McCormack

Oct 8, 2016, 7:02:32 AM
In article <09e74b5b-93b0-411e...@googlegroups.com>,
Which is kinda the same thing as when people insist that the Bible is
literally true, exact in all its particulars, except, of course, when it
isn't, and/or when it either says something that is manifestly silly or
something that is inconvenient to the speaker's agenda.

This is all true, given the high regard and worshipful stance the regs here
have taken towards the standard.

--
The randomly chosen signature file that would have appeared here is more than 4
lines long. As such, it violates one or more Usenet RFCs. In order to remain in
compliance with said RFCs, the actual sig can be found at the following web address:
http://www.xmission.com/~gazelle/Sigs/Seneca

Rick C. Hodgin

Oct 8, 2016, 8:14:24 AM
Kenny McCormack, if you ever pass through Indianapolis, IN in your
travels, drop me an email. There's a well known deli here called Shapiro's.
We could meet there for a meal and you could come to know the real-life
Rick rather than the perceived Rick from an interpreted-through-static-
text-version-of-Rick online.

I think you'll find I am not quite like you perceive, and that your
perceptions of me are influenced by prior experiences you've had.
I am a basic person. I like Star Trek (all except JJ Abrams' reboot universe,
what was he thinking?), Keith Rucker's VintageMachinery.org channel
on YouTube, wooden boats (Tips From a Shipwright channel on
YouTube), playing music (guitar, piano), Blender 3D artwork and animations,
driving through the country, working on cars, doing crafts projects, watching
movies, learning about new products and technologies, etc.
I subject these interests to Jesus Christ, so I avoid unholy aspects
within them. I edit movies and TV shows to remove certain parts,
and I avoid things which are profane.

I am just like other people with diverse interests and abilities, but what
makes me different than some other people is that I have been saved,
and because of that I see things differently, as by God's guidance. It
stands in stark contrast to worldly ways because what I tell you is true:
There's an active enemy at work in this world deceiving man into
believing and following lies, and his power exists upon an influence
of the flesh because of sin. Once Jesus takes a person's sin away, his
power is broken and we are set free. God then takes it upon Himself
to begin teaching us everything we'll receive.

I think if we could meet you'd find I'm awkward in speech at first,
fumbling over what to say because my mind's racing on new encounters,
that I'm funny and witty, creative and expressive over time, and that I've
come to believe in something real, not a fantasy or imagination.

It just takes an honest pursuit of the truth to see and receive.

-----
I extend this offer to others as well. We have an International airport
and are a common hub for flights. If any of you would like to honestly
meet up sometime, I'd enjoy meeting you. You'd get to see the fragile
Rick of real life.

james...@verizon.net

Oct 8, 2016, 10:52:01 AM
On Saturday, October 8, 2016 at 6:05:14 AM UTC-4, Jerry Stuckle wrote:
> On 10/8/2016 4:02 AM, supe...@casperkitty.com wrote:
> > On Friday, October 7, 2016 at 8:15:09 PM UTC-5, Jerry Stuckle wrote:
> >> On 10/7/2016 7:17 PM, supercat wrote:
> >>> Since it's possible for an implementation to be conforming and yet
> >>> incapable of running any useful programs, ...
...
> >> Your fault is in thinking it is impossible for an implementation to be
> >> conforming yet incapable of running any useful programs.

You're responding to a paragraph in which he expressed, very clearly, the exact opposite of the thought you've just accused him of believing. He said "possible", while you said "impossible" - the wording is otherwise identical. I won't claim to know whether he's particularly honest or sane, but if you're going to accuse him of believing the exact opposite of what he's clearly said, you should provide some supporting evidence.

> > The Implementation Limits section of the Standard requires that for every
> > implementation there must exist at least one program which exercises all
> > of the implementation limits described in the Standard, and which that
> > implementation will process without UB. The rationale makes very clear
> > that the Standard does not require that an implementation be capable of
> > running all such programs, since a program which exercised all limitations
> > simultaneously would require over a megabyte of storage, and many computers
> > in 1989 weren't that big. The stated intention in the rationale is that
> > someone who is trying to make a useful implementation which doesn't get
> > tripped up by any of the limits in isolation will likely produce an
> > implementation that can accommodate them in useful combinations, but the
> > Standard doesn't actually mandate that an implementation be capable of
> > running more than one possibly-contrived program.
> >
>
> The Standard requires a conforming implementation must process all
> conforming programs. If it does not, it is not a conforming implementation.

The standard imposes no such requirement. The requirements which it does impose that bear the greatest similarity to that non-existent requirement are the following:

"The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits:" (5.2.4.1p1). That one program, which could be different for every implementation, is the only program for which any such requirement applies.

"A conforming hosted implementation shall accept any strictly conforming program." (4p6). The standard does not define the meaning of "accept" that applies in this context. Therefore, the definition is either provided by ISO/IEC 2382−1:1993 (3p1), or by ordinary English usage in the context of computer programming. ISO 2382 is too expensive for me to justify buying it. In all the years since I first talked about that issue, no one who does have a copy has bothered to posted a response citing a definition for "accept" from that standard. In ordinary English usage, there's nothing self-contradictory about the statement "I accepted his gift, but never opened it and never did anything with it". A fully conforming implementation which processes every strictly conforming program (except the "one program" - which need NOT be strictly conforming) by saying "program accepted" and doing nothing more with it.

The requirement that you think the standard imposes would not make any sense, because of the way the standard defines "conforming program". I suspect that you are under the impression that it gives a useful, sane definition for that term. It does not. To understand what the problem with the standard's definition is, let me start with 4p4:

"The implementation shall not successfully translate a preprocessing translation unit containing a #error preprocessing directive unless it is part of a group skipped by conditional inclusion.". This is the only circumstance under which the standard prohibits successful translation of a program. Therefore, a fully conforming implementation could successfully translate every program that does not contain a #error directive that survives conditional inclusion. If a diagnostic message is mandatory, it issues that message, and then continues processing the program. For those programs with undefined behavior, "this international standard imposes no requirements", so no matter what the final executable does, it still conforms to all of the applicable requirements (in this case, none) of the standard.

"A conforming program is one that is acceptable to a conforming implementation." (4p7). The implementation I described in the previous paragraph renders every program it accepts a conforming program. This definition, like many ridiculous definitions, was the result of a political compromise. That compromise has no actual negative consequences - while the standard defines the term "conforming program", it never makes any use of the term. In particular, the standard does NOT impose the requirement you describe in your first paragraph above.

> Hardware limitations such as memory limits are not part of the Standard.

The standard indirectly acknowledges the existence of limits like that: "Both the translation and execution environments constrain the implementation of language translators and libraries." (5.2.4p1). Most of the specific types of limits listed in 5.2.4.1 that exist in real-world compilers are indirectly the result of memory limits.

Rick C. Hodgin

Oct 8, 2016, 11:05:09 AM
James Kuyper, I don't know if you have me in a killfile filter or not,
but I wanted you to know I greatly enjoy your posts. They are very
clearly conveyed, to the point, and ripe with knowledge, wisdom, and
experience. They are also without artificial bias. They are peaceful, and
are among my favorite to read (along with Supercat's).

james...@verizon.net

Oct 8, 2016, 11:05:41 AM
On Saturday, October 8, 2016 at 4:12:45 AM UTC-4, supe...@casperkitty.com wrote:
> On Friday, October 7, 2016 at 10:53:53 PM UTC-5, James Kuyper wrote:
> > Creating a useless implementation would be a lot simpler than writing a
> > useful implementation, but (because #error directives have to be handled
> > correctly) it requires correctly implementing all of translation phases
> > 1-4, which is sufficiently complicated to make the project interesting.
>
> Probably the simplest way to produce a useful implementation is to find a
> program whose behavior could be achieved by a strictly-conforming program
> that exercised all implementation limits, write such a strictly-performing

Note: the "one program" is NOT required to be strictly conforming. If it exceeds any of those limits (which is permitted), it won't be strictly conforming. It can also fail to be strictly conforming for any other reason.

> program that behaves likewise, and then hack a compiler so that no matter
> what code it's given it always outputs the first program mentioned above.
>
> Feed the One Program into the compiler and it will produce an executable
> which behaves as the Standard requires. Feed anything else into the
> compiler and--oops--it exceeded an implementation limit in a fashion over
> which the Standard imposes no requirements, so anything the program might
> do (including behaving just like the first program) would be conforming.

That doesn't sound like a "useful" implementation to me - I was talking about a "useless" implementation - did you accidentally reverse the term? The "one program" could be functionally equivalent to "int main(void) {}", which would somewhat simplify the implementation compared to the one you describe above, if the objective is to create a useless one.

I was specifically talking about a fully conforming but useless implementation. A conforming implementation must not successfully complete translation of a program containing a #error directive that survives conditional compilation - yours does.
"A conforming implementation shall produce at least one diagnostic message (identified in an implementation-defined manner) if a preprocessing translation unit or translation unit contains a violation of any syntax rule or constraint, even if the behavior is also explicitly specified as undefined or implementation-defined." (5.1.1.3p1). It would be trivial to meet this requirement by always issuing exactly one diagnostic message: "Your program may contain errors". Your implementation does not meet this requirement.

james...@verizon.net

Oct 8, 2016, 11:09:21 AM
On Saturday, October 8, 2016 at 5:18:08 AM UTC-4, Malcolm McLean wrote:
> On Saturday, October 8, 2016 at 4:53:53 AM UTC+1, James Kuyper wrote:
> > On 10/07/2016 09:11 PM, luser droog wrote:
> >
> > I've sometimes considered the idea of writing a fully conforming but
> > completely useless implementation of C, primarily for making the point
> > that the requirements for full conformance are a lot weaker than many
> > people think (including some of the people who had a role in creating
> > those requirements).
> > Creating a useless implementation would be a lot simpler than writing a
> > useful implementation, but (because #error directives have to be handled
> > correctly) it requires correctly implementing all of translation phases
> > 1-4, which is sufficiently complicated to make the project interesting.
> >
> Very geeky thing to do, in the bad sense of the term.
>
> Whilst the project isn't entirely trivial, it's not really "interesting",

To me, it would be. I've never implemented a compiler, and don't have the time I'd need to implement a useful one. I don't have the time I'd need to implement a useless one, either - but it comes closer to being something I could find time for - purely as a recreational activity, of course.

supe...@casperkitty.com

Oct 8, 2016, 1:16:07 PM
On Saturday, October 8, 2016 at 10:05:41 AM UTC-5, james...@verizon.net wrote:
> On Saturday, October 8, 2016 at 4:12:45 AM UTC-4, supe...@casperkitty.com wrote:
> > Probably the simplest way to produce a useful implementation is to find a
> > program whose behavior could be achieved by a strictly-conforming program
> > that exercised all implementation limits, write such a strictly-conforming
>
> Note: the "one program" is NOT required to be strictly conforming. If it exceeds any of those limits (which is permitted), it won't be strictly conforming. It can also fail to be strictly conforming for any other reason.

I shouldn't have said "simplest", since one could simplify things further by
taking advantage of the fact that the Standard says even less than I've
implied.

On the other hand, if a program isn't strictly conforming, it might be
ambiguous whether it tests all the implementation limits. For example, a
source text containing two underscores, a linefeed, and nothing else, could
be regarded as a conforming C program which does anything whatsoever. One
could thus declare by fiat that such a source text exercises all the
implementation limits, but that would render the limits even more meaningless
than they otherwise would be. On the other hand, such a reading would let
one get around the #error requirement by saying that, with a few specific
exceptions, inputs are limited to six characters per line, and that exceeding
that limit will invoke UB.

> > Feed the One Program into the compiler and it will produce an executable
> > which behaves as the Standard requires. Feed anything else into the
> > compiler and--oops--it exceeded an implementation limit in a fashion over
> > which the Standard imposes no requirements, so anything the program might
> > do (including behaving just like the first program) would be conforming.
>
> That doesn't sound like a "useful" implementation to me - I was talking about a "useless" implementation - did you accidentally reverse the term? The "one program" could be functionally equivalent to "int main(void) {}", which would somewhat simplify the implementation compared to the one you describe above, if the objective is to create a useless one.

Knowing that a program *won't* do anything is sometimes a useful guarantee.
My point is that the Standard doesn't say anything meaningful about what any
particular program might or might not do.

> I was specifically talking about a fully conforming but useless implementation. A conforming implementation must not successfully complete translation of a program containing a #error directive that survives conditional compilation - yours does.

My original intention was for the simplest "cross-compiler" for a
particular target system, using any program that produces any predictable
output on the target and hacking any compiler for any system so it
would output the target system's One Program regardless of what else
happened. I then thought cross-compilation issues made things too
complicated, and edited them out, but that blurred the point that one
would start with an existing implementation (or at least preprocessor)
for some system but wouldn't need a useful code generator.

If one were trying to make a minimal preprocessor it might be useful to
keep for every macro a list of all the definitions it's ever been given,
and have any #error directive expand as the Cartesian product of all macros
contained therein. Doing that would eliminate the need to process any
#if directives or expressions contained therein. I don't think there's any
requirement that #include work with anything other than standard library
headers, since there is no requirement that an implementation include any
sort of file system.

> "A conforming implementation shall produce at least one diagnostic message (identified in an implementation-defined manner) if a preprocessing translation unit or translation unit contains a violation of any syntax rule or constraint, even if the behavior is also explicitly specified as undefined or implementation-defined." (5.1.1.3p1). It would be trivial to meet this requirement by always issuing exactly one diagnostic message: "Your program may contain errors". Your implementation does not meet this requirement.

My intention was that one hack an existing implementation, using the existing
implementation for the #error directive. Were it not for that requirement,
a program that blindly output "This is a diagnostic." would suffice.

IMHO, the Standard would be much more useful if most tasks could be coded
in such a way (I'd call it "selectively conforming") that
conforming implementations would be required to either process them without
UB or indicate, in an implementation-defined fashion, a refusal to do so.
To be useful, most programs which use recursion would need to indicate for
at least some recursive functions the maximum number of times they would be
nested (a compiler could then determine an upper bound on stack utilization
or indicate that it cannot guarantee UB-free operation).

Even if a particular program be rejected by 98% of implementations, a
guarantee that it will work correctly on the 2% that do accept it may be
more useful than a vague statement that it will likely work on 98% of
implementations, but without any guarantees that it will be reliable on
any particular implementation.

Further, implementations that can't meet all the Implementation Limits
could be useful for running programs which have smaller requirements, if
those implementations could promise to reject, without UB, any program that
exceeded their abilities. I've found almost-standard C to be a useful
language when programming a variety of machines with 256 bytes or less of
RAM (typically with a non-volatile code store in the 2K-8K range); while
such devices could not comply with the Implementation Limits section in
the fashion the authors intended, many useful programs will in fact fit in
such devices, and a program which successfully builds can be guaranteed not
to run out of memory during execution.

Malcolm McLean

unread,
Oct 8, 2016, 3:20:40 PM10/8/16
to
On Saturday, October 8, 2016 at 12:02:32 PM UTC+1, Kenny McCormack wrote:
> In article <09e74b5b-93b0-411e...@googlegroups.com>,
> Malcolm McLean <malcolm...@btinternet.com> wrote:
>
> >Whilst the project isn't entirely trivial, it's not really "interesting",
> >the standard is just a human construct, it's subject to fairly regular
> >revision, and it's not designed for the purpose of preventing useless
> >but conforming implementations.
>
> Which is kinda the same thing as when people insist that the Bible is
> literally true, exact in all its particulars, except, of course, when it
> isn't, and/or when it either says something that is manifestly silly or
> something that is inconvenient to the speaker's agenda.
>
> This all true, given the high regard and worshipful stance the regs here
> have taken towards the standard.
>
Hmm.

Maybe you should have a look at the Jewish exegete, Rashi. Rashi
attached great importance to the fact that an early line of the Bible,
usually translated

"And it was even and it was morning, the first day"

is in Hebrew

vayehi erev vayehi voker, yom echad.

"And it was even and it was morning, a single day"
(or maybe "day one", "a day", but not "the first day" which
would be "yom rishon").

one day: According to the sequence of the language of the chapter,
it should have been written, “the first day,” as it is written
regarding the other days, “second, third, fourth.” Why did Scripture
write “one”? Because the Holy One, blessed be He, was the only one
in His world, for the angels were not created until the second day.
[i.e., יוֹם אֶחָד is understood as ‘the day of the only One’] So is it
explained in Genesis Rabbah (3:8).

Virtually all the commentary is like that, it picks up little
bits of slightly surprising grammar and assigns mystical significance
to them.

So is Rashi barking up totally the wrong tree, or is he right?
How would you decide? And would you apply a similar analysis to
the C standard?


Jerry Stuckle

unread,
Oct 8, 2016, 4:57:08 PM10/8/16
to
No confusion at all.

> 7. A conforming program is one that is acceptable to a conforming
> implementation.
>
> The same section defines a conforming (hosted) implementation as one that
> must accept any strictly conforming program.
>
> So here's a counter-example -- a conforming program that a conforming
> implementation is not obliged to accept:
>
> int main(void) { return -1 >> 1, 0; }
>

I never said it was obliged to accept it.

>> Hardware limitations such as memory limits are not part of the Standard.
>



--

Jerry Stuckle

unread,
Oct 8, 2016, 4:58:42 PM10/8/16
to
On 10/8/2016 10:49 AM, james...@verizon.net wrote:
> On Saturday, October 8, 2016 at 6:05:14 AM UTC-4, Jerry Stuckle wrote:
>> On 10/8/2016 4:02 AM, supe...@casperkitty.com wrote:
>>> On Friday, October 7, 2016 at 8:15:09 PM UTC-5, Jerry Stuckle wrote:
>>>> On 10/7/2016 7:17 PM, supercat wrote:
>>>>> Since it's possible for an implementation to be conforming and yet
>>>>> incapable of running any useful programs, ...
> ...
>>>> Your fault is in thinking it is impossible for an implementation to be
>>>> conforming yet incapable of running any useful programs.
>
> You're responding to a paragraph in which he expressed, very clearly, the exact opposite of the thought you've just accused him of believing. He said "possible", while you said "impossible" - the wording is otherwise identical. I won't claim to know whether he's particularly honest or sane, but if you're going to accuse him of believing the exact opposite of what he's clearly said, you should provide some supporting evidence.
>

Not at all. But I already know you have problems with reading
comprehension.

>>> The Implementation Limits section of the Standard requires that for every
>>> implementation there must exist at least one program which exercises all
>>> of the implementation limits described in the Standard, and which that
>>> implementation will process without UB. The rationale makes very clear
>>> that the Standard does not require that an implementation be capable of
>>> running all such programs, since a program which exercised all limitations
>>> simultaneously would require over a megabyte of storage, and many computers
>>> in 1989 weren't that big. The stated intention in the rationale is that
>>> someone who is trying to make a useful implementation which doesn't get
>>> tripped up by any of the limits in isolation will likely produce an
>>> implementation that can accommodate them in useful combinations, but the
>>> Standard doesn't actually mandate that an implementation be capable of
>>> running more than one possibly-contrived program.
>>>
>>
>> The Standard requires a conforming implementation must process all
>> conforming programs. If it does not, it is not a conforming implementation.
>
> The standard imposes no such requirement. The requirements which it does impose that bear the greatest similarity to that non-existent requirement are the following:
>
> "The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits:" (5.2.4.1p1). That one program, which could be different for every implementation, is the only program for which any such requirement applies.
>
> "A conforming hosted implementation shall accept any strictly conforming program." (4p6). The standard does not define the meaning of "accept" that applies in this context. Therefore, the definition is either provided by ISO/IEC 2382−1:1993 (3p1), or by ordinary English usage in the context of computer programming. ISO 2382 is too expensive for me to justify buying it. In all the years since I first talked about that issue, no one who does have a copy has bothered to post a response citing a definition for "accept" from that standard. In ordinary English usage, there's nothing self-contradictory about the statement "I accepted his gift, but never opened it and never did anything with it". A fully conforming implementation could therefore process every strictly conforming program (except the "one program" - which need NOT be strictly conforming) by saying "program accepted" and doing nothing more with it.
>
> The requirement that you think the standard imposes would not make any sense, because of the way the standard defines "conforming program". I suspect that you are under the impression that it gives a useful, sane definition for that term. It does not. To understand what the problem with the standard's definition is, let me start with 4p4:
>
> "The implementation shall not successfully translate a preprocessing translation unit containing a #error preprocessing directive unless it is part of a group skipped by conditional inclusion.". This is the only circumstance under which the standard prohibits successful translation of a program. Therefore, a fully conforming implementation could successfully translate every program that does not contain a #error directive that survives conditional inclusion. If a diagnostic message is mandatory, it issues that message, and then continues processing the program. For those programs with undefined behavior, "this international standard imposes no requirements", so no matter what the final executable does, it still conforms to all of the applicable requirements (in this case, none) of the standard.
>
> "A conforming program is one that is acceptable to a conforming implementation." (4p7). The implementation I described in the previous paragraph renders every program it accepts a conforming program. This definition, like many ridiculous definitions, was the result of a political compromise. That compromise has no actual negative consequences - while the standard defines the term "conforming program", it never makes any use of the term. In particular, the standard does NOT impose the requirement you describe in your first paragraph above.
>
>> Hardware limitations such as memory limits are not part of the Standard.
>
> The standard indirectly acknowledges the existence of limits like that: "Both the translation and execution environments constrain the implementation of
> language translators and libraries." (5.2.4p1). Most of the specific types of limits listed in 5.2.4.1 that exist in real-world compilers are indirectly the result of memory limits.
>

You also seem to have reading comprehension problems with the standard.

james...@verizon.net

unread,
Oct 8, 2016, 5:42:12 PM10/8/16
to
On Saturday, October 8, 2016 at 4:58:42 PM UTC-4, Jerry Stuckle wrote:
...
> Not at all. But I already know you have problems with reading
> comprehension.
...
> You also seem to have reading comprehension problems with the standard.

You routinely respond to reasoned arguments solidly supported with valid citations from the standard by not bothering to address the points raised and not challenging the citations as invalid; you almost never bother to make relevant citations of your own. You simply insult the intelligence of the person making the argument.
I thought you should know - you're not really fooling many people this way. It's perfectly obvious that you don't address any of the points raised because you have no arguments that can address them. You don't cite parts of the standard to support your point of view because there are no such parts. You don't even bother to explain what it is you think I've misunderstood - not because you don't deign to, but simply because you can't.

Keith Thompson

unread,
Oct 8, 2016, 5:49:10 PM10/8/16
to
James Kuyper <james...@verizon.net> writes:
[...]
> I've sometimes considered the idea of writing a fully conforming but
> completely useless implementation of C, primarily for making the point
> that the requirements for full conformance are a lot weaker than many
> people think (including some of the people who had a role in creating
> those requirements).
> Creating a useless implementation would be a lot simpler than writing a
> useful implementation, but (because #error directives have to be handled
> correctly) it requires correctly implementing all of translation phases
> 1-4, which is sufficiently complicated to make the project interesting.

I've thought about doing the same thing.

You don't need to implement phases 1-4 -- just re-use an existing
preprocessor.

--
Keith Thompson (The_Other_Keith) ks...@mib.org <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

Keith Thompson

unread,
Oct 8, 2016, 6:07:24 PM10/8/16
to
James, I think you misspelled "plonk".

supe...@casperkitty.com

unread,
Oct 8, 2016, 6:32:00 PM10/8/16
to
On Saturday, October 8, 2016 at 2:20:40 PM UTC-5, Malcolm McLean wrote:
> "And it was even and it was morning, a single day"

> one day: According to the sequence of the language of the chapter,
> it should have been written, “the first day,” as it is written
> regarding the other days, “second, third, fourth.” Why did Scripture
> write“one” ?

By what name was Queen Elizabeth I generally called during her lifetime, or
indeed at any time prior to the coronation of Queen Elizabeth II? While I
hadn't realized just how far back the linguistic convention of calling
things "first" only when there is a "second" goes, I don't see any reason
to attach any significance beyond that.

> And would you apply a similar analysis to the C standard?

I think the proper way to read the C89 Standard is to look at what problems
the authors of the Standard were trying to solve, and what they did to solve
those problems. Prior to C89, the normal interpretation of "C" would be
"A language which behaves the way Dennis Ritchie's C compiler would behave,
adapted to the extent necessary for any particular target platform".

On platforms like common microcomputers whose natural semantics were very
close to those of the PDP-11, C compilers were expected to behave similarly.
On platforms with things like ones'-complement arithmetic, however, things
were far less clear. If a machine's natural arithmetic instructions operate
mod 262,143, for example, should "unsigned int" be an 18-bit type which wraps
with an odd modulus, or should it be a 17-bit type with a bit of padding?
Either approach would be reasonable, but if some C compilers used one and
some used the other, it would be impossible for such a machine to meaningfully
run "standard C".

Consequently, the C89 Standard focuses on issues where it would be plausible
that an implementer facing a choice without the Standard might decide to do
something contrary to what the authors of the Standard felt should be
expected (e.g. having an unsigned types whose modulus is not a power of two).
There was no reason to mandate that on two's-complement machines the expression
x<<y should multiply x by a power of 2 whenever the arithmetically-correct
result would be representable (whether x is positive or negative) given that
every single two's-complement implementation ever made had been doing that
and there was no reason to expect one to do otherwise.

I don't know what roles were played by the members of C89 who were absent
in C99, but the authors of C99 seem to have presumed that anything which was
not mandated in C89 was never a useful feature of the language, even if 99%
or for that matter 100% of implementations to date had behaved the same way.
Many aspects of the Standard would need to be reworded to support such a
reading, but such rewording has never taken place.

Malcolm McLean

unread,
Oct 8, 2016, 6:58:15 PM10/8/16
to
On Saturday, October 8, 2016 at 11:32:00 PM UTC+1, supe...@casperkitty.com wrote:
> On Saturday, October 8, 2016 at 2:20:40 PM UTC-5, Malcolm McLean wrote:
> > "And it was even and it was morning, a single day"
>
> > one day: According to the sequence of the language of the chapter,
> > it should have been written, “the first day,” as it is written
> > regarding the other days, “second, third, fourth.” Why did Scripture
> > write“one” ?
>
> By what name was Queen Elizabeth I generally called during her lifetime,
> or indeed at any time prior to the coronation of Queen Elizabeth II?
>

TO
THE MOST HIGH,
MIGHTIE
and
MAGNIFICENT
EMPRESSE RENOVV-
MED FOR PIETIE, VER-
TVE, AND ALL GRATIOVS
GOVERNMENT ELIZABETH BY
THE GRACE OF GOD QVEENE
OF ENGLAND FRAVNCE AND
IRELAND AND OF VIRGI-
NIA, DEFENDOVR OF THE
FAITH, &. HER MOST
HVMBLE SERVANT
EDMVND SPENSER
DOTH IN ALL HV-
MILITIE DEDI-
CATE, PRE-
SENT
AND CONSECRATE THESE
HIS LABOVRS TO LIVE
VVITH THE ETERNI-
TIE OF HER
FAME.

> While I hadn't realized just how far back the linguistic convention
> of calling things "first" only when there is a "second" goes, I don't
> see any reason to attach any significance beyond that.
>
It is a bit odd, however. Most English translations have "the first
day". You're right that when it was created, there was no second,
so it was not then the "first".
>
> > And would you apply a similar analysis to the C standard?
>
> I think the proper way to read the C89 Standard is to look at
> what problems the authors of the Standard were trying to solve,
> and what they did to solve those problems. Prior to C89, the
> normal interpretation of "C" would be
> "A language which behaves the way Dennis Ritchie's C compiler
> would behave, adapted to the extent necessary for any particular
> target platform".
>
That's how languages usually start of course. You've got an informal
specification or description of what you want to achieve, knock
it up, then the behaviour of the compiler becomes the language
definition. That's why you often get slightly different dialects, it's
always easier to write your own than to make sure you conform to
someone else's spec.
>
> On platforms like common microcomputers whose natural semantics
> were very close to those of the PDP-11, C compilers were expected
> to behave similarly.
> On platforms with things like ones'-complement arithmetic, however,
> things were far less clear. If a machine's natural arithmetic
> instructions operate mod 262,143, for example, should "unsigned int"
> be an 18-bit type which wraps with an odd modulus, or should it be a
> 17-bit type with a bit of padding?
> Either approach would be reasonable, but if some C compilers used
> one and some used the other, it would be impossible for such a machine
> to meaningfully run "standard C".
>
> Consequently, the C89 Standard focuses on issues where it would
> be plausible that an implementer facing a choice without the Standard
> might decide to do something contrary to what the authors of the Standard
> felt should be expected (e.g. having an unsigned types whose modulus
> is not a power of two).
> There was no reason to mandate that on two's-complement machines the
> expression
> x<<y should multiply x by a power of 2 whenever the arithmetically-correct
> result would be representable (whether x is positive or negative)
> given that every single two's-complement implementation ever made had
> been doing that and there was no reason to expect one to do otherwise.
>
The problem is that you can't really legislate for new technology
which hasn't been invented yet. No modern computer has analogue
registers, for example, but there's no real reason to think that
they might not become common at some future date, if electronics
goes in that direction. Even something as simple as the introduction
of 64 bit machines caused a problem - previously int was the "natural
register size", which meant 16 bits if you had 64k memory/segments
and 32 bits if you had 4Gb. But on a 64-bit machine, most integer
values are still small, and the code to load a half word is often
more efficient than the code to load a word. So what's the natural
size now?


Jerry Stuckle

unread,
Oct 8, 2016, 7:28:02 PM10/8/16
to
Yes, I've already learned there are too many language lawyers in this
newsgroup who quote the Standard but have absolutely no idea what it
means. It is not worth my time to try to teach the pigs to sing.

Malcolm McLean

unread,
Oct 8, 2016, 7:37:47 PM10/8/16
to
On Sunday, October 9, 2016 at 12:28:02 AM UTC+1, Jerry Stuckle wrote:
>
> Yes, I've already learned there are too many language lawyers in this
> newsgroup who quote the Standard but have absolutely no idea what it
> means. It is not worth my time to try to teach the pigs to sing.
>
Just don't engage in discussions that don't interest you.

I seldom contribute to standards exegesis type discussions. It
isn't something that I've got much to say about. But I've no
objection to other people doing so.

Ben Bacarisse

unread,
Oct 8, 2016, 8:25:29 PM10/8/16
to
Jerry Stuckle <jstu...@attglobal.net> writes:

> On 10/8/2016 6:45 AM, Ben Bacarisse wrote:
>> Jerry Stuckle <jstu...@attglobal.net> writes:
<snip>
>>> The Standard requires a conforming implementation must process all
>>> conforming programs. If it does not, it is not a conforming
>>> implementation.
>>
>> Either you have that backwards or you are confusing a conforming program
>> with a /strictly/ conforming program. Section 4 defines the term
>> "conforming program":
>>
>
> No confusion at all.
>
>> 7. A conforming program is one that is acceptable to a conforming
>> implementation.
>>
>> The same section defines a conforming (hosted) implementation as one that
>> must accept any strictly conforming program.
>>
>> So here's a counter-example -- a conforming program that a conforming
>> implementation is not obliged to accept:
>>
>> int main(void) { return -1 >> 1, 0; }
>>
>
> I never said it was obliged to accept it.

You said the standard requires it to be processed. Can you tell me what
that means?

--
Ben.

supe...@casperkitty.com

unread,
Oct 8, 2016, 8:46:57 PM10/8/16
to
On Saturday, October 8, 2016 at 5:58:15 PM UTC-5, Malcolm McLean wrote:
> On Saturday, October 8, 2016 at 11:32:00 PM UTC+1, supercat wrote:
> > I think the proper way to read the C89 Standard is to look at
> > what problems the authors of the Standard were trying to solve,
> > and what they did to solve those problems. Prior to C89, the
> > normal interpretation of "C" would be
> > "A language which behaves the way Dennis Ritchie's C compiler
> > would behave, adapted to the extent necessary for any particular
> > target platform".
> >
> That's how languages usually start of course. You've got an informal
> specification or description of what you want to achieve, knock
> it up, then the behaviour of the compiler becomes the language
> definition. That's why you often get slightly different dialects, it's
> always easier to write your own than to make sure you conform to
> someone else's spec.

The main sources of problems were either with (1) platforms where it wasn't
clear how things should behave, or (2) features which got added, in slightly
different forms, in different implementations (the most common problems being
with standard library functions inhabiting different headers in different
implementations). For aspects of the original which coincided with target
hardware features there really wasn't much ambiguity.

> The problem is that you can't really legislate for new technology
> which hasn't been invented yet. No modern computer has analogue
> registers, for example, but there's no real reason to think that
> they might not become common at some future date, if electronics
> goes in that direction. Even something as simple as the introduction
> of 64 bit machines caused a problem - previously int was the "natural
> register size", which meant 16 bits if you had 64k memory /segments
> and 32 bits if you had 4Gb. But on a 64 bit machine, most integers
> are very low, and the code to load a half word is often more efficient
> than the code to load a word. So what's the natural size now?

If the Standard describes an optional-but-recommended feature which becomes
expensive to support, it can prescribe a replacement which is better but
cheaper and deprecate the earlier form. Code which needs the feature can
then be migrated to the new form in a clear and orderly fashion.

If the Standard instead pretends that the feature does not exist, then
implementations which want to provide the functionality but not in the
expensive fashion will have to formulate their own ways of supporting
it--thus negating the purpose of having a standard in the first place.

Jerry Stuckle

unread,
Oct 8, 2016, 9:29:28 PM10/8/16
to
Sorry, Ben, if you don't know, I'm not going to continue. I've already
learned not to try to discuss the standards with you. It's a complete
waste of time, just like teaching a pig to sing.

Jerry Stuckle

unread,
Oct 8, 2016, 9:30:48 PM10/8/16
to
There are some things which interest me. However, I've learned not to
try to discuss standards with certain people in this newsgroup. It's
like trying to teach a pig to sing. It's a complete waste of time.

James R. Kuyper

unread,
Oct 9, 2016, 11:12:43 AM10/9/16
to
On 10/08/2016 05:49 PM, Keith Thompson wrote:
> James Kuyper <james...@verizon.net> writes:
> [...]
>> I've sometimes considered the idea of writing a fully conforming but
>> completely useless implementation of C, primarily for making the point
>> that the requirements for full conformance are a lot weaker than many
>> people think (including some of the people who had a role in creating
>> those requirements).
>> Creating a useless implementation would be a lot simpler than writing a
>> useful implementation, but (because #error directives have to be handled
>> correctly) it requires correctly implementing all of translation phases
>> 1-4, which is sufficiently complicated to make the project interesting.
>
> I've thought about doing the same thing.
>
> You don't need to implement phases 1-4 -- just re-use an existing
> preprocessor.

What I was thinking about was designing one from scratch; I'd learn more
that way. Reusing an existing preprocessor would render the exercise too
simple to be of interest - but it would have the advantage of making the
project small enough that I might plausibly someday be able to find time
to do it.
