
Any chance of asking Knuth to stop the TeX feature-freeze?


Karl-Heinz Zimmer

Jun 5, 2004, 8:24:14 AM
Hi,

having learned that Knuth, as TeX's inventor and current maintainer,
has been following a 'feature-freeze' policy for some years and only
accepts a very limited number of minor adjustments to TeX, I am
wondering if there might be any chance of asking him to stop this
freeze and let the TeX project return to normal development again.

From other big projects (like e.g. KDE) I know about the importance
of feature freezes, of course, but I wonder if there has ever been a
freeze of such long duration as the one the TeX project is
experiencing at the moment.

Of course I know that the maintainer has every _right_ to do so, but
this does not mean his decisions have to be loved by everybody.

Great projects like PDFTeX and others show that there still _are_
skilled people contributing to provide answers for the TeX-using
world, but these attempts (even if they produce GOOD things) are
still sub-optimal.

Optimal would be one of these:

A) Knuth stops the feature-freeze, resumes normal maintainership.

B) Knuth hands over TeX maintainership to another person who is
willing to actively maintain the project.

If one of these could be achieved, then things like full Unicode
support or optical alignment of the right margin could be added to
TeX, where they (IMHO) belong.

Karl-Heinz
--
Karl-Heinz Zimmer, Föhren <mailto:k...@indeview.org> <mailto:k...@kde.org>
IndeView: Presentations Beyond Limitations (www.indeview.org)
KDE: Conquer your Desktop (www.kde.org)
www.fiehr.de

Torsten Bronger

Jun 5, 2004, 8:29:10 AM
Hi there!

Karl-Heinz Zimmer <k...@kde.org> writes:

> [...]
>
> Optimal would be one of these:
>
> A) Knuth stops the feature-freeze, resumes normal maintainership.
>
> B) Knuth hands over TeX maintainership to another person who is
> willing to actively maintain the project.

The TeX community can live quite well with the fact that actively
developed TeX derivatives simply have other names.

Bye,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus

David Kastrup

Jun 5, 2004, 8:58:40 AM
Karl-Heinz Zimmer <k...@kde.org> writes:

> having learned that Knuth, as TeX's inventor and current maintainer,

Oh good grief. After your misinformation campaign in
de.comp.text.tex, you now move straight on to comp.text.tex to keep
talking the same nonsense here.

Knuth is not the "current maintainer". Knuth is the author of TeX.
He is not maintaining TeX at all.

> has been following a 'feature-freeze' policy for some years

20 years. That's not "some". There was a very limited update
(introduction of 8-bit characters, multiple hyphenation patterns and
some other minor-scope things) 14 years ago. 14 years is still not
"some".

But the 3.0 changes were not so much "development" as "additions".
They were absolutely not structural changes to TeX. So actual
_development_ stopped 20 years ago.

Get over it, for heaven's sake. Development for the software called
"TeX" is dead. Deal with it.

> and only accepts a very limited number of minor adjustments to TeX

Wrong. He accepts _no_ adjustments at all. He accepts _bug_ _fixes_
only, and currently does not even look at them more often than every
5 years.

"TeX" development is dead. It has been explicitly announced to be
such in 1984, and the changes in 1990 were done only so that Knuth
could leave TeX dead with less of a bad aftertaste.

> I am wondering if there might be any chance of asking him to stop
> this freeze and let the TeX project return to normal development
> again.

You are a complete lunatic. There is no such thing as "the TeX
project" outside of Knuth, and there never has been "normal
development". There was university-controlled development in the time
before 1978, and this continued, under tight supervision and the
complete control and main authorship of Knuth, until 1984, when it
was finished. There _never_ _ever_ was such a thing as a publicly
ongoing "TeX project". It was a one-man project (of course with
students and helpers).

> From other big projects (like e.g. KDE) I know about the importance
> of feature freezes, of course, but I wonder if there has ever been a
> freeze of such long duration as the one the TeX project is
> experiencing at the moment.

Look, you are making yourself really, really ridiculous. Please read
up on Knuth's statements about TeX, in particular those given in 1984.
The "TeX project" is not experiencing a "freeze of long duration".
"TeX" development is _dead_. This is _definitive_, from the author of
TeX, and it has been for decades.

Deal with it.

> Of course I know that the maintainer has every _right_ to do so, but
> this does not mean his decisions have to be loved by everybody.

Knuth is irrelevant. If you want to have something different from
his final TeX, then develop something under a different name. That's
what happens with projects like Omega, eTeX, ExTeX, PDFTeX, Aleph,
NTS and so on.

The name "TeX" itself is reserved for the final Knuth version.

> Great projects like PDFTeX and others show that there still _are_
> skilled people contributing to provide answers for the TeX-using
> world, but these attempts (even if they produce GOOD things) are
> still sub-optimal.
>
> Optimal would be one of these:
>
> A) Knuth stops the feature-freeze, resumes normal maintainership.

You are a freaking lunatic. Knuth is retired and has pressing
business on his hands (TAOCP), and TeX is well-defined and has been
so for decades. There is absolutely no sense in Knuth meddling with
development again.

> B) Knuth hands over TeX maintainership to another person who is
> willing to actively maintain the project.

There is nothing to hand over. TeX is public domain software.
Whoever wants to continue to develop TeX, already _has_ TeX available.

> If one of these could be achieved, then things like full Unicode
> support or optical alignment of the right margin could be added to
> TeX, where they (IMHO) belong.

They have been added to TeX where they belong. You can get them
under the names of Aleph or Omega, or PDFTeX.

I'd like to see a PDFeOmega at some point, sure. But _Knuth_ is
the last person on Earth to get involved with _that_.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
UKTUG FAQ: <URL:http://www.tex.ac.uk/cgi-bin/texfaq2html>

Frank Mittelbach

Jun 5, 2004, 9:11:49 AM
Karl-Heinz

> having learned that Knuth, as TeX's inventor and current maintainer,
> has been following a 'feature-freeze' policy for some years and only
> accepts a very limited number of minor adjustments to TeX, I am
> wondering if there might be any chance of asking him to stop this
> freeze and let the TeX project return to normal development again.

i think you completely misunderstand the situation.

"TeX-like" development is not dead at all; there are people working on it in
several directions and that is good, eg eTeX, pdftex, omega, nts and its
offsprings etc.

what has ended is Don's involvement in the matter, and the only thing
that is frozen is that TeX and Metafont as names are supposed to
refer to two very precisely defined programs with a very precisely
defined set of features

see http://www.tug.org/TUGboat/Articles/tb10-3/tb25knut.pdf


it is nonsense to require that the name TeX must be attached to a
moving, maintained system in order to make progress; it is equally
good, or in fact better, if it stays frozen and the community moves
forward to choose a successor as the formatter for day-to-day
operations one day.

in fact, on a small scale, this is already happening. the LaTeX
project has announced that they want eTeX features as part of the
underlying formatter for LaTeX, and as a result TeX installations are
nowadays starting to use eTeX instead of TeX. in fact there is a move
towards pdftex, as far as i know.


> From other big projects (like e.g. KDE) I know about the importance
> of feature freezes, of course, but I wonder if there has ever been a
> freeze of such long duration as the one the TeX project is
> experiencing at the moment.

there is a big difference between something like KDE and something
like TeX. TeX and LaTeX are exchange formats defining languages, and
stability is an important factor, far more important than with
something like kde, where it is absolutely irrelevant that i still
run kde3.0 beta and you run whatever kde version, because they can
make us both happy and we can still communicate.

on the other hand, if i sent you a document relying on features in
version X of TeX but you only had Y installed, it would fall over and
we couldn't communicate any more. and that is what would happen if
the worldwide production system were an actively modified program.

of course you can have that: just choose any of the TeX offspring
and ensure that it gets reasonably many updates and changes. while
that is fine if all you are interested in is improving quality
typesetting for yourself, it would be deadly for the exchangeability
of TeX sources.

my two cents

frank

Karl-Heinz Zimmer

Jun 5, 2004, 9:31:14 AM
On Saturday, 5 June 2004, at 14:58, David Kastrup wrote:

> Karl-Heinz Zimmer <k...@kde.org> writes:
>
>> having learned that Knuth, as TeX's inventor and current maintainer,
>
> Oh good grief. After your misinformation campaign in
> de.comp.text.tex, you now move straight on to comp.text.tex to keep
> talking the same nonsense here.

I am sorry, David, but this is not the way I 'normally address' other
people's opinions.

I am not trying to start a "misinformation campaign" but was hoping
to get some polite answers to my calmly expressed opinion here - more
polite than the one you have just sent. But such is life; I can
stand it.

> Knuth is not the "current maintainer". Knuth is the author of TeX.
> He is not maintaining TeX at all.

Wait a moment, you are mixing up two different things:

1. You say that Knuth is not the maintainer - which is wrong.

2. You repeat what I wrote: maintenance is frozen.

Of course the maintainer of a program has the right to declare a
feature freeze - but this does not mean he is no longer the
maintainer.

See what Frank wrote: that is a very interesting posting, since he
says that in his opinion it is a _good_ thing that [TeX itself] "stays
frozen and the community moves forward to choose a successor as the
formatter for day-to-day operations one day."

While I see this differently, I can accept that position - what I can
_not_ accept is the behaviour of (skilled!) people like you, who
treat others the way you do.

Sorry.

David Kastrup

Jun 5, 2004, 9:44:02 AM
Karl-Heinz Zimmer <k...@kde.org> writes:

> On Saturday, 5 June 2004, at 14:58, David Kastrup wrote:
>
> > Karl-Heinz Zimmer <k...@kde.org> writes:
> >
> >> having learned that Knuth, as TeX's inventor and current maintainer,
> >
> > Oh good grief. After your misinformation campaign in
> > de.comp.text.tex, you now move straight on to comp.text.tex to keep
> > talking the same nonsense here.
>
> I am sorry, David, but this is not the way I 'normally address'
> other people's opinions.

No. You just choose to ignore any explanation. It is not as if this
has not been explained to you, in detail, by several different people
in dozens of posts already.

That is not "normal addressing" of their efforts either. I am not
going to fall down in admiration of your self-chosen ignorance in the
face of repeated and detailed explanations.

Karl-Heinz Zimmer

Jun 5, 2004, 10:06:13 AM
On Saturday, 5 June 2004, at 15:44, David Kastrup wrote:

> Karl-Heinz Zimmer <k...@kde.org> writes:
>
>> On Saturday, 5 June 2004, at 14:58, David Kastrup wrote:
>>
>> > Karl-Heinz Zimmer <k...@kde.org> writes:
>> >
>> >> having learned that Knuth, as TeX's inventor and current maintainer,
>> >
>> > Oh good grief. After your misinformation campaign in
>> > de.comp.text.tex, you now move straight on to comp.text.tex to keep
>> > talking the same nonsense here.
>>
>> I am sorry, David, but this is not the way I 'normally address'
>> other people's opinions.
>
> No. You just choose to ignore any explanation. It is not as if this
> has not been explained to you, in detail, by several different people
> in dozens of posts already.
>
> That is not "normal addressing" of their efforts either. I am not
> going to fall down in admiration of your self-chosen ignorance in the
> face of repeated and detailed explanations.

As I wrote before: your lack of politeness did not prevent me from
understanding that the core of the problem is Knuth's decision not to
accept new feature patches to TeX but to close its development.

That is the very reason for my posting here: I want to find out other
people's opinions on _whether_ there is a chance of solving that
problem by either

A) Knuth stops the feature-freeze, resumes normal maintainership.

or


B) Knuth hands over TeX maintainership to another person who is
   willing to actively maintain the project.

There was absolutely _no_ reason for you to insult me here and repeat
your old statements, which I understood (and told you before that I
understand them).

The facts you describe are not wrong, but I asked about something
else.

Johannes Mueller

Jun 5, 2004, 10:37:52 AM
Karl-Heinz Zimmer <k...@kde.org> writes:

>
> That is the very reason for my posting here: I want to find out other
> people's opinions on _whether_ there is a chance of solving that problem
> by either

The thing is: There is no problem.

joh

Frank Mittelbach

Jun 5, 2004, 10:41:32 AM
Karl-Heinz Zimmer wrote:


> That is the very reason for my posting here: I want to find out other
> people's opinions on _whether_ there is a chance of solving that problem
> by either
>
> A) Knuth stops the feature-freeze, resumes normal maintainership.
> or
> B) Knuth hands over TeX maintainership to another person who is
> willing to actively maintain the project.

the answers to both are no (from Don; see other posting) and I second them.

i do however think that the time is ripe to move from TeX as the
underlying engine in a standard TeX distribution to eTeX (or a stable
(and frozen :-) pdfetex), and that is what is currently happening.

i also think that the real challenges in high-quality typesetting are
still not sufficiently researched and understood, and that most
current successors are attempts to find better solutions without
actually being on a single clear path. which is why, again, i rather
welcome forks that experiment in different directions.


TeX, besides being a program, is a language standard like XML or what
have you, and those do not evolve either (without changing their
name), and for good reason. granted, most standards change by
acquiring version numbers, so the situation looks similar to that of
program updates, but it is really not the case if you look more
closely.

it is also the big difference between free software licenses like the
GPL (which i think is great for some tasks), which preserves one kind
of freedom, and the LPPL, which preserves another kind of freedom
(and which is important for other tasks).

I understand you as coming from the ideas behind the GPL, which is,
as i said, something i value very much, e.g. for KDE, but not for
something like TeX. It might be interesting for you to learn a bit
more about the rationale behind other ideas, e.g. read

ftp://ftp.tex.ac.uk/tex-archive/macros/latex/doc/modguide.pdf

or the 1000+ messages in the debian-legal archives from 2002/2003
that resulted in the LPPL being accepted as DFSG compliant

best
frank

Karl-Heinz Zimmer

Jun 5, 2004, 10:43:23 AM

OK, I see that this is your opinion, and also Torsten's and David's...
but please take a minute and read this text:

"The main reason for the changes was the fact that I had
guessed wrong about 7-bit character sets versus 8-bit
character sets. I believed that standard text input would
continue indefinitely to be confined to at most 128
characters, since I did not think a keyboard with 256
different outputs would be especially efficient. Needless
to say, I was proved wrong, especially by developments in
Europe and Asia. As soon as I realized that a text
formatting program with 7-bit input would rapidly begin
to seem as archaic as the 6-bit systems we once had, I
knew that a fundamental revision was necessary."

So wrote Knuth in 1989 when announcing his last revision.

Reading this I see a lot of reason for hope: if Knuth was able to
understand "that a fundamental revision was necessary" - even though
he did NOT like the idea - then he might be able to understand that
the same is true today: now the issue is called Unicode; back then it
was called 8-bit.

Please help me understand why 'everybody' seems to be absolutely
sure that it is nonsense to think Knuth will understand that.

Karl-Heinz Zimmer

Jun 5, 2004, 10:45:34 AM
On Saturday, 5 June 2004, at 16:37, Johannes Mueller wrote:

Hi,

just in case your news server was not fast enough to cancel my last
posting: please regard my previous answer to your posting as void.

Frank's posting already answered the question, and I now think I've
got it.

Karl-Heinz Zimmer

Jun 5, 2004, 10:53:17 AM
<intentionally ignoring F'up-To:poster>

On Saturday, 5 June 2004, at 16:41, Frank Mittelbach wrote:

> Karl-Heinz Zimmer wrote:
>
>> That is the very reason for my posting here: I want to find out other
>> people's opinions on _whether_ there is a chance of solving that
>> problem by either
>>
>> A) Knuth stops the feature-freeze, resumes normal maintainership.
>> or
>> B) Knuth hands over TeX maintainership to another person who is
>> willing to actively maintain the project.
>
> the answers to both are no (from Don; see other posting) and I
> second them.
>
> i do however think that the time is ripe to move from TeX as the
> underlying engine in a standard TeX distribution to eTeX (or a
> stable (and frozen :-) pdfetex), and that is what is currently
> happening.

(...)

Thank you for the interesting comment; I now have another theory as
to why all of you cry out (more or less loudly) at my questions:

Might it be that you feel that the overall benefits of a big
re-write like PDFTeX or ExTeX are far greater than the small
benefits that could be achieved if (if!) Knuth could be persuaded
to accept a few more contributions?

This would also explain why Torsten tells me the community can
live quite well with the state of TeX and appreciates the
derivatives...

So the end of the story is that I was wrong, and you do NOT feel
that the big freeze is a bad thing: partially because you like
the idea of starting from scratch, or starting based upon the
old rui^H^H^HTeX code, as PDFTeX does.

Is that correct?

Karl-Heinz

Thomas Widmann

Jun 5, 2004, 11:01:41 AM
Karl-Heinz Zimmer <k...@kde.org> writes:

> Might it be that you feel that the overall benefits of a big
> re-write like PDFTeX or ExTeX are far greater than the small benefits
> that could be achieved if (if!) Knuth could be persuaded to accept a
> few more contributions?

You seem to imply that people are crying out for just a few more
contributions to be added to TeX. Why do you think that, and what
kind of contributions do you have in mind?

/Thomas
--
Thomas Widmann Bye-bye to BibTeX: join the Bibulus project now!
tw...@bibulus.org <http://www.bibulus.org>
Glasgow, Scotland, EU <http://savannah.nongnu.org/projects/bibulus/>

Frank Mittelbach

Jun 5, 2004, 11:05:25 AM
Karl-Heinz Zimmer wrote:

> <intentionally ignoring F'up-To:poster>

which is not nice; given my workload i'm not regularly following news,
so i rather like being reminded (even though right now i am following)


>> i do however think that the time is ripe to move from TeX as the
>> underlying engine in a standard TeX distribution to eTeX (or a
>> stable (and frozen :-) pdfetex), and that is what is currently
>> happening.
> (...)
>
> Thank you for the interesting comment; I now have another theory as
> to why all of you cry out (more or less loudly) at my questions:

you seem to be a fast reader (have you already looked a bit into the
philosophy behind the frozen name TeX, or the LPPL, etc.?), or is
this new theory based solely on my short postings?


> Might it be that you feel that the overall benefits of a big
> re-write like PDFTeX or ExTeX are far greater than the small
> benefits that could be achieved if (if!) Knuth could be persuaded
> to accept a few more contributions?

yes


> This would also explain why Torsten tells me the community can
> live quite well with the state of TeX and appreciates the
> derivatives...

yes

>
> So the end of the story is that I was wrong, and you do NOT feel
> that the big freeze is a bad thing: partially because you like
> the idea of starting from scratch, or starting based upon the
> old rui^H^H^HTeX code, as PDFTeX does.

that is rubbish, but i have a concert to attend and not the time to
go into it right now.

by the way, was that "rubbish code" above that got changed into TeX
code?


> Is that correct?

no

frank

Karl-Heinz Zimmer

Jun 5, 2004, 11:10:20 AM
On Saturday, 5 June 2004, at 17:01, Thomas Widmann wrote:

> Karl-Heinz Zimmer <k...@kde.org> writes:
>
>> Might it be that you feel that the overall benefits of a big
>> re-write like PDFTeX or ExTeX are far greater than the small benefits
>> that could be achieved if (if!) Knuth could be persuaded to accept a
>> few more contributions?
>
> You seem to imply that people are crying out for just a few more
> contributions to be added to TeX. Why do you think that, and what
> kind of contributions do you have in mind?

My original point was this:

TeX is a great system, but it cannot do optical alignment of
characters at the right margin.
PDFTeX, however, can do that.
Since such a feature (at least IMHO) touches the basic scope of TeX,
it would have been 'natural' to include it in plain TeX.

In the meantime I have understood that people just do not care about
this question, since they (a) have given up expecting Knuth to wake up
and/or (b) appreciate the _new_ developments or derived works like
e.g. PDFTeX.

No problem; so the answer to my initial questions is "No" on both
counts, and the answer to the question in the subject line is "Who
cares!".

:-)

Karl-Heinz Zimmer

Jun 5, 2004, 11:13:41 AM
On Saturday, 5 June 2004, at 17:05, Frank Mittelbach wrote:
> Karl-Heinz Zimmer wrote:

>> So the end of the story is that I was wrong, and you do NOT feel
>> that the big freeze is a bad thing: partially because you like
>> the idea of starting from scratch, or starting based upon the
>> old rui^H^H^HTeX code, as PDFTeX does.
>
> that is rubbish, but i have a concert to attend and not the time
> to go into it right now.
>
> by the way, was that "rubbish code" above that got changed into
> TeX code?

Nope, sorry, that should have been "starting upon the old ruins of
TeX code", but forget it. :-)

Karl-Heinz


David Kastrup

Jun 5, 2004, 11:32:56 AM
Karl-Heinz Zimmer <k...@kde.org> writes:

> My original point was this:
>
> TeX is a great system, but it cannot do optical alignment of
> characters at the right margin.

It can't do so using PDFTeX's mechanisms. However, there is a less
convenient method explained in the TeXbook, and even the first
edition of The LaTeX Companion was set with either optical alignment
or full hanging punctuation (I do not remember quite which).
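
A minimal plain-TeX sketch of the kind of inconvenient trick being
alluded to here - an illustration, not the TeXbook's actual code. The
period is made active and typesets as a zero-width box followed by
glue of its natural width; when a line breaks right after it, the glue
vanishes and the period hangs into the margin:

  % Hanging periods, sketched in plain TeX. Illustrative only: making
  % `.' active also breaks periods inside numbers such as 3.14, which
  % is part of what makes the method inconvenient.
  \newdimen\periodwidth
  \setbox0=\hbox{.} \periodwidth=\wd0
  \catcode`\.=\active
  \def.{\leavevmode\rlap{\char`\.}\hskip\periodwidth\relax}

pdfTeX replaces this kind of hack with its protrusion primitives.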

> PDFTeX, however, can do that. Since such a feature (at least IMHO)
> touches the basic scope of TeX, it would have been 'natural' to
> include it in plain TeX.

plain TeX is a macro format. And it has been included into TeX, and
the resulting TeX is called PDFTeX.

> In the meantime I have understood that people just do not care about
> this question, since they (a) have given up expecting Knuth to wake
> up

Stop insinuating. Just because you refuse to listen to Knuth or
anybody else, or to read the relevant documents, does not imply that
anybody else chooses to live in a fantasy world where Knuth is "just
sleeping". Nobody except you has _ever_ expected Knuth to "wake up",
since he did not fall asleep but left. And Knuth has completely
deliberately and pointedly declared the development of TeX to be
over. This never was a question of "falling asleep".

> and/or (b) appreciate the _new_ developments or derived works like
> e.g. PDFTeX.

What is a "new" development in your book? Developments always are
new.

Karl-Heinz Zimmer

Jun 5, 2004, 11:35:37 AM
On Saturday, 5 June 2004, at 17:32, David Kastrup wrote:

> Karl-Heinz Zimmer <k...@kde.org> writes:

>> and/or (b) appreciate the _new_ developments or derived works like
>> e.g. PDFTeX.
>
> What is a "new" development in your book? Developments always are
> new.

Yes, sure. So do you agree that it is NOT frustrating that such new
features cannot become part of TeX, but that instead new projects
like PDFTeX address the needs of today?

Probably my thinking of TeX as a "programming project" was wrong,
and that is why I thought about finding a way to "wake up" the
author.

David Kastrup

Jun 5, 2004, 12:20:00 PM
Karl-Heinz Zimmer <k...@kde.org> writes:

> On Saturday, 5 June 2004, at 17:32, David Kastrup wrote:
>
> > Karl-Heinz Zimmer <k...@kde.org> writes:
>
> >> and/or (b) appreciate the _new_ developments or derived works like
> >> e.g. PDFTeX.
> >
> > What is a "new" development in your book? Developments always are
> > new.
>
> Yes, sure. So do you agree that it is NOT frustrating that such new
> features cannot become part of TeX, but that instead new projects
> like PDFTeX address the needs of today?

Could you please stop irritating everybody with loaded questions?
Such new features _have_ been incorporated into TeX and the result is
called PDFTeX.

Karl-Heinz Zimmer

Jun 5, 2004, 12:27:27 PM
On Saturday, 5 June 2004, at 18:20, David Kastrup wrote:

> Karl-Heinz Zimmer <k...@kde.org> writes:
>
>> On Saturday, 5 June 2004, at 17:32, David Kastrup wrote:
>>
>> > Karl-Heinz Zimmer <k...@kde.org> writes:
>>
>> >> and/or (b) appreciate the _new_ developments or derived works like
>> >> e.g. PDFTeX.
>> >
>> > What is a "new" development in your book? Developments always are
>> > new.
>>
>> Yes, sure. So do you agree that it is NOT frustrating that such new
>> features cannot become part of TeX, but that instead new projects
>> like PDFTeX address the needs of today?
>
> Could you please stop irritating everybody with loaded questions?
> Such new features _have_ been incorporated into TeX and the result is
> called PDFTeX.

David, you won, I give up.

Robin Fairbairns

Jun 5, 2004, 2:57:20 PM
Karl-Heinz Zimmer <k...@kde.org> writes:
>On Saturday, 5 June 2004, at 14:58, David Kastrup wrote:
>> Karl-Heinz Zimmer <k...@kde.org> writes:
>>> having learned that Knuth, as TeX's inventor and current maintainer,
>>
>> Oh good grief. After your misinformation campaign in
>> de.comp.text.tex, you now move straight on to comp.text.tex to keep
>> talking the same nonsense here.
>
>I am sorry, David, but this is not the way I 'normally address' other
>people's opinions.
>
>I am not trying to start a "misinformation campaign" but was hoping
>to get some polite answers to my calmly expressed opinion here - more
>polite than the one you have just sent. But such is life; I can
>stand it.

felt like a misinformation campaign to me. david's views are
characteristically forthright, but they are as nothing to what you'll
experience if you attract the attention of the people i characterise
as the "knuth is god" group.

for myself, i merely think knuth is a wise man. he knows as well as
the rest of us that current tex is fundamentally un-extendable. you
perhaps haven't looked at the code: it's written in a fascinating
1970s (or early 1980s) mechanism of knuth's, known as "literate
programming". there is essentially no internal modularity in the code
of tex; it's written in pascal, most variables are global, and most
functions return results via side-effects -- were even such a modest[*]
program as tex to be started today, the end product would be
unbelievably different.

the excellent e-tex extensions are probably the last that will, or
even could, be performed on tex in a style that flows with knuth's.

both pdftex and omega progressed by small changes to core tex, with
most of the extra functionality in some language other than pascal.
the omega people are just putting the final touches to a complete
re-write of everything -- tex and omega extensions all in c++ (euch).

tex is _not_ a target for any sane person to extend. not even its
original author ... who, as i said, is a wise man; wise enough to do
everything in his power to prevent people doing what you suggest.

anyone who believes the ideas behind tex have a future should not be
trying to change tex the program, but should be throwing their support
behind one or other of the tex-based projects that david lists.

[*] by today's standards, tex is a tiny program.
--
Robin (http://www.tex.ac.uk/faq) Fairbairns, Cambridge

David Kastrup

Jun 5, 2004, 3:23:31 PM
r...@cl.cam.ac.uk (Robin Fairbairns) writes:

> Karl-Heinz Zimmer <k...@kde.org> writes:
> >On Saturday, 5 June 2004, at 14:58, David Kastrup wrote:
> >> Karl-Heinz Zimmer <k...@kde.org> writes:
> >>> having learned that Knuth, as TeX's inventor and current maintainer,
> >>
> >> Oh good grief. After your misinformation campaign in
> >> de.comp.text.tex, you now move straight on to comp.text.tex to keep
> >> talking the same nonsense here.
> >
> >I am sorry, David, but this is not the way I 'normally address' other
> >people's opinions.
> >
> >I am not trying to start a "misinformation campaign" but was hoping
> >to get some polite answers to my calmly expressed opinion here - more
> >polite than the one you have just sent. But such is life; I can
> >stand it.
>
> felt like a misinformation campaign to me.

You haven't seen his battery of postings in de.comp.text.tex. If
somebody is under the mistaken impression that TeX can be called an
active or even dormant project with Knuth as maintainer, that is just
a mistake, and that's it. But if he keeps campaigning for his
misconceptions time and again, even though quite a few people try to
explain to him that they are completely misguided and why, it gets
aggravating after a while. And if he then starts the same unchanged
litany in a different forum, because everybody in de.comp.text.tex
tells him he is out of whack and why, it becomes hard to fathom any
motivation ascribable to a sentient being endowed with reason. Maybe
it is some sort of Turing test programmed in honor of Knuth, or of the
recent 25-year celebration of TeX, or something. Whatever.

What I wanted to say is: thanks for defending me, but the
comp.text.tex history alone does not account for the level of steam
built up in my reply.

I have already unsubscribed from de.comp.text.tex for a while,
because some people felt utterly offended by my comparing his
repeated loaded questions to the classic "have you stopped beating
your wife yet" line.

Karl-Heinz Zimmer

Jun 5, 2004, 3:33:53 PM
On Saturday, 5 June 2004, at 21:23, David Kastrup wrote:
> r...@cl.cam.ac.uk (Robin Fairbairns) writes:

> I have already unsubscribed from de.comp.text.tex for a while,
> because some people felt utterly offended by my comparing his
> repeated loaded questions to the classic "have you stopped beating
> your wife yet" line.

Which is something I could not understand; you surely noticed that I
immediately responded to the people accusing you of this - that was
not the kind of wording I complained about, but a rather
understandable phrase.

I would really regret it if you unsubscribed from de.comp.text.tex
because of some reactions arising from the dispute we had there: your
postings are needed in that group far more than mine are.

So please reconsider that decision, and accept my asking for pardon
instead of staying away from the group.

David Kastrup

Jun 5, 2004, 3:51:25 PM
Karl-Heinz Zimmer <k...@kde.org> writes:

> On Saturday, 5 June 2004, at 21:23, David Kastrup wrote:
> > r...@cl.cam.ac.uk (Robin Fairbairns) writes:
>
> > I have already unsubscribed from de.comp.text.tex for a while,
> > because some people felt utterly offended by my comparing his
> > repeated loaded questions to the classic "have you stopped beating
> > your wife yet" line.
>
> So please reconsider that decision, and accept my asking for pardon
> instead of staying away from the group.

I did not unsubscribe because of you, but because my tone was deemed
unacceptable by qualified, capable and helpful people there. Since
the level of actively participating expertise in the group is
currently very high, less harm is done if I just stay away until the
emotions have subsided than if I continue answering in my accustomed
manner, getting on good contributors' nerves or leading them into
infighting over the etiquette that should apply to me, or whatever.

There has rarely been a "manners for answering" discussion that has
not led to one or several qualified people turning their backs on the
group, at least temporarily. Not a good idea to provoke another one
of those. They lead nowhere.

Karl-Heinz Zimmer

Jun 5, 2004, 3:56:25 PM

Yes, you are right; we have played this game several (too many) times
in the Linux groups, and each time the result was a never-ending
thread from which absolutely nothing came. :(

Perhaps it would have been wiser had I not started these discussions.

For now there is only one simple question left: if I start
contributing to one of the new projects, how can I be sure this work
does not lead to more incompatibility? My main reason for not liking
forks is that it is normally a big problem to stay compatible when
there is more than one fork.

Stefan Nobis

Jun 5, 2004, 4:03:12 PM
Karl-Heinz Zimmer <k...@kde.org> writes:

> Yes, sure. So do you agree that it is NOT frustrating that such new
> features cannot become part of TeX, but that instead new projects
> like PDFTeX address the needs of today?

Hmmm... there is a program that does all you want: it gives you the
new features and at the same time all the old features. And there is
another program that gives you only the old features.

So you say it's frustrating to keep using the old program when you so
badly want the new features.

Why not just use the new program? What the hell is your problem?

--
Stefan.

Karl-Heinz Zimmer

Jun 5, 2004, 4:16:58 PM

Since there is more than one new program, "my problem" is compatibility.

Which of the projects should I contribute to?
Which one will end up a lost and lonely branch on the tree of TeX
derivatives?

Note: this is not about "what to use" but "what to code on", and I
had my own modest skills in mind when I started this irritating
thread.

Life would be easier if I could just go to Knuth and say "Here is my
patch" - since this is not possible, I first have to think about
things like "Is my Java knowledge good enough?" or "Will the world
finally come to prefer PDFTeX, so that I should support that
project?" or "If I add this or that functionality, might it lead to
more incompatibility with the other new projects?" and so on...

There is nothing _wrong_ with these forks, but from a programmer's
point of view that is not an ideal situation.

David Kastrup

Jun 5, 2004, 5:06:00 PM
Karl-Heinz Zimmer <k...@kde.org> writes:

> On Saturday, 5 June 2004, at 21:51, David Kastrup wrote:
>
> > There has rarely been a "manners for answering" discussion that
> > has not led to one or several qualified people turning their backs
> > on the group, at least temporarily. Not a good idea to provoke
> > another one of those. They lead nowhere.
>
> Yes, you are right; we have played this game several (too many)
> times in the Linux groups, and each time the result was a
> never-ending thread from which absolutely nothing came. :(
>
> Perhaps it would have been wiser had I not started these
> discussions.

Forget it. You did not start a manners discussion. You got me to
display bad temper, but that's not particularly difficult.

> For now there is only one simple question left: if I start
> contributing to one of the new projects, how can I be sure this work
> does not lead to more incompatibility?

By contributing to a project that concentrates on decreasing
incompatibilities.

> My main reason for not liking forks is that it is normally a big
> problem to stay compatible when there is more than one fork.

Right. Currently "feasible" forks to work on (and responsible
persons) are:

eTeX Peter Breitenlohner
PDFTeX Han The Thanh
Omega2 John Plaice
Aleph Giuseppe Bilotta
ExTeX Michael Niedermair

eTeX is currently stagnating at version 2. Changes in typesetting
are probably best brought into eTeX, from which they will eventually
make it into PDFeTeX, Aleph, and, by future osmosis, ExTeX. If you
have particular ideas, it would probably be smartest to contact
either Peter Breitenlohner or TUG's tex-implementors list. I could
also imagine a bit of backporting from Omega into eTeX: the
left/right box stuff in particular would be convenient to have. I
could also propose quite a few accounting and disassembling
primitives that would be quite beneficial to have. eTeX is basically
written as change files to the Pascal WEB code.
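
For readers who have not met the change-file mechanism just mentioned:
a WEB change file patches tex.web without editing it, as match/replace
blocks that tie and tangle merge into the Pascal source. A minimal,
purely illustrative sketch - the matched line is paraphrased rather
than copied verbatim from tex.web, and any derived engine must pass
the trip test or stop calling itself TeX:

  @x  a hypothetical fragment of a change file, mytex.ch
  @d banner=='This is TeX, Version 3.141592'
  @y
  @d banner=='This is MyTeX, Version 0.1 (based on TeX)'
  @z

eTeX's etex.ch (and originally pdfTeX's changes, too) consists of
stacks of exactly such blocks on top of Knuth's source.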

PDFTeX is an area of active work from time to time, as long as Han
The Thanh manages to get financial support from user groups. New
development actually gets done here, and new typesetting features
make it into PDFTeX. The PDF backend itself is not being worked on
much at the moment; it basically works.

Omega2 is a 31-bit character set extension of TeX. It will solve
quite a few typesetting problems, as well as most character set
problems, once it is there. It is not there yet, and actually nobody
has seen much of it lately. There is supposed to be a publicly
accessible CVS archive. Ask John about details. Better yet, pester
him to finish it. He needs a cheering squad. Omega2 is mostly
written in C++, or is supposed to be; at the moment, probably some
Pascal remains.

Aleph has merged the last usable Omega1 version with eTeX.
Documentation for Omega1 is sparse, and tutorials and examples are
close to non-existent. That is because nobody will need Omega1 once
Omega2 is there, and things will be different. Except that this state
has persisted for years now. Working on Omega1/Aleph documentation,
code, tutorials and examples might be a dead end, stopping when and
where Omega2 gets usable, but that could happen this month, or it
could happen in 10 years. And frankly, most of that work could
probably be adapted. I think this would be a worthwhile project: if
it could lead to lambda1 or Aleph becoming a new TeX engine, it would
vastly increase the Omega1 user base and give people ideas. And if
enough people get ideas and pester John, so much the better. Omega1
was, IIRC, again basically Pascal change files. Giuseppe is pretty
much tied up with his thesis right now, but I guess that getting
write access to the Aleph code base might prove not too difficult.
There is quite a bit of potential for working on that, but the
results will probably not make it into the rest as easily as if you
worked on eTeX instead.

ExTeX is basically a reimplementation of PDFeTeX, with a bit of
Omega functionality and large character sets, in Java. Actually, NTS
was pretty much the same, but ExTeX will be quite a bit cleaner, or
that's the intent. ExTeX still needs lots of work before one can
even talk about "extending" it. It requires rather recent Java
systems from Sun. There is little point in worrying about
compatibility with older systems before one has a system running: the
backward-compatibility concerns might well be obsolete by the time
ExTeX gets close to finished.

So that's basically your choice of areas to work on. Fragmentation
can be hemmed in by the following approaches:

Work on eTeX. That usually makes it everywhere after a while.

Work on spreading Aleph. If Aleph/Omega becomes usable, stable and
documented enough to be turned into the default engine for LaTeX,
PDFTeX will probably feel enough pressure to follow suit.

Work on finding out what it would take to get Omega2 finished.
People everywhere will wish you good luck.


David Fuchs

Jun 5, 2004, 5:12:04 PM
Perhaps it needs to be pointed out that Knuth wants TeX to be, above
all things, stable. A plain TeX document from 1982 is runnable today,
and will produce the exact same line and page breaks. You can't open
a Microsoft Word (or WordPerfect or any other word-processing
product) Version 1 document with any 2004 product and get a perfect
match; you can't run the old versions of the software that created
those documents on any 2004 operating system, either. I'll bet you
can't even compile any significant fraction of C or C++ programs from
1982 unchanged with a 2004 compiler (due to library and include-file
issues, if nothing else). The same goes for VisiCalc spreadsheets,
dBase databases, etc., etc. The absolute goal of the way TeX was
developed was to ensure this very strong and unprecedented
future/past compatibility.

Adding a new feature to plain TeX would jeopardize the main feature,
stability. Hard as you may try, somewhere out there in the millions
of existing TeX documents there is going to be an edge case to trip
you up. And even if you do manage to avoid any incompatibility, there
is still the problem that a large portion of TeX users won't or can't
upgrade, and that would have a large negative impact on the goal of
universal document interchangeability.

So that's why things are the way they are: anyone with TeX can run
any TeX document from any place or time.

(All this being said, there's no problem with adding new features on
new branches; just don't call the result "TeX". And, of course, some
day there might well be a new de facto standard for technical
documents, and perhaps it will be based more or less closely on TeX.)


William F. Adams

Jun 5, 2004, 8:43:07 PM
I'm kind of mystified that the OP wasn't pointed to NTS as one
example of an effort to create a compleat successor to TeX by means
of re-writing. (A very different one is the ANT project.)

I guess this all falls back to the complexities of defining, ``What is TeX?''

- a WEB program copyrighted by Donald E. Knuth, trademarked by the
AMS, published by Addison-Wesley as _TeX: The Program_, and
feature-frozen except for periodic bug reviews. Explicit permission
is granted for copying and for the creation of executables with new /
different / varied capabilities by way of ``change files'' (which may
include ``TeX'' as part of their name so long as they pass the ``trip
test'')

- a collection of algorithms placed in the public domain, used as an
element of, or the basis of, a number of other computer tools /
applications, both opensource (pdftex, Omega, Aleph) and commercial
(Adobe's InDesign uses URW's HZ algorithm, which was based on TeX)

- a broad umbrella of systems for accomplishing typesetting on a variety of
systems (there's some overlap with the above) distributed under various
licenses.

Really, there should be enough in all of the above to make _anyone_
happy (unless, of course, one wants _all_ of the features in a single
tool---I too would dearly love to see a pdfeomega with xetex
extensions written in Java (I think that covers everything but ANT
;)).

For example, the optical-alignment bit is a non-issue, since pdftex
can stand in as a replacement for regular tex, producing a .dvi while
still allowing character protrusion / hanging punctuation (this was
done in _The LaTeX Companion, 2nd Edition_).
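
A minimal sketch of what switching this on looks like in plain
pdftex, using its protrusion primitives (the numeric values are
illustrative choices, not canonical ones):

  % Enable margin protrusion; \rpcode values are given in thousandths
  % of the character's width, so 1000 lets a glyph hang out entirely.
  \pdfprotrudechars=2
  \rpcode\font`\.=1000   % full protrusion for periods
  \rpcode\font`\,=1000   % and for commas
  \rpcode\font`\-=700    % hyphens protrude 70% of their width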

I'd suggest that anyone interested in the state and history of TeX
read DEK's _Digital Typography_; as for the future, the preprints of
the 2004 meeting shipped recently and contain a number of papers
looking forward to successors and some of the features they might
hold.


--
William Adams
http://members.aol.com/willadams
Sphinx of black quartz, judge my vow.

Joerg Fischer

Jun 6, 2004, 1:23:42 AM
* Karl-Heinz Zimmer wrote:
> There is nothing _wrong_ with these forks, but from a programmer's
> point of view that is not an ideal situation.

The complete sources of TeX (the original unforked one) can be found at
http://www.ctan.org/tex-archive/systems/knuth/tex/

They are all in the file `tex.web'.

IMO it would be a good idea to get properly informed about a
topic before posting too many wise things about it.

Cheers,
Jörg

Karl-Heinz Zimmer

Jun 6, 2004, 3:25:23 AM
On Sunday, 6 June 2004, at 07:23, Joerg Fischer wrote:

> * Karl-Heinz Zimmer wrote:
>> There is nothing _wrong_ with these forks, but from a programmer's
>> point of view that is not an ideal situation.
>
> The complete sources of TeX (the original unforked one) can be found at
> http://www.ctan.org/tex-archive/systems/knuth/tex/
>
> They are all in the file `tex.web'.

Sure, I knew.

> IMO it would be a good idea to get properly informed about a
> topic before posting too many wise things about it.

Joerg, you are absolutely right.

John Culleton

Jun 6, 2004, 2:45:47 PM
Frank Mittelbach <frank.mi...@latex-project.org> wrote in message news:<c9sh0t$6ih$1...@online.de>...
> Karl-Heinz
>
> > having learned that Knuth, as TeX's inventor and current maintainer,
> > has been following a 'feature-freeze' policy for some years and only
> > accepts a very limited number of minor adjustments to TeX, I am
> > wondering if there might be any chance of asking him to stop this
> > freeze and let the TeX project return to normal development again.
>
> i think you completely misunderstand the situation.
>
> "TeX-like" development is not dead at all; there are people working
> on it in several directions, and that is good: e.g. eTeX, pdftex,
> omega, nts and its offspring, etc.
>
> what has ended is Don's involvement in the matter, and the only thing
> that is frozen is that TeX and Metafont as names are supposed to
> refer to two very precisely defined programs with a very precisely
> defined set of features


What are not frozen are the formats, which Knuth in his wisdom made a
mandatory feature of TeX. Anyone can take any part of plain.tex and
fiddle to his/her heart's content. Or one can build an entirely new
set of conventions while maintaining compatibility with the plain
format, as in ConTeXt. The only real limitation is the essentially
non-WYSIWYG and batch workflow of TeX, and there may be ways to work
around that. I have my own workarounds, using Gvim as a controlling
program and editor and Xpdf as the WYSIWYG component. True, we are
constrained a bit by Knuth's original scheme, but we are also
constrained by the ASCII character set, and indeed by whatever
language we use to express algorithms. The canonical TeX program is a
bit like the musical scale: there is plenty of room to maneuver
within the 12 basic tones.
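
A two-line illustration of the fiddling meant here: plain.tex's
macros are ordinary user-level definitions, so any document or new
format can override them without touching the engine. The first
definition is the one actually found in plain.tex; the variant is a
made-up example:

  % plain.tex's definition of the TeX logo:
  \def\TeX{T\kern-.1667em\lower.5ex\hbox{E}\kern-.125em X}
  % fiddle to your heart's content, e.g. loosen the kerning a little:
  \def\TeX{T\kern-.12em\lower.4ex\hbox{E}\kern-.1em X}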

Given this adaptability I see no reason to restring the piano. Not
all the tunes have been written yet. And one can of course take even
the frozen code, rewrite it, and call it something else, as with
MetaPost. The basic paragraph-setting algorithms have been adapted to
help create InDesign.

But even without any modification, TeX continues to be a programming
language, one specialized for the task of formatting text and laying
out pages. Those who feel the urge to program can find plenty of
opportunity to exercise their skill.

John Culleton

David Kastrup

Jun 6, 2004, 2:57:46 PM
jo...@wexfordpress.com (John Culleton) writes:

> Frank Mittelbach <frank.mi...@latex-project.org> wrote in message news:<c9sh0t$6ih$1...@online.de>...
>

> > what has ended is Don's involvment in that matter and the only
> > thing that is frozen is that TeX and Metafont as names are
> > supposed to refer to two very precisely defined programs with a
> > very precisely defined set of features
>
> What are not frozen are the formats, which Knuth in his wisdom made
> a mandatory feature of TeX.

Oh, come off it. They are just a speed optimization, and of little
relevance nowadays.

> Anyone can take any part of plain.tex and fiddle to his/her heart's
> content.

But he may not call the result plain TeX. And anyone can take any
part of tex.web and fiddle to his/her heart's content. But he may
not call the result TeX. Same difference.

> Or one can build an entirely new set of conventions while
> maintaining compatibility with the plain format, as in ConTeXt. The
> only real limitation is the essentially non-WYSIWYG and batch
> workflow of TeX, and there may be ways to work around that.

Nonsense. You can make things as WYSIWYG as you want, as long as
you don't call the result TeX.

You seem to be under a delusion as to TeX's licence. It is in the
Public Domain. You can do whatever you want with it. But the name
TeX is trademarked, and you can call things "TeX" or "plain TeX" only
under specific conditions (the condition for calling something "plain
TeX" basically is that you are Don Knuth, so it is more difficult
than calling things "TeX").

Phillip Helbig---remove CLOTHES to reply

Jun 6, 2004, 3:27:18 PM
In article <x5r7stw...@lola.goethe.zz>, David Kastrup <d...@gnu.org>
writes:

> Right. Currently "feasible" forks to work on (and responsible
> persons) are:
>
> eTeX Peter Breitenlohner
> PDFTeX Han The Thanh
> Omega2 John Plaice
> Aleph Giuseppe Bilotta
> ExTeX Michael Niedermair

I used (La)TeX quite a lot up until about three-and-one-half years ago.
I am now starting to want to use it again, and the prospect is daunting.
Back in the good old days, there was LaTeX, which of course was a macro
package for TeX. TeX was stable. LaTeX, especially after the LaTeX2e
stuff came out, was on the one hand stable but also easily extensible
via the packages etc. Whatever one wanted, one knew where to look, and
if something wasn't available, one could write it oneself.

Now, I don't know where to start. It would be a shame to go with one
branch, then see it wither and die.

Could someone provide a brief summary of the above branches, for someone
who has been out of it since the beginning of 2001?

Where is the best place to start for upgrading my (La)TeX stuff? Of
course, (La)TeX is portable, but from a practical standpoint I would
like to start with an out-of-the-box VMS distribution. Ralf Gärtner
used to have a good one....

David Kastrup

Jun 6, 2004, 3:41:53 PM
hel...@astro.multiCLOTHESvax.de (Phillip Helbig---remove CLOTHES to reply) writes:

> In article <x5r7stw...@lola.goethe.zz>, David Kastrup <d...@gnu.org>
> writes:
>
> > Right. Currently "feasible" forks to work on (and responsible
> > persons) are:
> >
> > eTeX Peter Breitenlohner
> > PDFTeX Han The Thanh
> > Omega2 John Plaice
> > Aleph Giuseppe Bilotta
> > ExTeX Michael Niedermair
>
> I used (La)TeX quite a lot up until about three-and-one-half years
> ago. I am now starting to want to use it again, and the prospect is
> daunting.

Nonsense. Just take TeXlive and be done.

> Back in the good old days, there was LaTeX, which of course was a
> macro package for TeX. TeX was stable. LaTeX, especially after the
> LaTeX2e stuff came out, was on the one hand stable but also easily
> extensible via the packages etc. Whatever one wanted, one knew
> where to look, and if something wasn't available, one could write it
> oneself.

No change here.

> Now, I don't know where to start. It would be a shame to go with
> one branch, then see it wither and die.

For practical use nowadays, PDFeTeX is by far the most flexible
choice, unless you also need Omega functionality, in which case you
can go for old Omega versions or, if you need eTeX functionality,
Aleph (which is basically Omega1+eTeX).

None of the _workable_ branches will "wither and die". The most that
can happen is that they become stale. TeX itself has been in that
state for more than a decade, and no TeX source has become obsolete in
consequence.

Timothy Murphy

unread,
Jun 6, 2004, 4:42:42 PM6/6/04
to
Phillip Helbig---remove CLOTHES to reply wrote:

> I used (La)TeX quite a lot up until about three-and-one-half years ago.
> I am now starting to want to use it again, and the prospect is daunting.
> Back in the good old days, there was LaTeX, which of course was a macro
> package for TeX. TeX was stable. LaTeX, especially after the LaTeX2e
> stuff came out, was on the one hand stable but also easily extensible
> via the packages etc. Whatever one wanted, one knew where to look, and
> if something wasn't available, one could write it oneself.
>
> Now, I don't know where to start. It would be a shame to go with one
> branch, then see it wither and die.
>
> Could someone provide a brief summary of the above branches, for someone
> who has been out of it since the beginning of 2001?

I just use LaTeX and pdfLaTeX, like 99.9% of people.
All these other programs are for pointy-heads,
and people who want to write maths in Sanskrit,
or play chess in Hebrew.

--
Timothy Murphy
e-mail (<80k only): tim /at/ birdsnest.maths.tcd.ie
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland

William F. Adams

unread,
Jun 6, 2004, 4:45:17 PM6/6/04
to
David said:
>You seem to be under a delusion as to TeX's licence. It is in the
>Public Domain. You can do whatever you want with it.

It's a bit more complex than that, as I noted in my post.

- tex.web -> copyright by DEK (explicit permission for verbatim copying and
extending by way of change files)
- name \TeX -> trademarked by AMS
- algorithms underlying tex -> placed in the public domain by DEK

Also, Phillip, you seem to be misunderstanding the relationship between LaTeX2e
(the macro package) and pdftex or omega (the binary which one uses to typeset
using said macro package).

Anyway, there was some discussion of VMS binaries here a couple of months
back, and you should be able to dig that out using http://groups.google.com

William

David Kastrup

unread,
Jun 6, 2004, 4:53:11 PM6/6/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

> I just use LaTeX and pdfLaTeX, like 99.9% of people.

LaTeX is a macro package, and so can run with any of the TeX variants,
and there is no such thing as "pdfLaTeX". There is a program titled
"TeX" and one called "PDFTeX", and there are executable links called
"latex" and "pdflatex".

With a current TeXlive, the executables run by "latex" and "pdflatex"
are eTeX and PDFeTeX, respectively, and the LaTeX format is used by
both.

It is quite likely that in future releases, PDFeTeX (with
differing config files, however) will be run by _both_ commands.

> All these other programs are for pointy-heads,
> and people who want to write maths in Sanskrit,
> or play chess in Hebrew.

In short: you certainly don't have a clue about what "99.9%" happen to
be running, and most likely not even what you yourself are running.

Check those startup messages next time: maybe eTeX has already crept
into your LaTeX setup without you noticing it.

Giuseppe Bilotta

unread,
Jun 6, 2004, 5:48:42 PM6/6/04
to
David Kastrup wrote:

> jo...@wexfordpress.com (John Culleton) writes:
> > What are not frozen are the formats, which Knuth in his wisdom made
> > a mandatory feature of TeX.
>
> Oh, come off it. They are just a speed optimization, and of little
> relevance nowadays.

Uh, David, sorry to contradict you but the ability to preload
formats is a *relevant* speed optimization. Maybe not for plain
or LaTeX, but do you have the *slightest* idea on how much time
it takes to create ConTeXt?

--
Giuseppe "Oblomov" Bilotta

Can't you see
It all makes perfect sense
Expressed in dollar and cents
Pounds shillings and pence
(Roger Waters)

Robin Fairbairns

unread,
Jun 6, 2004, 6:02:43 PM6/6/04
to
hel...@astro.multiCLOTHESvax.de (Phillip Helbig---remove CLOTHES to reply) writes:
[omitting the stuff about _which_ tex system to use]

>Where is the best place to start for upgrading my (La)TeX stuff? Of
>course, (La)TeX is portable, but from a practical standpoint I would
>like to start with an out-of-the-box VMS distribution. Ralf Gärtner
>used to have a good one....

then ask him. as a maintainer of the archives, and a long-time happy
user of vms, i spent some time trying to find information about vms
distributions, but have long since given up. (i did once get a new
vms distribution onto the archives, but i note that we have nothing
but odd subsystems with vms tags on them, now.)

a quick google leads me to 9-10 year old vms tex information. the
last tex i ever used on vms was compiled by me, from pascal source
direct out of weave -- some time like 1990 (istr the change file was
from brian hamilton kelly, among others).

Frank Mittelbach

unread,
Jun 6, 2004, 6:04:04 PM6/6/04
to
Giuseppe Bilotta wrote:

> David Kastrup wrote:
>> jo...@wexfordpress.com (John Culleton) writes:
>> > What are not frozen are the formats, which Knuth in his wisdom made
>> > a mandatory feature of TeX.
>>
>> Oh, come off it. They are just a speed optimization, and of little
>> relevance nowadays.
>
> Uh, David, sorry to contradict you but the ability to preload
> formats is a *relevant* speed optimization. Maybe not for plain
> or LaTeX, but do you have the *slightest* idea on how much time
> it takes to create ConTeXt?

the main argument is that formats are just memory dumps done for speed and
are not relevant to the discussion, and that statement holds.

also assuming there would be no format that can be precompiled i'm sure
context would compile differently and have a setup routine that you could
run separately to produce a sort of "ascii dump"

also do you have the slightest idea how long it took TeX to compile a single
page of plain tex math when i started using it? (about a minute)

so (sorry to contradict you :-) even context bootstrapping is not really
that much of a time problem nowadays.

cheers
frank

David Kastrup

unread,
Jun 6, 2004, 6:08:59 PM6/6/04
to
Giuseppe Bilotta <bilo...@hotpop.com> writes:

> David Kastrup wrote:
> > jo...@wexfordpress.com (John Culleton) writes:
> > > What are not frozen are the formats, which Knuth in his wisdom made
> > > a mandatory feature of TeX.
> >
> > Oh, come off it. They are just a speed optimization, and of little
> > relevance nowadays.
>
> Uh, David, sorry to contradict you but the ability to preload
> formats is a *relevant* speed optimization. Maybe not for plain
> or LaTeX, but do you have the *slightest* idea on how much time
> it takes to create ConTeXt?

If it was really important, one could just dump the complete
executable memory image as-is, without any special code inside of TeX
proper. Emacs is "dumped" in a similar manner. Sure, a TeX-dumped
format is somewhat more compact, but I'd not be willing to bet my life
on it that a simple-minded memory image might not load faster, as it
does not need to rearrange the memory before starting operation.

Giuseppe Bilotta

unread,
Jun 6, 2004, 7:02:29 PM6/6/04
to
David Kastrup wrote:

> Giuseppe Bilotta <bilo...@hotpop.com> writes:
> > Uh, David, sorry to contradict you but the ability to preload
> > formats is a *relevant* speed optimization. Maybe not for plain
> > or LaTeX, but do you have the *slightest* idea on how much time
> > it takes to create ConTeXt?
>
> If it was really important, one could just dump the complete
> executable memory image as-is, without any special code inside of TeX
> proper. Emacs is "dumped" in a similar manner. Sure, a TeX-dumped
> format is somewhat more compact, but I'd not be willing to bet my life
> on it that a simple-minded memory image might not load faster, as it
> does not need to rearrange the memory before starting operation.

I think the format dumping/loading scheme achieves a good
enough balance between speed and size (ConTeXt is 5Mb).

Giuseppe Bilotta

unread,
Jun 6, 2004, 7:03:41 PM6/6/04
to
Frank Mittelbach wrote:
> the main argument is that formats are just memory dumps done for speed and
> are not relevant to the discussion, and that statement holds.

Ok, no discussion there.

> also assuming there would be no format that can be precompiled i'm sure
> context would compile differently and have a setup routine that you could
> run separately to produce a sort of "ascii dump"

Uhm?

> also do you have the slightest idea how long it took TeX to compile a single
> page of plain tex math when i started using it? (about a minute)
>
> so (sorry to contradict you :-) even context bootstrapping is not really
> that much of a time problem nowadays.

Well, that's the kind of argument that makes Omega 1.23 look
like a slim and fast program.

Timothy Murphy

unread,
Jun 6, 2004, 8:09:15 PM6/6/04
to
David Kastrup wrote:

>> I just use LaTeX and pdfLaTeX, like 99.9% of people.
>
> LaTeX is a macro package, and so can run with any of the TeX variants,
> and there is no such thing as "pdfLaTeX".
> There is a program titled
> "TeX" and one called "PDFTeX", and there are executable links called
> "latex" and "pdflatex".

That's just pedantry.
You might as well say there is no such thing as LaTeX.

> With a current TeXlive, the executables run by "latex" and "pdflatex"
> are eTeX and PDFeTeX, respectively, and the LaTeX format is used by
> both.
>
> It is quite likely that in future releases, PDFeTeX (with
> differing config files, however) will be run by _both_ commands.
>
>> All these other programs are for pointy-heads,
>> and people who want to write maths in Sanskrit,
>> or play chess in Hebrew.
>
> In short: you certainly don't have a clue about what "99.9%" happen to
> be running, and most likely not even what you yourself are running.

I know I am running eTeX;
what I am saying is that I do not personally need or use
any of the features added to TeX,
and I don't believe the vast majority of LaTeX users do either.

Incidentally, I consider the appropriation of the terms tex and latex
for etex and elatex verges on deception.
If eTeX is so superior to TeX, why not let it run under its own name?
I suggest the reason is because if there were two programs
called latex and elatex,
the vast majority of people would run latex,
since it satisfies all their needs.

The proliferation of variants of TeX could have been a disaster.
Fortunately none of them has attracted significant usage
(except when masquerading as TeX)
with the exception of pdfTeX,
which involves only a modest change in the TeX engine.

David Fuchs

unread,
Jun 7, 2004, 1:18:26 AM6/7/04
to
The purpose of TeX ".fmt" files is not to replace, but to significantly
enhance the creation of pre-loaded macro packages in "undumped" executables.
Quoting from the source code, Volume B of Computers and Typesetting (see
especially the *'ed portion):

We have noted that there are two versions of \TeX82. One, called
\.{INITEX}, has to be run first; it initializes everything from
scratch, without reading a format file, and it has the capability of
dumping a format file. The other one is called `\.{VIRTEX}'; it is a
``virgin'' program that needs to input a format file in order to get
started. \.{VIRTEX} typically has more memory capacity than
\.{INITEX}, because it does not need the space consumed by the
auxiliary hyphenation tables and the numerous calls on |primitive|,
etc.

The \.{VIRTEX} program cannot read a format file instantaneously, of
course; the best implementations therefore allow for production
versions of \TeX\ that not only avoid the loading routine for \PASCAL\
object code, they also have a format file pre-loaded. This is
impossible to do if we stick to standard \PASCAL; but there is a
simple way to fool many systems into avoiding the initialization, as
follows:\quad(1)~We declare a global integer variable called
|ready_already|. The probability is negligible that this variable
holds any particular value like 314159 when \.{VIRTEX} is first
loaded.\quad(2)~After we have read in a format file and initialized
everything, we set |ready_already:=314159|.\quad(3)~Soon \.{VIRTEX}
will print `\.*', waiting for more input; and at this point we
* interrupt the program and save its core image in some form that the
* operating system can reload speedily.\quad(4)~When that core image is
activated, the program starts again at the beginning; but now
|ready_already=314159| and all the other global variables have
their initial values too. The former chastity has vanished!

In other words, if we allow ourselves to test the condition
|ready_already=314159|, before |ready_already| has been
assigned a value, we can avoid the lengthy initialization. Dirty tricks
rarely pay off so handsomely.

The tex and latex commands available to users in the earliest years were not
shortcuts for "virtex &plain" and "virtex &latex", where the .fmt file would
be re-read during each execution; rather, we always had fully pre-loaded
"tex.exe" and "latex.exe" executables that didn't have to re-read .fmt files
and started up instantly (since all the OS had to do was to mmap the
executable into your address space and jump to it, and it would just get
paged right in; the data segment was copy-on-write, and programs always got
the same logical memory address, so no extra copying of data or
address-fixups in the code was needed).

Historically speaking, the original development of TeX was done on machines
that had the ability to save an interrupted program built into the operating
system, so you didn't even have to use a special user-land program to
accomplish the task. The whole business of Initex vs. Virtex vs. Tex
(defined as Virtex with plain.fmt preloaded into the executable) was to
squeeze every last byte out of the program, to leave as much room as
possible for the mem array and friends, since we were running on machines
that could only address 2^18 36-bit words (that's about a megabyte, and note
it's address space, not physical memory). On the machine Knuth developed
TeX on (a DEC10 running a highly customized cousin of TOPS10), the OS split
the address space into two halves, for code vs. data, so only half a
megabyte for data was available to TeX. Adding a single byte to any of the
common data structures (especially a charnode or token) would have meant
that TeX wouldn't be able to compose the TeXbook. The machine that most
early TeX users ran on (DEC20's running TOPS20) didn't have the
half-and-half restriction, and the users considered themselves lucky to have
a few hundred kbytes extra for the data segment. But I digress...

-David


Robin Fairbairns

unread,
Jun 7, 2004, 3:01:26 AM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

>David Kastrup wrote:
>> It is quite likely that in future releases, PDFeTeX (with
>> differing config files, however) will be run by _both_ commands.
>
>I know I am running eTeX;
>what I am saying is that I do not personally need or use
>any of the features added to TeX,
>and I don't believe the vast majority of LaTeX users do either.

except if they want any but trivial colour support in pdftex, of
course.

>Incidentally, I consider the appropriation of the terms tex and latex
>for etex and elatex verges on deception.
>If eTeX is so superior to TeX, why not let it run under its own name?
>I suggest the reason is because if there were two programs
>called latex and elatex,
>the vast majority of people would run latex,
>since it satisfies all their needs.

ha ha ha. very droll.

the amalgamation of the "executables" was a suggestion from the latex
team, to allow developments (like heiko's pdftex colour stack). the
suggestion was welcomed with great glee by the distribution managers,
who saw a route to simplifying their tortuous packages. (the current
discussion on merging everything into pdfetex comes from them, not the
latex team.)

>The proliferation of variants of TeX could have been a disaster.
>Fortunately none of them has attracted significant usage
>(except when masquerading as TeX)
>with the exception of pdfTeX,
>which involves only a modest change in the TeX engine.

apart, of course, from its marvellous stuff on margin kerning. it
must be sad to feel so embittered about people who're trying to
improve your working environment, that you feel the need to
characterise them as "pointy-heads"; like hoi polloi's despising of
those stupid scientists who will insist on making things that hoi
polloi then buy in great excitement...

Giuseppe Bilotta

unread,
Jun 7, 2004, 3:11:51 AM6/7/04
to
David Fuchs wrote:
> Historically speaking, the original development of TeX was done on machines
> that had the ability to save an interrupted program built into the operating
> system, so you didn't even have to use a special user-land program to
> accomplish the task. The whole business of Initex vs. Virtex vs. Tex
> (defined as Virtex with plain.fmt preloaded into the executable) was to
> squeeze every last byte out of the program, to leave as much room as
> possible for the mem array and friends, since we were running on machines
> that could only address 2^18 36-bit words (that's about a megabyte, and note
> it's address space, not physical memory). On the machine Knuth developed
> TeX on (a DEC10 running a highly customized cousin of TOPS10), the OS split
> the address space into two halves, for code vs. data, so only half a
> megabyte for data was available to TeX. Adding a single byte to any of the
> common data structures (especially a charnode or token) would have meant
> that TeX wouldn't be able to compose the TeXbook. The machine that most
> early TeX users ran on (DEC20's running TOPS20) didn't have the
> half-and-half restriction, and the users considered themselves lucky to have
> a few hundred kbytes extra for the data segment. But I digress...

Wow. Interesting reading. Thanks.

David Kastrup

unread,
Jun 7, 2004, 4:23:18 AM6/7/04
to
Giuseppe Bilotta <bilo...@hotpop.com> writes:

> David Kastrup wrote:
> > Giuseppe Bilotta <bilo...@hotpop.com> writes:
> > > Uh, David, sorry to contradict you but the ability to preload
> > > formats is a *relevant* speed optimization. Maybe not for plain
> > > or LaTeX, but do you have the *slightest* idea on how much time
> > > it takes to create ConTeXt?
> >
> > If it was really important, one could just dump the complete
> > executable memory image as-is, without any special code inside of TeX
> > proper. Emacs is "dumped" in a similar manner. Sure, a TeX-dumped
> > format is somewhat more compact, but I'd not be willing to bet my life
> > on it that a simple-minded memory image might not load faster, as it
> > does not need to rearrange the memory before starting operation.
>
> I think the format dumping/loading scheme achieves a good
> enough balance between speed and size (ConTeXt is 5Mb).

Do you still remember what the discussion is about? Hint: it was not
about speed at all.

David Kastrup

unread,
Jun 7, 2004, 4:33:23 AM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

> David Kastrup wrote:
>
> >> I just use LaTeX and pdfLaTeX, like 99.9% of people.
> >
> > LaTeX is a macro package, and so can run with any of the TeX
> > variants, and there is no such thing as "pdfLaTeX". There is a
> > program titled "TeX" and one called "PDFTeX", and there are
> > executable links called "latex" and "pdflatex".
>
> That's just pedantry. You might as well say there is no such thing
> as LaTeX.

Except that it would be wrong. LaTeX is the macro package. The point
of "pedantry" is not to make claims even wronger than the original.

> > With a current TeXlive, the executables run by "latex" and
> > "pdflatex" are eTeX and PDFeTeX, respectively, and the LaTeX
> > format is used by both.
> >
> > It is quite likely that in future releases, PDFeTeX (with
> > differing config files, however) will be run by _both_ commands.
> >
> >> All these other programs are for pointy-heads, and people who
> >> want to write maths in Sanskrit, or play chess in Hebrew.
> >
> > In short: you certainly don't have a clue about what "99.9%"
> > happen to be running, and most likely not even what you yourself
> > are running.
>
> I know I am running eTeX; what I am saying is that I do not
> personally need or use any of the features added to TeX, and I don't
> believe the vast majority of LaTeX users do either.
>
> Incidentally, I consider the appropriation of the terms tex and
> latex for etex and elatex verges on deception.

So you actually have no clue what you are running. The `tex' command
still runs TeX if it is available. The `latex' command runs what the
LaTeX team has considered fit to be running under the name of
`latex', and the `tex' command runs what Knuth has considered fit to
be running under the name of `tex'.

> If eTeX is so superior to TeX, why not let it run under its own
> name?

It does.

> I suggest the reason is because if there were two programs called
> latex and elatex, the vast majority of people would run latex, since
> it satisfies all their needs.

It doesn't. How often do you get complaints about "no room for a new
dimension"? I am creating macro packages that could not be done
without eTeX. How convenient is it for users to go reconfiguring
every TeX shell they might be using?

For what gain? To be running a crippled version of LaTeX that
supports fewer packages?
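
For those who have never hit it: the complaint comes from classic TeX's
256 registers per class, which \newdimen eventually exhausts; eTeX widens
the register space to 32768. A two-line sketch of the difference:

  \dimen300=12pt           % fine under eTeX; "! Bad register code" under TeX
  \message{\the\dimen300}  % prints 12.0pt
  \bye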

> The proliferation of variants of TeX could have been a disaster.
> Fortunately none of them has attracted significant usage (except
> when masquerading as TeX) with the exception of pdfTeX, which
> involves only a modest change in the TeX engine.

"a" modest change? Like character protruding and font scaling to get
better line filling?

Do you actually know what you are talking about?

Timothy Murphy

unread,
Jun 7, 2004, 7:46:22 AM6/7/04
to
David Kastrup wrote:

>> That's just pedantry. You might as well say there is no such thing
>> as LaTeX.
>
> Except that it would be wrong. LaTeX is the macro package. The point
> of "pedantry" is not to make claims even wronger than the original.

I didn't make any claim.
I wrote "pdfLaTeX" and you said I should have written "pdflatex".
That is pure pedantry.
You probably should be writing the Court column for the London Times.



>> > With a current TeXlive, the executables run by "latex" and
>> > "pdflatex" are eTeX and PDFeTeX, respectively, and the LaTeX
>> > format is used by both.

>> If eTeX is so superior to TeX, why not let it run under its own
>> name?
>
> It does.

?

>> I suggest the reason is because if there were two programs called
>> latex and elatex, the vast majority of people would run latex, since
>> it satisfies all their needs.
>
> It doesn't. How often do you get complaints about "no room for a new
> dimension"?

Never

> I am creating macro packages that could not be done
> without eTeX.

Tell me one command in one package that requires eTeX,
and is actually used by more than 3 people in the universe.

> How convenient is it for users to go reconfiguring
> every TeX shell they might be using?

I don't know what this means; but I know I never do it.

> "a" modest change? Like character protruding and font scaling to get
> better line filling?

In my view this latter feature of pdftex/PDFTeX/pdfTeX/PDFtex/PdFTeX,
although ingenious, is quite unnecessary.
I doubt if one person in 10,000 would notice
whether or not this was used in printing a document.



> Do you actually know what you are talking about?

Probably

Johannes Mueller

unread,
Jun 7, 2004, 8:33:25 AM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> wrote:
> David Kastrup wrote:
[...]
>> "a" modest change? Like character protruding and font scaling to get
>> better line filling?
>
> In my view this latter feature of pdftex/PDFTeX/pdfTeX/PDFtex/PdFTeX,
> although ingenious, is quite unnecessary.
> I doubt if one person in 10,000 would notice
> whether or not this was used in printing a document.

Typographical quality of that kind is something that is only noticed
consciously by one person in 10,000. But nevertheless it improves
readability also for the 9,999 others.

joh

David Kastrup

unread,
Jun 7, 2004, 8:43:12 AM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

> David Kastrup wrote:
>
> >> That's just pedantry. You might as well say there is no such thing
> >> as LaTeX.
> >
> > Except that it would be wrong. LaTeX is the macro package. The point
> > of "pedantry" is not to make claims even wronger than the original.
>
> I didn't make any claim.
> I wrote "pdfLaTeX" and you said I should have written "pdflatex".
> That is pure pedantry.

No, it isn't. Because your whole point was that it would be
fraudulent to confuse PDFLaTeX and PDFeLaTeX in the manner that it
would be fraudulent to confuse TeX and eTeX. But when we are talking
about TeX and eTeX, we are talking about the programs themselves, not
about the names of some symbolic links.

There is no such thing as "eLaTeX" or "PDFeLaTeX", and so your claims
of deceit are nonsensical. What kind of engine LaTeX runs on is up
to the implementor and/or the LaTeX team. And since the LaTeX team
has recommended that `latex' and `pdflatex' are to run the eTeX
resp. PDFeTeX executable, there is absolutely no deceit involved
here.

But to make clear why your arguments are nonsensical, one has to
recognize that your complaint is about a nonexistent entity.

> >> If eTeX is so superior to TeX, why not let it run under its own
> >> name?
> >
> > It does.
>
> ?

`tex' runs TeX with the plain TeX format, `etex' runs eTeX with the
plain TeX format, and `latex' runs eTeX with the LaTeX format.

That is the setup in TL2003, and the recommended one currently.

For all that I care, all of pdftex, pdflatex, latex could be running
PDFeTeX (with `latex' having a default \pdfoutput=0).
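
That switch is a single pdfTeX primitive; a one-line sketch of such a
default:

  \pdfoutput=0   % 0 (or negative) selects DVI output, positive selects PDF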

Only `tex' is supposed and more or less guaranteed to run just TeX
with plain TeX preloaded.

> >> I suggest the reason is because if there were two programs called
> >> latex and elatex, the vast majority of people would run latex,
> >> since it satisfies all their needs.
> >
> > It doesn't. How often do you get complaints about "no room for a
> > new dimension"?
>
> Never

Well, then you don't read quite often here. It is actually an FAQ.

> > I am creating macro packages that could not be done without eTeX.
>
> Tell me one command in one package that requires eTeX, and is
> actually used by more than 3 people in the universe.

\ifcsname in suffix.dtx. Without it, the package's approach would be
unusable since it would cause an explosion in hash size requirements.

The color mark mechanisms in pdfcolmk.sty. Oodles of stuff in
bigfoot.sty (under development) which is intended to supplant most
footnote-related packages.

Things like lineno.sty silently let spacing between headings and text
disappear, simply because they don't use the \savingvdiscards
mechanism of eTeX.
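
To see why \ifcsname matters for the hash size: the classical existence
test defines the queried name as a side effect, while the eTeX primitive
does not. A minimal plain-TeX sketch:

  % classical idiom: \csname foo\endcsname *creates* \foo (as \relax) if it
  % does not exist yet, permanently occupying a hash-table entry:
  %   \expandafter\ifx\csname foo\endcsname\relax ...\fi
  % eTeX idiom: same test, no side effect on the hash table:
  \ifcsname foo\endcsname
    \message{foo is defined}%
  \else
    \message{foo is undefined, and still absent from the hash table}%
  \fi
  \bye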

> > How convenient is it for users to go reconfiguring every TeX shell
> > they might be using?
>
> I don't know what this means; but I know I never do it.

Which is why it is a good thing that `latex' will refer to the
correct executable necessary to run all relevant packages properly.

> > "a" modest change? Like character protruding and font scaling to
> > get better line filling?
>
> In my view this later feature of pdftex/PDFTeX/pdfTeX/PDFtex/PdFTeX,
> although ingenious, is quite unnecessary.

And the former?

> I doubt if one person in 10,000 would notice whether or not this was
> used in printing a document.

That's the point. You don't notice the presence of good typography.
Only its absence.

> > Do you actually know what you are talking about?
>
> Probably

With regard to your complaints about what command was executing what
executable with what format, this probability is not apparent to the
observer.

Timothy Murphy

unread,
Jun 7, 2004, 8:44:37 AM6/7/04
to
Robin Fairbairns wrote:

>>The proliferation of variants of TeX could have been a disaster.
>>Fortunately none of them has attracted significant usage
>>(except when masquerading as TeX)
>>with the exception of pdfTeX,
>>which involves only a modest change in the TeX engine.
>
> apart, of course, from its marvellous stuff on margin kerning. it
> must be sad to feel so embittered about people who're trying to
> improve your working environment, that you feel the need to
> characterise them as "pointy-heads"; like hoi polloi's despising of
> those stupid scientists who will insist on making things that hoi
> polloi then buy in great excitement...

Surely not a slanderous term ...

===============================
pointy-head

SYLLABICATION:
point·y-head

NOUN:
Slang An intellectual.

OTHER FORMS:
pointy-headed -ADJECTIVE
===============================

Someone said in this thread that Knuth is not God.
Of course he is not; but he is the Pope of TeX
(and probably more infallible than the other man).

To my mind, TeX is like Chartres cathedral.
If it were being built today different materials would be used.
That does not mean it would be improved
by re-casting some of the damaged figures in plastic.

Knuth based TeX on a certain view of printing,
derived from a study of the craft of hot-metal printers.
None of the "improvers" of TeX seem to share this vision.

You, Robin, attacked tex.web and Knuth's idea of Literate Programming.
I believe that tex.web has a quality that has never been equalled.
(The nearest, in my judgement, would be Minix
and perhaps the original Unix.)

In brief, while I would increase array sizes wherever possible,
the original TeX still serves its original purpose,
namely to print mathematics,
as perfectly as we can expect in this imperfect world,
and the various attempts to re-write TeX are fundamentally mistaken.

Morten Høgholm

unread,
Jun 7, 2004, 8:46:52 AM6/7/04
to
On Mon, 07 Jun 2004 12:46:22 +0100, Timothy Murphy
<t...@birdsnest.maths.tcd.ie> wrote:

> David Kastrup wrote:
>
>> It doesn't. How often do you get complaints about "no room for a new
>> dimension"?
>
> Never

You are lucky then.

>> I am creating macro packages that could not be done
>> without eTeX.
>
> Tell me one command in one package that requires eTeX,
> and is actually used by more than 3 people in the universe.

suffix.sty would be one, bigfoot.sty another. Both written by David
actually. In my own package empheq.sty, people who use eTeX as compiler
will get a) a speed improvement and b) robustness.

Typesetting Hebrew is of course quite difficult with TeX as the compiler.

>> "a" modest change? Like character protruding and font scaling to get
>> better line filling?
>
> In my view this latter feature of pdftex/PDFTeX/pdfTeX/PDFtex/PdFTeX,
> although ingenious, is quite unnecessary.
> I doubt if one person in 10,000 would notice
> whether or not this was used in printing a document.

In documents written in German, Danish or any other language with
unlimited numbers of compound words, you quite often see two or more
consecutive lines ending with hyphens, which in effect makes these lines
look too short, although technically they're not. The features of pdfTeX
for providing better line fillings are quite useful to me as a Dane. The
point is of course that the readers shouldn't notice it, because the text
appears uniform.
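
For anyone who wants to try it, a minimal pdfTeX sketch of the
margin-kerning half (primitive names as in pdfTeX 1.x; the values, in
thousandths of the character's width, are just plausible starting
points, not recommendations):

  \pdfprotrudechars=2    % enable protrusion and let line breaking use it
  \rpcode\font`\- = 700  % line-final hyphens hang into the right margin
  \rpcode\font`\. = 500  % full stops protrude a little less
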
--
Morten Høgholm
I haven't got a smelly address.
UK-TUG FAQ: <URL:http://www.tex.ac.uk/cgi-bin/texfaq2html>

Didier Verna

unread,
Jun 7, 2004, 8:50:43 AM6/7/04
to
David Kastrup <d...@gnu.org> wrote:

> Right. Currently "feasible" forks to work on (and responsible
> persons) are:
>
> eTeX Peter Breitenlohner
> PDFTeX Han The Thanh
> Omega2 John Plaice
> Aleph Giuseppe Bilotta
> ExTeX Michael Niedermair

I'm far from being a typesetting nitpicker, but one thing I suffer from
is the TeX API. Sometimes, I dream that I have the features of TeX but that
the interface is something sensible: no macros, decent function call, argument
passing and evaluation semantics. No \if, real scripting capabilities etc.
Actually, I dream that TeX is available through a totally (La)TeX-incompatible
Lisp engine.

Has any thought / work already been produced along these lines?


Kastrup Disclaimer: Of course, I have no clue what I'm talking about :-)

--
Didier Verna, did...@lrde.epita.fr, http://www.lrde.epita.fr/~didier

EPITA / LRDE, 14-16 rue Voltaire Tel.+33 (1) 44 08 01 85
94276 Le Kremlin-Bicêtre, France Fax.+33 (1) 53 14 59 22 did...@xemacs.org

Torsten Bronger

unread,
Jun 7, 2004, 8:54:30 AM6/7/04
to
Halloechen!

Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

> [...] I believe that tex.web has a quality that has never been
> equalled. [...]

Extensibility and maintainability are important qualities of
programs. As far as I can see, TeX failed in these respects.

> In brief, while I would increase array sizes wherever possible,
> the original TeX still serves its original purpose, namely to
> print mathematics, as perfectly as we can expect in this imperfect
> world, and the various attempts to re-write TeX are fundamentally
> mistaken.

Well, if you limit the expectations/design goals enough, a certain
program will always seem to be good.

Not that I want to say that TeX is a bad program -- but one or two
mistakes have been made in my opinion.

Tschoe,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus

Maarten Sneep

unread,
Jun 7, 2004, 9:02:14 AM6/7/04
to
In article <muxfz97a5...@uzeb.lrde.epita.fr>,
Didier Verna <did...@lrde.epita.fr> wrote:

> I'm far from being a typesetting nitpicker, but one thing I suffer from
> is the TeX API. Sometimes, I dream that I have the features of TeX but that
> the interface is something sensible: no macros, decent function call, argument
> passing and evaluation semantics. No \if, real scripting capabilities etc.
> Actually, I dream that TeX is available through a totally (La)TeX-incompatible
> Lisp engine.

Not Lisp, but it might be interesting nonetheless:
http://www.pytex.org/

Maarten

Jonathan Fine

unread,
Jun 7, 2004, 9:09:17 AM6/7/04
to
In response to a previous thread
> Any chance of asking Knuth to stop the TeX feature-freeze?
"Robin Fairbairns" <r...@cl.cam.ac.uk> wrote in message
news:c9t52g$jjb$2...@pegasus.csx.cam.ac.uk...
<snip>
> for myself, i merely think knuth is a wise man. he knows as well as
> the rest of us that current tex is fundamentally un-extendable.

Robin, would you provide some sources for this being Knuth's view?

Jonathan
--
Jonathan Fine
The Open University, Milton Keynes, England


David Kastrup

unread,
Jun 7, 2004, 9:14:25 AM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

> Someone said in this thread that Knuth is not God.
> Of course he is not; but he is the Pope of TeX
> (and probably more infallible than the other man).

It is a popular diversion for religious fanatics to ignore or kill
their purported leader in order to make nonsensical claims about him.

If you want to glorify Knuth, then don't forget to heed his words:

As stated on the copyright pages of Volumes B, D, and
E, anybody can make use of my programs in whatever
way they wish, as long as they do not use the names
TEX, METAFONT, or Computer Modern. In particular,
any person or group who wants to produce a program
superior to mine is free to do so. However, nobody is
allowed to call a system TEX or METAFONT unless that
system conforms 100% to my own programs, as I have
specified in the manuals for the TRIP and TRAP tests.

[...]

Of course I do not claim to have found the best solution to every
problem. I simply claim that it is a great advantage to have a fixed
point as a building block. Improved macro packages can be added on
the input side; improved device drivers can be added on the output
side. I welcome continued research that will lead to alternative
systems that can typeset documents better than TEX is able to do.
But the authors of such systems must think of another name.

You see that Knuth _very_ clearly encourages that people will work on
beyond TeX.

> To my mind, TeX is like Chartres cathedral. If it were being built
> today different materials would be used. That does not mean it
> would be improved by re-casting some of the damaged figures in
> plastic.

But it does not mean that no buildings must ever be erected after
that.

> Knuth based TeX on a certain view of printing, derived from a study
> of the craft of hot-metal printers. None of the "improvers" of TeX
> seem to share this vision.

So what? Knuth is a self-admitted amateur that has wisely restricted
himself to acquiring just the knowledge necessary to make "The Art of
Computer Programming" typeset in a manner that he would consider
appropriate and a piece of art in itself. That makes him a true and
valiant artist, but not a pope. And part of his life's artwork is
that which he chose to call "TeX". The Art of Computer Programming
is cast into ink by employing quite a few of its algorithms (such as
tries, hash tables and so on) in the program called "TeX".

But the typesetting needs of today don't stop at "The Art of Computer
Programming". Believe it or not: there are more books to be typeset.

Knuth has set a monument for typesetting, not a tombstone.

> You, Robin, attacked tex.web and Knuth's idea of Literate
> Programming. I believe that tex.web has a quality that has never
> been equalled.

Nonsense. tex.web is a heap of crap when compared to metafont.web.
The code quality, and in particular the quality and conciseness of
the implemented language, is quite different.

Knuth himself no longer programs in Pascal WEB, but works with
CWEB. And so on.

You don't need to take others' words for it: just look at the acts
and words of Knuth himself.

> In brief, while I would increase array sizes wherever possible, the
> original TeX still serves its original purpose, namely to print
> mathematics, as perfectly as we can expect in this imperfect world,
> and the various attempts to re-write TeX are fundamentally mistaken.

If you are of the opinion that the world's typesetting needs must not
go beyond the mathematics in "The Art of Computer Programming".

Since you are a mathematician, one could call it a selfish stance.
However, since you have nothing at all to gain by keeping others from
having their typesetting needs accomplished, it is merely foolish.

And not even mathematicians suffer any detrimental effects from being
able to produce web-publishable documents (which need to go beyond
Knuth's cm fonts) with good quality.

Nobody is taking plain TeX and Knuth's program "TeX" from you, so
your bickerings are not even self-serving.

Achim Blumensath

unread,
Jun 7, 2004, 9:52:01 AM6/7/04
to
Didier Verna wrote:
> I'm far from being a typesetting nitpicker, but one thing I suffer from
> is the TeX API. Sometimes, I dream that I have the features of TeX but
> that the interface is something sensible: no macros, decent function
> call, argument passing and evaluation semantics. No \if, real
> scripting capabilities etc. Actually, I dream that TeX is available
> through a totally (La)TeX-incompatible Lisp engine.
>
> Has any thought / work already been produced along these lines?

You might want to take a look at ant (available on my home page). It is
written in OCaml. The current development version includes a
Haskell-like scripting language (actually a mixture between Haskell and
MetaFont).

I have to add though that it is not 100% TeX compatible and still
incomplete.

Achim
--
________________________________________________________________________
| \_____/ |
Achim Blumensath \O/ \___/\ |
LaBRI / Bordeaux =o= \ /\ \|
www-mgi.informatik.rwth-aachen.de/~blume /"\ o----|
____________________________________________________________________\___|

David Kastrup

unread,
Jun 7, 2004, 9:23:53 AM6/7/04
to
Torsten Bronger <bro...@physik.rwth-aachen.de> writes:

> Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:
>
> > [...] I believe that tex.web has a quality that has never been
> > equalled. [...]
>
> Extensibility and maintainability are important qualities of
> programs. As far as I can see, TeX failed in these respects.

It was not designed for that (with the minor exception of whatsits,
perhaps). It was not written in a language designed for that, which
is not really very much Knuth's fault since at the time it was
designed, there were few good language choices. And TeX was never
supposed to be "maintained" beyond bug fixes.

In short, I don't see that Knuth has registered any "failures" here:
he has been quite clear about his aims and what he intends TeX to be
and do. You can't blame Knuth for the misconceptions of cultists that
fanatically choose to ignore what he himself has to say about the aims
and scope of TeX.

> Well, if you limit the expectations/design goals enough, a certain
> program will always seem to be good.
>
> Not that I want to say that TeX is a bad program -- but one or two
> mistakes have been made in my opinion.

TeX is an excellent program, but the world's typesetting needs don't
stop at "The Art of Computer Programming".

--

David Kastrup

unread,
Jun 7, 2004, 9:29:11 AM6/7/04
to
Didier Verna <did...@lrde.epita.fr> writes:

> David Kastrup <d...@gnu.org> wrote:
>
> > Right. Currently "feasible" forks to work on (and responsible
> > persons) are:
> >
> > eTeX Peter Breitenlohner
> > PDFTeX Han The Thanh
> > Omega2 John Plaice
> > Aleph Guiseppe Bilotta
> > ExTeX Michael Niedermair
>
> I'm far from being a typesetting nitpicker, but one thing I
> suffer from is the TeX API. Sometimes, I dream that I have the
> features of TeX but that the interface is something sensible: no
> macros,

That's not TeX but something else. Embedded into plain TeX is the
assumption that any non-trivial document will have to extend plain
TeX's markup with macros. So you need to ask this question about a
more complete markup system than plain TeX, such as ConTeXt or LaTeX.

> decent function call, argument passing and evaluation semantics. No
> \if, real scripting capabilities etc. Actually, I dream that TeX is
> available through a totally (La)TeX-incompatible Lisp engine.
>
> Has any thought / work already been produced along these lines?

ANT <URL:http://www-mgi.informatik.rwth-aachen.de/~blume/Download.html>

> Kastrup Disclaimer: Of course, I have no clue what I'm talking about
> :-)

Naturally.

William F. Adams

unread,
Jun 7, 2004, 9:56:50 AM6/7/04
to
> David Kastrup wrote:
[...]
>>> "a" modest change? Like character protruding and font scaling to get
>>> better line filling?

Timothy Murphy <t...@birdsnest.maths.tcd.ie> wrote:
(in reply)


>> In my view this latter feature of pdftex/PDFTeX/pdfTeX/PDFtex/PdFTeX,
>> although ingenious, is quite unnecessary.
>> I doubt if one person in 10,000 would notice
>> whether or not this was used in printing a document.

> Typographical quality of that kind is something that is only noticed
> consciously by one person in 10,000. But nevertheless it improves
> readability also for the 9,999 others.

It also has the concrete benefit of allowing one to add punctuation at the end
or beginning of a line with almost _no_ chance of re-flow.

William

--
William Adams
http://members.aol.com/willadams
Sphinx of black quartz, judge my vow.

Paul Repacholi

unread,
Jun 7, 2004, 11:22:27 AM6/7/04
to
r...@cl.cam.ac.uk (Robin Fairbairns) writes:

There is a mostly up to date TeX for VMS in the Freeware Disks collection.

I have one that:

A. Buckets the tedious file layout

B. Puts all the executables into SYS$SYSTEM and sets it up so it will
`just work' modulo picking the subset of gigabytes of fonts you
want/need.

The only setup needed was to define a logical for the TFMs for TeX,
then to define MODES et al. for your printer.

A better XDVI was about the only thing that struck me as needed, I think.

It now would need an up to date LaTeX and other packages.

--
Paul Repacholi 1 Crescent Rd.,
+61 (08) 9257-1001 Kalamunda.
West Australia 6076
comp.os.vms,- The Older, Grumpier Slashdot
Raw, Cooked or Well-done, it's all half baked.
EPIC, The Architecture of the future, always has been, always will be.

Robin Fairbairns

unread,
Jun 7, 2004, 12:27:24 PM6/7/04
to
Timothy Murphy <t...@birdsnest.maths.tcd.ie> writes:

>Robin Fairbairns wrote:
>You, Robin, attacked tex.web and Knuth's idea of Literate Programming.
>I believe that tex.web has a quality that has never been equalled.
>(The nearest, in my judgement, would be Minix
>and perhaps the original Unix.)

weird.

first, i didn't attack tex.web, i merely said that tex.web is not a
good starting point for developing anything. i base my view on
observations of projects that have tried it, and their experiences.

second, the suggestion that early unix code is a model of programming
is ... astounding. around those sorts of time, i wrote single-machine
code of astounding quality and beauty. none of it for pdp-7s, i
admit, but equally written for dead-end processors and definitely not
portable without a complete rewrite. the difference is, that my
operating systems from back then didn't attract people in the same way
(after all, mine were first systems, whereas the un*x guys were on
their second at least), and my utilities' beauty was like the beauty
of some of david k's code -- astounding ducking and weaving to deal
with a tricky environment.

>In brief, while I would increase array sizes wherever possible,

boggle. you wouldn't care to permit dynamic arrays? which knuth
omitted because of the lack of reliable dynamic memory libraries for
the grotty pascal compiler he was using...

Phillip Helbig---remove CLOTHES to reply

unread,
Jun 7, 2004, 12:31:48 PM6/7/04
to
In article <ca04a3$vf$3...@pegasus.csx.cam.ac.uk>, r...@cl.cam.ac.uk (Robin
Fairbairns) writes:

> then ask him. as a maintainer of the archives, and a long-time happy
> user of vms, i spent some time trying to find information about vms
> distributions, but have long since given up. (i did once get a new
> vms distribution onto the archives, but i note that we have nothing
> but odd subsystems with vms tags on them, now.)
>
> a quick google leads me to 9-10 year old vms tex information. the
> last tex i ever used on vms was compiled by me, from pascal source
> direct out of weave -- some time like 1990 (istr the change file was
> from brian hamilton kelly, among others).

His [TEXMF] stuff was on one of the VMS freeware CDs, which are now
online: http://h71000.www7.hp.com/openvms/freeware/ .

Michele Dondi

unread,
Jun 7, 2004, 3:43:02 PM6/7/04
to
On Mon, 07 Jun 2004 14:50:43 +0200, Didier Verna
<did...@lrde.epita.fr> wrote:

>> eTeX Peter Breitenlohner
>> PDFTeX Han The Thanh
>> Omega2 John Plaice
>> Aleph Giuseppe Bilotta
>> ExTeX Michael Niedermair

[snip]


>Actually, I dream that TeX is available through a totally (La)TeX-incompatible
>Lisp engine.
>
> Has any thought / work already been produced along these lines?

Not exactly what you asked for, but IIRC Giuseppe Bilotta was working
on a project called TeXlib, aimed at producing a system
lib for TeX&C., thus allowing one to separate the macro expansion language,
the typesetting engine, etc. I think the project is dead now, but
Giuseppe often contributes here so he may shed some light on this...

Also, again it's not what you asked for, but there's PerlTeX: I've
only tried it and not actually used it in "production code", but I
think it's definitely *great*!!


HTH,
Michele
--
#!/usr/bin/perl -lp
BEGIN{*ARGV=do{open $_,q,<,,\$/;$_}}s z^z seek DATA,11,$[;($,
=ucfirst<DATA>)=~s x .*x q^~ZEX69l^^q,^2$;][@,xe.$, zex,s e1e
q 1~BEER XX1^q~4761rA67thb ~eex ,s aba m,P..,,substr$&,$.,age
__END__

Patrick TJ McPhee

unread,
Jun 7, 2004, 7:29:54 PM6/7/04
to
In article <muxfz97a5...@uzeb.lrde.epita.fr>,
Didier Verna <did...@lrde.epita.fr> wrote:

% Actually, I dream that TeX is available through a totally (La)TeX-incompatible
% Lisp engine.

I think NTS started off as a re-implementation of TeX in Lisp, using CLOS. I
don't know if the intent was to expose that at the user level.

In any case, have you looked at lout? The author's claim is that it has
a much better scripting interface than TeX. I never got into it enough
to form an opinion. It used to be mirrored on CTAN.
--

Patrick TJ McPhee
East York Canada
pt...@interlog.com

Jonathan Fine

unread,
Jun 8, 2004, 6:45:09 AM6/8/04
to
"Jonathan Fine" <J.F...@open.ac.uk> wrote in message
news:ca1pc6$pg3$1...@yarrow.open.ac.uk...

> In response to a previous thread
> > Any chance of asking Knuth to stop the TeX feature-freeze?
> "Robin Fairbairns" <r...@cl.cam.ac.uk> wrote in message
> news:c9t52g$jjb$2...@pegasus.csx.cam.ac.uk...
> <snip>
> > for myself, i merely think knuth is a wise man. he knows as well as
> > the rest of us that current tex is fundamentally un-extendable.
>
> Robin, would you provide some sources for this being Knuth's view?
>

I've found a source for Don's views on this matter.

It is the Question and Answer session with Don Knuth at the
1995 TUG meeting.

The transcription of the session has been published:
Digital Typography, Questions and Answers I
TUGboat 17(1)
http://www.tug.org/TUGboat/Articles/tb17-1/tb50knut.pdf

Here's the relevant passage (DT pp597-8, TB pp20-21):
==
Fred Bartlett: I heard you say you expected more people to extend
TeX than have done so.

DEK: Yeah, absolutely. I expected extensions whenever someone
had a special-purpose important project, like the Encyclopedia
Brittanica or making an Arabic-Chinese dictionary, or whatever
--- a large project. I never expeced a single tool to be able
to handle everybody's exotic projects. So I built a lot of hooks
in the code so that it should be fairly easy for a computer
science graduate to set up a new program for special occasions
in a week or so. That was my thought. But I don't think people
have done that very much.
==

Don's further comments in response to this question are also
well worth reading.

It seems that Robin was present at that session. The published
transcript shows him asking a question about fonts.

John Culleton

unread,
Jun 8, 2004, 12:31:06 PM6/8/04
to
David Kastrup <d...@gnu.org> wrote in message news:<x5u0xn2...@lola.goethe.zz>...
> Giuseppe Bilotta <bilo...@hotpop.com> writes:
>


>
> Do you still remember what the discussion is about? Hint: it was not
> about speed at all.

The original request was to "unfreeze" the TeX code but presumably
still call it TeX, and my point was that the ability to create macros,
formats or even variations of the base code under another name allows
us to have our cake and eat it too. The only restriction is that at a
certain level one loses the right to call the result "TeX."

Changing the base TeX code permanently and thus replacing what Knuth
hath wrought loses the considerable advantage of an essentially bug
free base.
Since TeX is itself a programming language of sorts there is plenty of
scope for experimenting and improving. And as has been pointed out one
can take the base code, modify it, call it something else, and try to
convince the world to pay attention to it. The authors of pdftex have
done just that---successfully.

One point we can (almost) all agree to: "unfreezing" of the TeX code
in the sense of cutting loose from the Knuth base in a permanent way
makes no sense at all.

John Culleton

David Fuchs

unread,
Jun 8, 2004, 1:09:47 PM6/8/04
to
It's also instructive to look in the comments in TeX's source code; the
semipenultimate section of Volume B is all about how to extend TeX:

@* \[53] Extensions.
The program above includes a bunch of ``hooks'' that allow further
capabilities to be added without upsetting \TeX's basic structure.
Most of these hooks are concerned with ``whatsit'' nodes, which are
intended to be used for special purposes; whenever a new extension to
\TeX\ involves a new kind of whatsit node, a corresponding change needs
to be made to the routines below that deal with such nodes,
but it will usually be unnecessary to make many changes to the
other parts of this program.

In order to demonstrate how extensions can be made, we shall treat
`\.{\\write}', `\.{\\openout}', `\.{\\closeout}', `\.{\\immediate}',
`\.{\\special}', and `\.{\\setlanguage}' as if they were extensions.
These commands are actually primitives of \TeX, and they should
appear in all implementations of the system; but let's try to imagine
that they aren't. Then the program below illustrates how a person
could add them.

Sometimes, of course, an extension will require changes to \TeX\
itself; no system of hooks could be complete enough for all
conceivable extensions. The features associated with `\.{\\write}'
are almost all confined to the following paragraphs, but there are
small parts of the |print_ln| and |print_char| procedures that were
introduced specifically to \.{\\write} characters. Furthermore one of
the token lists recognized by the scanner is a |write_text|; and
there are a few other miscellaneous places where we have already
provided for some aspect of \.{\\write}. The goal of a \TeX\ extender
should be to minimize alterations to the standard parts of the
program, and to avoid them completely if possible. He or she should
also be quite sure that there's no easy way to accomplish the desired
goals with the standard features that \TeX\ already has. ``Think
thrice before extending,'' because that may save a lot of work, and
it will also keep incompatible extensions of \TeX\ from
proliferating.

And, earlier:

A |whatsit_node| is a wild card reserved for extensions to \TeX. The
|subtype| field in its first word says what `\\{whatsit}' it is, and
implicitly determines the node size (which must be 2 or more) and the
format of the remaining words. When a |whatsit_node| is encountered
in a list, special actions are invoked; knowledgeable people who are
careful not to mess up the rest of \TeX\ are able to make \TeX\ do new
things by adding code at the end of the program. For example, there
might be a `\TeX nicolor' extension to specify different colors of ink,
and the whatsit node might contain the desired parameters.

The present implementation of \TeX\ treats the features associated
with `\.{\\write}' and `\.{\\special}' as if they were extensions, in
order to illustrate how such routines might be coded. We shall defer
further discussion of extensions until the end of this program.
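
At the macro level this whatsit machinery is what \special exposes; the
colour idea floated above is essentially what the dvips "color" specials
later provided. A plain-TeX sketch (the color push/pop keywords are
dvips conventions interpreted by the driver, not TeX primitives):

  This is \special{color push rgb 1 0 0}red\special{color pop} text.
  \bye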

There's even more internal support to help extension-writers. For instance,
see comments like:

If \TeX\ is extended improperly, the |mem| array might get screwed up.
For example, some pointers might be wrong, or some ``dead'' nodes might
not have been freed when the last reference to them disappeared. Procedures
|check_mem| and |search_mem| are available to help diagnose such
problems. These procedures make use of two arrays called |free| and
|was_free| that are present only if \TeX's debugging routines have
been included. (You may want to decrease the size of |mem| while you
are debugging.)

Further help for TeX-extenders is found under the index entries for "data
structure assumptions". Volume B also talks about how to extend TeX to
support languages with large character sets; see the index entries for
"oriental characters" (though I have my doubts about the advice given).
Also note that Knuth even helped work on the original TeX-XeT extensions
(for mixing left-to-right with right-to-left text). It's written about in
early issues of TUGboat.

Perhaps the final word is in the very first section of Volume B:

No doubt there still is plenty of room for improvement, but the author
is firmly committed to keeping \TeX82 ``frozen'' from now on; stability
and reliability are to be its main virtues.

On the other hand, the \.{WEB} description can be extended without
changing the core of \TeX82 itself, and the program has been designed
so that such extensions are not extremely difficult to make.

The |banner| string defined here should be changed whenever \TeX\
undergoes any modifications, so that it will be clear which version of
\TeX\ might be the guilty party when a problem arises.

If this program is changed, the resulting system should not be called
`\TeX'; the official name `\TeX' by itself is reserved
for software systems that are fully compatible with each other.
A special test suite called the ``\.{TRIP} test'' is available for
helping to determine whether a particular implementation deserves to be
known as `\TeX' [cf.~Stanford Computer Science report CS1027,
November 1984].

@d banner=='This is TeX, Version 3.141592' {printed when \TeX\ starts}


Of course, this is not to say that the architecture of TeX, if written today
rather than decades ago, wouldn't be rather different, and fundamentally
much more easily extendable. But in 1981, nobody had a language with any
sort of dynamic binding that was implemented on DEC10/20 as well as IBM
360/370 architectures (neither of which even had a C compiler generally
available). The very first BSD Unix, for the very first VAX machines, had
only recently started to become widely used, and that was the first Unix
system with more than 64K data per process you could easily get. Plus, BSD
claimed to have a working Pascal compiler. So, at the time, Pascal was
chosen because it was more generally available than C. (There were other
issues, too, such as whether there was any portable way in C to reasonably
handle mixed 16- and 32-bit variables in arithmetic expressions; consider
"short x,y; long z; z=x+y;", which may or may not do what you think it ought
to.)
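
To make that concrete, here is a toy C fragment (invented values, not from
any real translator) showing the trap: on a compiler whose int is 16 bits,
the un-cast sum wraps before it is widened, while an explicit cast forces
the addition into 32 bits. On today's 32-bit-int compilers both lines
print 60000.

#include <stdio.h>

int main(void)
{
    short x = 30000, y = 30000;
    long z = x + y;              /* 16-bit int: sum wraps (typically -5536)
                                    before the widening store into z      */
    long w = (long)x + (long)y;  /* the cast a translator has to insert   */
    printf("z=%ld w=%ld\n", z, w);
    return 0;
}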

Even if there were C compilers available on non-Unix machines at the time,
their "malloc()" schemes added (at least) 4 bytes per object (and that
pretty much remains true today). And the "new" in Pascal had the same
problem. Since TeX was pushing the address-space bounds of 1Mbyte machines,
Knuth had to do all his own "zero overhead" memory management. The memory
management scheme used in TeX is also tuned to perform well on demand-paging
systems; see "virtual memory" in the index of Volume B.
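
To give a flavor of what "zero overhead" means here, the following is a
made-up miniature in C of the free-list-inside-a-big-array idea (the names
echo tex.web's get_avail and free_avail, but the code itself is invented
for illustration and omits the overflow checks the real thing has): free
nodes are threaded through the array itself, so no extra bytes per object
are spent on bookkeeping.

#include <stdio.h>

#define MEM_MAX  1000
#define NULL_PTR 0             /* index 0 plays the role of null          */

typedef struct { int link; } memory_word;

static memory_word mem[MEM_MAX];
static int avail = NULL_PTR;   /* head of the free list of one-word nodes */
static int mem_end = 1;        /* first never-yet-used location           */

/* Allocate one node; cf. tex.web's get_avail. */
static int get_avail(void)
{
    int p = avail;
    if (p != NULL_PTR)
        avail = mem[p].link;   /* pop a previously freed node             */
    else
        p = mem_end++;         /* grow into untouched territory           */
    mem[p].link = NULL_PTR;
    return p;
}

/* Return a node to the free list; cf. tex.web's free_avail macro. */
static void free_avail(int p)
{
    mem[p].link = avail;
    avail = p;
}

int main(void)
{
    int a = get_avail(), b = get_avail();
    free_avail(a);
    int c = get_avail();       /* reuses a's slot: no per-node overhead   */
    printf("a=%d b=%d c=%d\n", a, b, c);
    return 0;
}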

Those were the considerations at the time that led to the choice of Pascal
and a not-so-easy-to-extend type system. The goal was good portability and
reasonable speed, even at the expense of extensibility. I can't think of
any other system, commercial or free, that worked on all of DEC10/20, VAX
VMS, VAX Unix, and IBM 360/370, all producing the same output from the same
input (and I haven't even mentioned the EBCDIC character set issues, nor
the lrecl/recfm junk from OS360). If it were done today, it would be
different, but given the trade-offs, TeX had the best extensibility we could
afford at the time.

-David


Frank Mittelbach

unread,
Jun 8, 2004, 2:33:29 PM6/8/04
to
David Fuchs wrote:

> It's also instructive to look in the comments in TeX's source code; the
> semipenultimate section of Volume B is all about how to extend TeX:
>
> @* \[53] Extensions.
> The program above includes a bunch of ``hooks'' that allow further
> capabilities to be added without upsetting \TeX's basic structure.

> [...]

David, I don't think this is really the issue (to me, at least, it is not).
TeX is extensible on that level; it is even extensible on different levels
(like adding low-level hanging punctuation, or integrating hz-algorithm
ideas, or adding additional primitives, or ...),

whether via those hooks (which I personally thought were too much of an
afterthought, though) or via digging into the program and actually doing
more direct manipulations.

All that is possible; it has been done by people (me included, for fun and
during the discussions with Don for TeX3), and it has been done successfully.

But TeX is fairly unchangeable if you really want to get at the heart of some
internal algorithms that are spread, for speed purposes and other
considerations, across the whole of the program code. And by getting at
them I mean: replacing them or changing them fundamentally with something
else. That is more or less impossible, even if you know the program
backwards like, say, somebody like Peter Breitenlohner does.


> the lrecl/recfm junk from OS360). If it were done today, it would be
> different, but given the trade-offs, TeX had the best extensibility we
> could afford at the time.

I fully agree with you there (and one of the proofs is that TeX is still
with us, as well as being available with extensions) --- even if some
people claim this is only because it is unchangeable.

But this is also why, in principle, I agree that for more radical research
and experimentation, e.g., into questions of optimizing page layout (yes, I
know Michael Plass's thesis, but that is only scratching the surface), it
would be nice to have a program that has better-separated modules.

The problem with that is that separating modules means defining
communication streams and interfaces, and that (in my opinion) is where the
current attempts (i.e., all projects) are paying too little or no attention.

The worst example was NTS, which I consider the biggest failure of all ---
just reimplementing the interwoven structure of the TeX program in a
different language and hoping that from that a different level of
extensibility would magically appear was naive --- especially as nobody ever
started asking questions like: how do paragraph and page formatting
communicate, how should or could they communicate, is this unidirectional
or bidirectional, ...

On the other hand, all of the longer-established projects have proven that
it is possible to extend into any new area starting from TeX's monolithic
core. The result may not be the best software design (e.g., the mixtures of
web2c + C or C++ or ...), but as far as research into the really open
questions of automated typography is concerned, I honestly don't give a
damn, because in those areas you first have to find any solution at all
before you can enhance it into something like a production environment.

frank


David Fuchs

unread,
Jun 8, 2004, 3:10:33 PM6/8/04
to
Frank,

Of course, I completely agree. The only thing I'd add is that the very
monolithic-ness of the source code is a result of the goal of space and
speed efficiency. Almost every part of the code knows too much about many
other parts of the code, but that's only to save data space and (don't
forget!) code space. There are all sorts of places where "interesting" hacks
are used, just to save a few bytes of code. The general idea was "it all
adds up", and we might not hit the tipping point if it's just a little too
slow or a little too big ("too big" being even worse than "too slow",
because you can always wait longer, but you can't increase your address
space). Much of what looks quaint or downright confusing in the code is
actually the result of practicing the lost art of counting cycles and
bytes for every line of code (see Knuth's and my(!) paper in Software
Practice and Experience). You can even see many design decisions that
implicitly take into account the relative speeds of the CPU vs. RAM of the
day, and this ratio has changed dramatically in the ensuing years. Even the
nature of the TeX language itself is a result of space and speed tradeoffs
in the language design---will 64K words of memory be enough to hold a
decent macro package as well as a page's worth of text?

So, fast, small, and portable won out over modular and extensible. Knuth
wanted a real production tool that would be of real use on the computer
configurations of the time, and not just stuck in a few experimental
computer labs. And in that he succeeded. That said, there's nothing wrong
with wanting something more modern! There's no way to sweep under the rug
the fact that TeX is hard to extend.

Then again, it's also fair to say that Knuth's brain seems to work in a
monolithic sort of way...

-David


Frank Mittelbach

unread,
Jun 8, 2004, 4:48:19 PM6/8/04
to
David Fuchs wrote:

> So, fast, small, and portable won out over modular and extensible. Knuth
> wanted a real production tool that would be of real use on the computer
> configurations of the time, and not just stuck in a few experimental
> computer labs. And in that he succeeded.

indeed

> That said, there's nothing wrong with wanting something more modern!
> There's no way to sweep under the rug the fact that TeX is hard to
> extend.

agreed.

But what I was saying is that something more modern is not achieved by
simply reimplementing a monolithic typesetting engine in a modern (or
thought-to-be-modern) language.

The issues are the interactions, and the long, long thought and research
about interfaces that are flexible enough to allow other
algorithms/ideas/whatever to be incorporated.

TeX as a whole has a lot of ideas and concepts on that, and those internal
interfaces have then been scrambled into spaghetti code, as Don once put it,
to account for the speed and space improvements you describe. However, my
claim here is that even if one undoes the spaghetti part, not much is really
gained if one doesn't start earlier and question the conceptual ideas in
TeX. Not because they are not good (most of them are), but because they are
based on an underlying model that was restricted to what was feasibly doable
with computers back then. To give an example: why start thinking about
interaction between paragraph breaking and galley breaking, or about several
parallel input streams, when it was clear that the design target was to get
"one page worth of data in the limited RAM space"? Consequently, none of
the internal concepts provide for any such interaction, and this will not
change if somebody produces an unscrambled TeX (sorry, I should say
unscrambledTeX before the "TeX is a fixed name" guards awake :-)


> Then again, it's also fair to say that Knuth's brain seems to work in a
> monolithic sort of way...

Yes, and there is nothing wrong with that, is there? (I wish I could.)

frank

David Kastrup

unread,
Jun 8, 2004, 5:05:09 PM6/8/04
to
jo...@wexfordpress.com (John Culleton) writes:

> One point we can (almost) all agree to: "unfreezing" of the TeX code
> in the sense of cutting loose from the Knuth base in a permanent way
> makes no sense at all.

Not? Strange then that we have so many projects bent on doing just
that. Omega2, NTS, ExTeX, ANT...

Robin Fairbairns

unread,
Jun 8, 2004, 6:38:23 PM6/8/04
to
"David Fuchs" <dfu...@comcast.net> writes:
>[...] Much of what looks quaint or downright confusing in the code is
>actually the result of practicing the lost art of counting cycles and
>bytes for every line of code (see Knuth's and my(!) paper in Software
>Practice and Experience). You can even see many design decisions that
>implicitly take into account the relative speeds of the CPU vs. RAM of
>the day, and this ratio has changed dramatically in the ensuing years.
>Even the nature of the TeX language itself is a result of space and speed
>tradeoffs in the language design---will 64K words of memory be enough to
>hold a decent macro package as well as a page's worth of text?

There are probably several people here who recognise this syndrome
(though in my case it was 4k words typically, including 512 words of
program: Steve Bourne's editor ran to an amazing 512 words of program
plus 64 words of data). As I've already remarked in this thread, many
such programs just died when the hardware constraints that enforced
them died.

>So, fast, small, and portable won out over modular and extensible. Knuth
>wanted a real production tool that would be of real use on the computer
>configurations of the time, and not just stuck in a few experimental
>computer labs. And in that he succeeded. That said, there's nothing wrong
>with wanting something more modern!

Yet the very legacy of a curiously restrictive older program, one that
still has few equals in typesetting, makes the process of developing the
"more modern" so much more tricky. Witness the time taken in producing
all the developments to date...

>There's no way to sweep under the rug the
>fact that TeX is hard to extend.
>
>Then again, it's also fair to say that Knuth's brain seems to work in a
>monolithic sort of way...

So do you suppose that I was wrong, in the suggestion that so offended
Jonathan, to think that Knuth, wise man that he is, doesn't believe
"us" capable of extending TeX? Does he believe, despite the evidence
of Breitenlohner's reported reluctance to go any further, that further
extension of the basic engine is just a matter of a bit of applied
hard work?

It's nice to have the views of someone who knows Knuth fairly well (to
put it mildly) -- I've only met him a couple of times, to talk to, and
the first of those was on his tour to promote TAOCP Vol. 3 ... i.e.,
pre-TeX.

Giuseppe Bilotta

unread,
Jun 8, 2004, 6:51:31 PM6/8/04
to
Timothy Murphy wrote:
> You, Robin, attacked tex.web and Knuth's idea of Literate Programming.
> I believe that tex.web has a quality that has never been equalled.
> (The nearest, in my judgement, would be Minix
> and perhaps the original Unix.)

As the Aleph maintainer, i.e. someone who has to deal with the
.web/.ch concept to actually work on it, there's one thing I
can say: I think it's the most heinous yet impressively
powerful thing someone could design.

OK, actually the only thing I have a complaint about is that no
debugger will (yet) allow me to follow the .web sources while
debugging. And debugging by trial and error, with heaps of WAGs
about what may cause what, on code written by someone else, is,
uhm, well, what can I say?

OTOH, had it not been for .web+.ch, I would probably never
have been able to roll Aleph.
--
Giuseppe "Oblomov" Bilotta

Can't you see
It all makes perfect sense
Expressed in dollar and cents
Pounds shillings and pence
(Roger Waters)

Giuseppe Bilotta

unread,
Jun 8, 2004, 6:53:22 PM6/8/04
to
David Kastrup wrote:
> Knuth himself does no longer program in Pascal web, but works with
> Cweb. And so on.

BTW, are there tools to convert WEB to CWEB? After all, Pascal
can be converted quite straightforwardly into C ...

Giuseppe Bilotta

unread,
Jun 8, 2004, 6:56:38 PM6/8/04
to
Michele Dondi wrote:
> On Mon, 07 Jun 2004 14:50:43 +0200, Didier Verna
> <did...@lrde.epita.fr> wrote:
>
> >> eTeX Peter Breitenlohner
> >> PDFTeX Han The Thanh
> >> Omega2 John Plaice
> >> Aleph Guiseppe Bilotta
> >> ExTeX Michael Niedermair
> [snip]
> >Actually, I dream that TeX is available through a totally (La)TeX-incompatible
> >Lisp engine.
> >
> > Has any thought / work been already produced along these lines ?
>
> Not exactly what you asked for but IIRC Giuseppe Bilotta was working
> on a project consistently called TeXlib aimed at producing a system
> lib for TeX&C. thus allowing to separate the macro expansion language,
> the typesetting engine, etc. I think the project is dead now, but
> Giuseppe often contributes here so he may shed some light on this...

I pretty much prefer to use the term "sleeping" when referring
to TeXlib. I really do prefer to work on Aleph and my PhD
thesis. At least one of these two things does give me some
satisfaction! (Hint: it's the one that is mentioned twice in
this post.)

Timothy Murphy

unread,
Jun 8, 2004, 7:26:45 PM6/8/04
to
Giuseppe Bilotta wrote:

> BTW, are there tools to convert WEB to CWEB? After all, Pascal
> can be converted quite straightforwardly into C ...

Is it really straightforward?
I've always been completely unsuccessful in using Pascal-to-C translators.
(There used to be a number of programs which were meant to produce PS fonts
from MF source, but if anyone ever succeeded in getting them working,
I never saw the results.
As I recall, they were based on obsolete versions of MF,
so one had to translate directly from Pascal to C.)

I think web-to-C is much simpler,
firstly because you can make changes in the .ch file,
and secondly because Knuth was extremely disciplined,
so that e.g. functions and procedures were always exited
with "goto exit".
Also, in the case of tex and mf at least,
there are a large number of auxiliary C files.

Incidentally, why did Pascal suddenly become unfashionable?
One year everyone was using it,
and the next year you couldn't give away Wirth & Jensen.
I think it happened at about the same time as string vests became naff;
maybe there was a connection?

--
Timothy Murphy
e-mail (<80k only): tim /at/ birdsnest.maths.tcd.ie
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland

Giuseppe Bilotta

unread,
Jun 8, 2004, 7:32:29 PM6/8/04
to
Timothy Murphy wrote:
> Giuseppe Bilotta wrote:
>
> > BTW, are there tools to convert WEB to CWEB? After all, Pascal
> > can be converted quite straightforwardly into C ...
>
> Is it really straightforward?

From what I know of the two languages, it seems so. I'm not
doubting it would need some hand tuning, but it doesn't seem to
me that Pascal has such idiosyncratic constructs that they
couldn't be converted ...

Scott Pakin

unread,
Jun 8, 2004, 9:16:56 PM6/8/04
to
Timothy Murphy wrote:
> (There used to be a number of programs which were meant to produce PS fonts
> from MF source, but if anyone ever succeeded in getting them working
> I never saw the results.
> As I recall, they were based on obsolete versions of MF,
> so one had to translate directly from Pascal to C.)

There are some postprocessors that don't require modifications to the
Metafont program (e.g., mf2pt1).

-- Scott

David Fuchs

unread,
Jun 9, 2004, 2:59:02 AM6/9/04
to
My guess is that Knuth believes that it is "just a matter of a bit of
applied hard work" to extend TeX. But realize that he's also the guy who
wrote out the entire source code for TeX82 on a yellow legal pad, then typed
it all in, got all the syntax errors out, and then started in with the
debugger, and made it work. (And that without even reformatting the tangled
Pascal code to have each statement on its own line, not to mention the
un-human expressions that the macros for members of the main data structures
turn into; how anyone can debug in that mode is beyond me.)

I know that I don't write 25,000-line programs in one fell swoop, and I
haven't met many other people who do. So perhaps even if I'm right about
what he believes about extensibility, it still might not mean much in
practical terms (i.e., he may believe wrongly). I'm reminded of one of the
more telling comments in the source code. For those of you following along
in your copy of Volume B, it's at the beginning of Alignment (section 37):

It's sort of a miracle whenever \.{\\halign} and \.{\\valign} work,
because they cut across so many of the control structures of \TeX.

The reality is that it's a miracle that TeX works. Seriously, having worked
for a decade on a different piece of software (FrameMaker), in a more
conventional multiple-developer setting, I was surprised to discover that
you could tell who authored which piece of code just by looking at it. And
it wasn't because of indentation style, or how variables were named.
Different programmers thought differently, and this was mirrored in their code.
One side-effect of this observation is that everyone was better at
bug-fixing as well as feature-enhancing their own code than that of others.
This did not seem to be a result of the language or environment we were
using. So, I don't think it's unusual that Knuth-code is harder for others
to modify than it is for him. The unfortunate thing about it is that he's
so many sigmas out that almost nobody can keep as big a big-picture in their
heads all at once as he can, so it's extra-hard. (Of course, the fortunate
thing is that Knuth could; it's not clear that writing TeX in a more mundane
way would have led to a small-enough, fast-enough, soon-enough system to
catch on the way TeX did.)

Since I've got the floor, let me repeat a story that Knuth has told in
public a number of times. Somewhere in the 1980's, the funding to support
the mainframe computer that Knuth had been using for years (and developed
all of TeX on) stopped. This was the DEC10 named "Sail" (for "Stanford
Artificial Intellegence Lab"; more than just the AI people got to use it).
It had wonderful features for the time, partly as a result of running a
home-grown operating system (called Waits), with home-grown graphics
terminals driven by home-grown hardware. When the funding stopped, the
staff of programmers who developed the custom OS, and its custom shell,
editor, etc., mostly went away, except for one person, and his job was to do
strictly maintenance, with no new development. Of course, we were all
upset, and feared that the system would wither away. Well, the punch line
was that the system became much more pleasant to use, as reliability went up
and up. Fewer and fewer crashes from the shell and editor and OS, since
they were only having bugs removed, and none added. The system stayed up
for months at a time, which had been previously unprecedented.

Evidently, this made a big impression on Knuth, and it is what he claims is
the basis for his desire to have "stability" at the top of the feature list
for TeX. (The Sail machine lasted a number of years more, until the
hardware started to fail. At one point, the insulation on the wires in the
wire-wrapped backplane began to break down, and what kept the machine going
were strategically located toothpicks holding the wires off the pins they
took a corner around; how Don Coates managed to figure out which wires
were shorting out where, I simply cannot fathom. Eventually, the machine
got a brain-transplant with a new DEC20 CPU, and the instrumented version of
TeX quickly found a bug in the microcode; oddly, virtually the same bug that
my work on FrameMaker found in the Sun's 68020-based "model 60" some years
later. But I digress...)

-David


David Fuchs

unread,
Jun 9, 2004, 3:26:24 AM6/9/04
to
While Knuth chose to implement TeX on top of Pascal, he knew he didn't want
to tie it in too closely, as it wasn't clear what new language(s) would
eventually win out. So, TeX was intentionally written in a dumbed-down
version of standard Pascal in an attempt to allow for the future. Quoting
from the Introduction section of Volume B (my favorite):

Indeed, a conscious effort has been made here to avoid using several
idiosyncratic features of standard \PASCAL\ itself, so that most of the
code can be translated mechanically into other high-level languages. For
example, the `\&{with}' and `\\{new}' features are not used, nor are
pointer types, set types, or enumerated scalar types; there are no
`\&{var}' parameters, except in the case of files; there are no tag fields
on variant records; there are no assignments |real:=integer|; no
procedures are declared local to other procedures.

Having written a special-purpose Pascal->C translator back in the day, I can
report that one big problem was that if your C compiler thinks "int" is 16
bits, and you have to say "long" to get 32 bits, there are places in TeX
that say "integerVar:=sixteenBitVar+sixteenBitVar" which is defined in
Pascal to work correctly, and pretty much defined in C to work incorrectly.
So, you have to track expression types up and down your parse tree to tell
when you have to cast shorts that appear in the middle of expressions to be
(long). Of course, now all C compilers have 32-bit ints, so this is no
longer a problem. But the related problem of what
"longVar:=shortSignedVar+shortUnsignedVar" means in C is still with us, if I
remember correctly. I think that the standard Pascal->C stuff out there in
the free TeX distributions handled these problems by fixing them by hand
when they cropped up. Oh, there was also the issue that Pascal functions
essentially create a bogus variable that you assign to when you want to
return a value, and this doesn't really match how the return statement in C
works (though in TeX the "goto exit" usually isn't much after the bogus
assignment; but there is the problem that a few functions in TeX actually
have a little bit more executable code after the "exit:" label, so you can't
just turn the "goto exit" into a "return" statement).

-David


"Giuseppe Bilotta" <bilo...@hotpop.com> wrote in message
news:MPG.1b306afc3...@news.individual.net...

Achim Blumensath

unread,
Jun 9, 2004, 6:09:16 AM6/9/04
to
Frank Mittelbach wrote:
> But this is also why, in principle, I agree that for more radical
> research and experimentation, e.g., into questions of optimizing page
> layout (yes, I know Michael Plass's thesis, but that is only scratching
> the surface), it would be nice to have a program that has
> better-separated modules.
>
> The problem with that is that separating modules means defining
> communication streams and interfaces, and that (in my opinion) is where
> the current attempts (i.e., all projects) are paying too little or no
> attention.

Do you include ant in "all projects"? I admit it is poorly documented
and one has to read the source (and my mind) to see what is going on,
but what you describe above is actually the main goal of my project.
Of course, whether I will manage to finish it and how many years it will
take remains to be seen.

Jonathan Fine

unread,
Jun 9, 2004, 8:30:59 AM6/9/04
to
Was Re: Does DEK think TeX is fundamentally un-extendable?

"Frank Mittelbach" <frank.mi...@latex-project.org> wrote in message
news:ca58no$161$1...@online.de...


>
> TeX as a whole has a lot of ideas and concepts on that, and those internal
> interfaces have then been scrambled into spaghetti code, as Don once put
> it, to account for the speed and space improvements you describe. However,
> my claim here is that even if one undoes the spaghetti part, not much is
> really gained if one doesn't start earlier and question the conceptual
> ideas in TeX. Not because they are not good (most of them are), but
> because they are based on an underlying model that was restricted to what
> was feasibly doable with computers back then. To give an example: why
> start thinking about interaction between paragraph breaking and galley
> breaking, or about several parallel input streams, when it was clear that
> the design target was to get "one page worth of data in the limited RAM
> space"? Consequently, none of the internal concepts provide for any such
> interaction, and this will not change if somebody produces an unscrambled
> TeX (sorry, I should say unscrambledTeX before the "TeX is a fixed name"
> guards awake :-)
>

['if' should be 'until'? - jfine]

==

Don was asked a question on precisely this topic at the 1995 TUG meeting.
(Limited memory, not 'scrambled into spaghetti'.)

See Digital Typography p594-6 (long Q + A).

Here are extracts:
Cameron Smith. [...] computer memories being what they were, it wasn't
practical to similarly [to line-breaking] accumulate several pages of
material and look for optimal page breaks. And sort of related to that,
there's the difficulty of communications between a line-breaking
algorithm and a page-breaking algorithm. [...]

Don: [...] [understanding] about what kinds of communications would be
useful and so on are becoming clearer. [...]
With respect to the memory situation ... I think the page-breaking
business is still ... it's not so much memory bound as maybe ---
you still want to do two passes --- [...]

==

Frank's view seems to be
1) Interaction between page-breaking and line-breaking
is required.
2) TeX does not provide such interaction.
3) Nothing can be done without extending TeX.
4) Such extensions require unscrambling the spaghetti.

Frank, have I understood you correctly?

==

My view is:
1) Communication from line-breaking to page-breaking
is sufficient.
2) TeX can be used for the line-breaking.
3) An external program can be used for the page-breaking.
4) Sadly, TeX lacks a \totaldemerits primitive.

My view was expressed in the paper I presented to TUG 2000.
Line breaking and page breaking - TUGboat 21 (pp210-221)
http://www.pytex.org/doc/tug2000.pdf
http://www.tug.org/TUGboat/Articles/tb21-3/tb68fine.pdf

==

Frank was also at that meeting, but we did not get to discuss
this at the time.

How about discussing it now?

==

David Kastrup

unread,
Jun 9, 2004, 9:07:39 AM6/9/04
to
"Jonathan Fine" <J.F...@open.ac.uk> writes:

> Frank's view seems to be
> 1) Interaction between page-breaking and line-breaking
> is required.
> 2) TeX does not provide such interaction.
> 3) Nothing can be done without extending TeX.
> 4) Such extensions require unscrambling the spaghetti.
>
> Frank, have I understood you correctly?
>
> ==
>
> My view is:
> 1) Communication from line-breaking to page-breaking
> is sufficient.

Even within the narrow confines of TeX's original goal (good
typesetting of The Art of Computer Programming) this is wrong. For
example, the total demerits and penalties could be lowered
considerably, and several bad situations avoided altogether, if the
influence of \widowpenalty and \clubpenalty and \brokenpenalty could
already be considered when doing the paragraph breaking.

> 2) TeX can be used for the line-breaking. 3) An external program
> can be used for the page-breaking.

Even if this were feasible (see my above explanation of why it is
not), this would hardly count as not being an extension to TeX.

> 4) Sadly, TeX lacks a \totaldemerits primitive.

It would be easy enough to implement such a thing. If you are into
the "I-want-the-binary-unchanged" craziness, you could even gather
all the demerit information from \tracingparagraphs output in the log
file or console.
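
For instance, a deliberately naive C sketch of that log-harvesting idea
(the default file name and the parsing details are assumptions; real
\tracingparagraphs output has more variations, including wrapped lines,
than this handles): it scans for the feasible-break lines of the form
"@@1: line 1.2 t=256 -> @@0" and reports the t= values.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv)
{
    FILE *log = fopen(argc > 1 ? argv[1] : "texput.log", "r");
    if (!log) { perror("log"); return 1; }

    char line[512];
    while (fgets(line, sizeof line, log)) {
        if (strncmp(line, "@@", 2) != 0)  /* feasible breakpoints only */
            continue;
        char *t = strstr(line, " t=");
        if (t)
            printf("break %.*s: total demerits %ld\n",
                   (int)strcspn(line, ":"), line,
                   strtol(t + 3, NULL, 10));
    }
    fclose(log);
    return 0;
}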

But all of this is academic gameplay at best. If one really wants
to have global optimization, the data structures of TeX need to be
amended.

a) For coupling page break and line break decisions, you need
semibroken vertical boxes. Those have done all the necessary line
break decisions, have a minimal size (can't get smaller than that)
and maybe a maximum size. Splitting them will finalize the breaks
for the top part of the split.

b) For getting global page break optimization one has to be aware that
future page breaks depend on the output routine's work. So the output
routine must be allowed to run speculatively, and only the shipouts
and assignments and write statements and other side effects (like
recontributions to the vertical list) in the output routine runs that
are considered part of the optimal break sequence get retained.

Things like that simply require additions to data structures and
algorithms and control flow that are close to impossible to tack onto
the current code base, and which can't be solved by external programs,
since the effects of such decisions percolate immediately to the
availability and importance of further decisions. Arriving at a
global optimum without keeping online track of a list of temporary
optima (like the shortest-path traversal for paragraph breaking in
TeX already does) is close to impossible with even remotely tolerable
efficiency.
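
To illustrate what keeping such a running list of temporary optima looks
like, here is a toy dynamic-programming sketch in C (the cost function,
sizes, and names are all invented; real page breaking would have to carry
vastly more state per feasible break, as described above):

#include <stdio.h>
#include <limits.h>

#define N_LINES   12   /* lines contributed to the main vertical list */
#define PAGE_GOAL  4   /* ideal lines per page in this toy model      */

/* Demerits for ending a page after `len` lines (invented cost). */
static long page_cost(int len)
{
    long d = len - PAGE_GOAL;
    return d * d * 100;
}

int main(void)
{
    long best[N_LINES + 1];   /* least demerits for a break after line i */
    int  prev[N_LINES + 1];   /* previous break on that optimal path     */

    best[0] = 0;
    for (int i = 1; i <= N_LINES; i++) {
        best[i] = LONG_MAX;
        /* only pages up to twice the goal length are feasible here */
        for (int j = (i > 2 * PAGE_GOAL) ? i - 2 * PAGE_GOAL : 0; j < i; j++) {
            long total = best[j] + page_cost(i - j);
            if (total < best[i]) { best[i] = total; prev[i] = j; }
        }
    }

    /* Walk the prev[] chain back to recover the optimal break sequence. */
    int breaks[N_LINES], n = 0;
    for (int i = N_LINES; i > 0; i = prev[i])
        breaks[n++] = i;
    printf("total demerits %ld; break after lines:", best[N_LINES]);
    while (n--) printf(" %d", breaks[n]);
    printf("\n");
    return 0;
}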

Art Werschulz

unread,
Jun 9, 2004, 9:35:17 AM6/9/04
to
Hi.

"David Fuchs" <dfu...@comcast.net> writes:

> Since I've got the floor, let me repeat a story that Knuth has told in
> public a number of times. Somewhere in the 1980's, the funding to support
> the mainframe computer that Knuth had been using for years (and developed
> all of TeX on) stopped. This was the DEC10 named "Sail" (for "Stanford
> Artificial Intellegence Lab"; more than just the AI people got to use it).

ISTR that TeX78 was implemented in SAIL (the Stanford Artificial
Intelligence Language). Presumably, SAIL ran on Sail?

--
Art Werschulz (8-{)} "Metaphors be with you." -- bumper sticker
GCS/M (GAT): d? -p+ c++ l u+(-) e--- m* s n+ h f g+ w+ t++ r- y?
Internet: a...@cs.columbia.edu<a href="http://www.cs.columbia.edu/~agw/">WWW</a>
ATTnet: Columbia U. (212) 939-7060, Fordham U. (212) 636-6325

Giuseppe Bilotta

unread,
Jun 9, 2004, 9:35:14 AM6/9/04
to
David Fuchs wrote:
> Having written a special-purpose Pascal->C translator back in the day, I can
> report that one big problem was that if your C compiler thinks "int" is 16
> bits, and you have to say "long" to get 32 bits, there are places in TeX
> that say "integerVar:=sixteenBitVar+sixteenBitVar" which is defined in
> Pascal to work correctly, and pretty much defined in C to work incorrectly.
> So, you have to track expression types up and down your parse tree to tell
> when you have to cast shorts that appear in the middle of expressions to be
> (long). Of course, now all C compilers have 32-bit ints, so this is no
> longer a problem. But the related problem of what
> "longVar:=shortSignedVar+shortUnsignedVar" means in C is still with us, if I
> remember correctly.

Wasn't the purpose of qi(..) and qo(..) to handle these
cases?

Jonathan Fine

unread,
Jun 9, 2004, 10:00:55 AM6/9/04
to
"David Kastrup" <d...@gnu.org> wrote in message
news:x5wu2gy...@lola.goethe.zz...
> "Jonathan Fine" <J.F...@open.ac.uk> writes:
<snip>

> > My view is:
> > 1) Communication from line-breaking to page-breaking
> > is sufficient.
>
> Even within the narrow confines of TeX's original goal (good
> typesetting of the Art of Computer Programming) this is wrong. For
> example, the total demerits and penalties could be lowered
> considerably, and several bad situations avoided altogether, if the
> influence of \widowpenalty and \clubpenalty and \brokenpenalty could
> already been considered when doing the paragraph breaking.
>
<snip>

David, I don't yet understand what you are saying.

Could you give an example?


Jonathan


David Kastrup

unread,
Jun 9, 2004, 10:10:34 AM6/9/04
to
"Jonathan Fine" <J.F...@open.ac.uk> writes:

A \widowpenalty need not arise if you break the paragraph one line
shorter or longer. A \clubpenalty may not arise if you break the
preceding paragraph one line longer (which it could probably easily
afford in many cases). A \brokenpenalty need not arise if you don't
break at a hyphenation point but choose a slightly worse break point
with regard to paragraph optimization. It is pointless to score
\adjdemerits for lines which do not end up on the same page.
Similarly for \doublehyphendemerits.

I am working on typesetting books with quite long paragraphs, a fixed
line spacing and page layout, where widows in particular and also
clubs are frowned upon. Given the length of the paragraphs, if TeX
were to do combined pagebreak/linebreak optimization, most clubs and
widows could be avoided easily (by effectively using \looseness for
removing them). As it is, I can only avoid them on pages with
footnotes, and only by giving sufficiently stretchable interfootnote
spacing.

Giuseppe Bilotta

unread,
Jun 9, 2004, 10:20:13 AM6/9/04
to
David Kastrup wrote:
> I am working on typesetting books with quite long paragraphs, a fixed
> line spacing and page layout, where widows in particular and also
> clubs are frowned upon. Given the length of the paragraphs, if TeX
> were to do combined pagebreak/linebreak optimization, most clubs and
> widows could be avoided easily (by effectively using \looseness for
> removing them). As it is, I can only avoid them on pages with
> footnotes, and only by giving sufficiently stretchable interfootnote
> spacing.

Multipass processes?

William F. Adams

unread,
Jun 9, 2004, 10:43:44 AM6/9/04
to
David Kastrup wrote:
>> I am working on typesetting books with quite long paragraphs, a fixed
>> line spacing and page layout, where widows in particular and also
>> clubs are frowned upon. Given the length of the paragraphs, if TeX
>> were to do combined pagebreak/linebreak optimization, most clubs and
>> widows could be avoided easily (by effectively using \looseness for
>> removing them). As it is, I can only avoid them on pages with
>> footnotes, and only by giving sufficiently stretchable interfootnote
>> spacing.

and Giuseppe asked:
>Multipass processes?

Well, that's the traditional way to fix it.

Find a page w/ a widow or orphan line, look back through the pages, find a
paragraph which _might_ be amenable to gaining or losing a line, add a
\looseness of +1 or -1 before it, re-TeX, and see.

As I've noted in the past, I'd dearly love a mechanism whereby TeX could
evaluate a paragraph and note how many lines it could gain or lose and stick
that into the output (say as a .pdf annotation). Better still would be if LyX
(say) could grab that and add it as an ERT (Evil Red Text --- raw (La)TeX code)
comment above each paragraph.

It was funny: a while back I posted all that I'd learned of page makeup to
the TYPO-L list, asking, ``Is there nothing more? Is this all there is to
know / do?'' and didn't get a single reply or comment. Later correspondence
revealed that a number of people had printed out my post and used it as a
guideline in their composition, or when outsourcing....

William
(who is also going to post this to the LyX list, in hope some developer there
will find it interesting)

--
William Adams
http://members.aol.com/willadams
Sphinx of black quartz, judge my vow.
