
Unrefereed papers


lester l. helms

Sep 27, 2002, 1:36:44 PM
You will be in competition with "Advances in Mathematics" and many other
journals.

L.L. Helms

Paul R. Chernoff

Oct 1, 2002, 3:53:03 PM
In article <3D94972C...@math.uiuc.edu>, lester l. helms wrote:

This is a baffling remark. Unless things have changed drastically,
papers in "Advances in Mathematics" (the journal founded by Gian-
Carlo Rota, to avoid possible ambiguity) are certainly refereed.

(What I find annoying about "Advances" is its editorial policy
of omitting the date a paper was received.)


--
# Paul R. Chernoff            cher...@math.berkeley.edu               #
# Department of Mathematics # 3840                                    #
# University of California    "Against stupidity, the gods themselves #
# Berkeley, CA 94720-3840      struggle in vain." -- Schiller         #


Dan Christensen

Oct 2, 2002, 1:21:31 PM
cher...@math.berkeley.edu (Paul R. Chernoff) writes:

> This is a baffling remark. Unless things have changed drastically,
> papers in "Advances in Mathematics" (the journal founded by Gian-
> Carlo Rota, to avoid possible ambiguity) are certainly refereed.

I know of at least one paper accepted by Rota without being refereed,
and I've heard that this was not uncommon...

Dan

--
Dan Christensen
j...@uwo.ca


Paul R. Chernoff

Oct 5, 2002, 1:42:36 AM

I have now acquired more information, and it does appear that, at least
when the late Gian-Carlo Rota was Editor in Chief, many papers were
accepted (and presumably rejected) without proper refereeing -- quite
possibly including a paper of mine, which happens to refer to a very
interesting paper by R.R. Kallman and Rota himself.

(Ref: P.R. Chernoff, Adv in Math v 34 (1979), 137--144)

The topic is *not* combinatorics; it is operator inequalities. (Just
a bit of self-advertisement...)

Thinh Tran

Oct 9, 2002, 9:58:36 PM
> I have now acquired more information, and it does appear that, at least
> when the late Gian-Carlo Rota was Editor in Chief, many papers were
> accepted (and presumably rejected) without proper refereeing -- quite
> possibly including a paper of mine, which happens to refer to a very
> interesting paper by R.R. Kallman and Rota himself.

> (Ref: P.R. Chernoff, Adv in Math v 34 (1979), 137--144)

> The topic is *not* combinatorics; it is operator inequalities. (Just
> a bit of self-advertisement...)

Question: What do you mean by "many papers were accepted (and
presumably rejected) without proper refereeing..."? Do you mean
decisions were made randomly at this particular journal? If this was
the case, I assume it was a rare exception; am I right?

Thinh Tran (http://www.thinhtran.com)

tc...@lsa.umich.edu

Oct 10, 2002, 9:34:08 AM
In article <54187164.02100...@posting.google.com>,
Thinh Tran <thinhv...@cs.com> wrote:
>Question: What do you mean by "many papers were accepted (and
>presumably rejected) without proper refereeing..."? Do you mean
>decisions were made randomly at this particular journal? If this was
>the case, I assume it was a rare exception; am I right?

Gian-Carlo Rota, the former editor of _Advances_in_Math_, would often assess
the quality of a submitted paper himself, and decide whether to accept or
reject it without obtaining a referee report from a third party.

I don't think there is anything wrong with the editor of a journal acting
as his or her own referee on occasion, e.g., if finding a referee turns out
to be nearly impossible and the editor is an expert in the subject and has
no relationship with the author that might bias his objectivity. It's just
that Rota did this more frequently than most, and did not necessarily limit
himself to cases in which finding a referee was difficult.
--
Tim Chow tchow-at-alum-dot-mit-dot-edu
The range of our projectiles---even ... the artillery---however great, will
never exceed four of those miles of which as many thousand separate us from
the center of the earth. ---Galileo, Dialogues Concerning Two New Sciences

John Baez

Oct 10, 2002, 3:08:00 PM
In article <ao3vkg$6r1$1...@galois.mit.edu>, <tc...@lsa.umich.edu> wrote:

>In article <54187164.02100...@posting.google.com>,
>Thinh Tran <thinhv...@cs.com> wrote:

>>Question: What do you mean by "many papers were accepted (and
>>presumably rejected) without proper refereeing..."? Do you mean
>>decisions were made randomly at this particular journal?

No, not at all.

>Gian-Carlo Rota, the former editor of _Advances_in_Math_, would often assess
>the quality of a submitted paper himself, and decide whether to accept or
>reject it without obtaining a referee report from a third party.

Right. Basically he acted as a benevolent despot using his own
judgement as he saw fit, rather than using the standard bureaucratic
methods that most journals employed. Since he had good mathematical
taste, this worked quite well most of the time. He also chose
people he trusted (for example me) to be editors of Advances in
Mathematics, and told us to seek out good papers in our subjects of
expertise.

After he died, Advances in Mathematics became a normal refereed
journal. I eventually resigned my editorship when Academic Press
(which published the journal) was bought out by Kluwer. For some
reason my name still appears on the list of editors, which is nice,
because I get the prestige without doing any work. :-)

>I don't think there is anything wrong with the editor of a journal acting
>as his or her own referee on occasion, e.g., if finding a referee turns out
>to be nearly impossible and the editor is an expert in the subject and has
>no relationship with the author that might bias his objectivity. It's just
>that Rota did this more frequently than most, and did not necessarily limit
>himself to cases in which finding a referee was difficult.

Certainly what Rota did was risky, and not within the usual
standards of academic practice. That's one reason I liked him so much.


Greg Kuperberg

Oct 10, 2002, 5:23:26 PM
In article <ao4j6g$skf$1...@glue.ucr.edu>, John Baez <ba...@galaxy.ucr.edu> wrote:
>After he died, Advances in Mathematics became a normal refereed
>journal. I eventually resigned my editorship when Academic Press
>(which published the journal) was bought out by Kluwer.

Actually Academic Press was bought by Elsevier.

>For some reason my name still appears on the list of editors, which is
>nice, because I get the prestige without doing any work. :-)

This raises an interesting point about the journal system. Not only
should you take refereeing with a grain of salt; you should also take
editorial boards with a grain of salt.

I have to wonder how many people on any given editorial board are there
for the sake of name recognition, but don't actually look at journal
submissions.
--
  /\  Greg Kuperberg (UC Davis)
 /  \
 \  /  Visit the Math ArXiv Front at http://front.math.ucdavis.edu/
  \/   * All the math that's fit to e-print *

Hans Aberg

Oct 11, 2002, 6:57:36 AM
In article <ao4j6g$skf$1...@glue.ucr.edu>, ba...@galaxy.ucr.edu (John Baez) wrote:

>>Gian-Carlo Rota, the former editor of _Advances_in_Math_, would often assess
>>the quality of a submitted paper himself, and decide whether to accept or
>>reject it without obtaining a referee report from a third party.

>Right. Basically he acted as a benevolent despot using his own
>judgement as he saw fit, rather than using the standard bureaucratic
>methods that most journals employed. Since he had good mathematical
>taste, this worked quite well most of the time.

>>I don't think there is anything wrong with the editor of a journal acting
>>as his or her own referee on occasion, e.g., if finding a referee turns out
>>to be nearly impossible and the editor is an expert in the subject and has
>>no relationship with the author that might bias his objectivity. It's just
>>that Rota did this more frequently than most, and did not necessarily limit
>>himself to cases in which finding a referee was difficult.

>Certainly what Rota did was risky, and not within the usual
>standards of academic practice. That's one reason I liked him so much.

First, one should note that manipulations designed to circumvent
refereeing procedures seem to have been in widespread use at the time:

For example, when Serre, as an editor of a journal, did not want to see a
submitted paper published, he would write back saying that he would
pretend the paper had not been submitted at all, but that if the author
insisted on submitting it, it would not be published anyhow. At the time
this happened (roughly the seventies to eighties), one did not know that
this move of Serre's was not motivated by the quality of the submitted
paper itself, but that Serre was mainly interested in pushing his own
career interests, so those who were exposed to this method might end up
with problems in their own careers. (Specifically, people would assume
that Serre acted on the basis of scientific quality, when in fact this
was not the case.)

As for the second part, the statement that the paper would not be
published anyhow if the author insisted on submitting it: insiders in
Princeton explained to me that there are several ways to ensure that a
paper receives unfavorable refereeing. One method quoted to me was to send
two papers to a referee, one of which the referee wants to see published,
while clearly stating that there is only room for one to be published in
the journal. The referee will then clearly bias his refereeing to draw the
distinction between the paper he wants to see published and the one he
does not. This is probably a method used on new referees until they have
been trained to produce the right kind of reports, after which it is no
longer needed. Referees that do not accept the method will be dropped from
the journal.

Otherwise, the simplest method is probably to merely send the paper to an
unfavorable referee: the name of the referee is secret, and there is no
independent quality control in place there.

As an excuse for not refereeing a paper, an editor at Crelle's journal
merely quoted a "large backlog". This backlog excuse was used at
Inventiones by, say, Borel, so that a friend of his might get a paper
published, including refereeing, in six months, while others might face a
refereeing time of nine months, after which the paper might then be
rejected, an effective deterrent against submitting any future papers not
okayed in advance by the editor. If the paper was published, the
additional delays would add up to three years. The nine-month refereeing
time was also used at the Annals of Mathematics by Langlands, I recall.

So I did not see any signs that these "backlogs" existed except as a means
of limiting competition. :-) -- I think these problems are known to the
AMS, because they tried at times to fight such practices, I recall.

Some London Mathematical Society journal would not accept a paper after
refereeing, but instead said that the paper was "preliminary accepted",
then used the leverage so obtained to demand extensive rewriting of the
paper.

A Crelle journal editor would offer to publish papers on the condition
that the paper was rewritten so as to look like a paper in number theory,
instead of merely presenting the results in a scientific manner. --
Evidently, the idea that number theory is the "Queen of mathematics" is
somewhat of a fake. :-)

I also think that the same editor used a novel method to keep papers from
being submitted to the journal, namely, to look at the name on the
envelope, and if it was not right, to forward it somewhere instead of
opening it. After the submission had bounced around in the mail for
several months without ever reaching the editor, most authors would give
up. The same editor was also rumored to admit new Ph.D. students in
exchange for sex, because every year at an annual Oberwolfach Group and
Representation Theory conference he would show up with a new female Ph.D.
student. (I was at this conference only one year, though, so I do not
know the actual factual background.)

Also, the refereeing procedures at this time, when they did take place,
seemed to have nothing to do with ensuring scientific quality: I recall
Langlands suggesting that I read a (typed) preprint by Brylinski and
Labesse; this preprint had paper-slip corrections glued in, and on top of
those glued-in corrections there were additional small paper-slip
corrections glued in. I tried to parse this paper, verifying the proofs
as one is always supposed to do. I had done such proof verification with,
for example, Deligne's second proof of the Weil hypothesis, where I
independently discovered the error that Gabber had discovered, and in
addition a typo that he had not pointed out, despite the fact that I am
not an expert in that area of mathematics. But I could not parse this
Brylinski-Labesse paper. I told Langlands this, and he became very upset
with me, telling me to leave Princeton, despite the fact that he had no
such authority (my money came from elsewhere, and I was not in Princeton
specifically in order to work with Langlands). The paper was, however,
published without a problem.

So my guess is that this and other such fields contain a lot of faulty
papers, in the sense that they are not logically correct. -- It probably
does not matter much, because people write a lot of new papers in such
fields all the time, so probably nobody ever looks into the old ones.
:-)

At the other end, somebody said that they had submitted a paper to
Hormander at Inventiones, and the referee had rejected it. Hormander then
said that the referee evidently did not understand the paper, and
accepted it.

About Hormander, I recall that there is a book from that period called
"Prospects in Mathematics", and Hormander was perhaps the one who tried
to do exactly that, predict some future mathematical developments. Serre,
in the same book, I recall, presented a summary of the work he had done
over the preceding few years and had by then abandoned; the intent,
evidently, having realized that he could do nothing more useful on the
topic, was to push others to take over by labelling it "prospects". --
There is a very strong political component around that fellow Serre,
evidently. :-)

So those problems with the refereeing procedures seem to be
editor-related rather than journal-related. Probably corruption happens
with editors who wish for great success in their careers, and who use the
editorship as an instrument to achieve that.

It also seems to be related more to personality than to actual talent:
for example, I saw a recent History Channel program about Newton, which
said that Newton used his high position in society to edit the names of
the competition out of the annals of history. At the other end was
Einstein, who seems to have been completely oblivious to such maneuvers.
Both were clearly exceptional minds.

But returning to Rota and refereeing procedures:

From what I have seen and heard, like the things said above, I think it is
imperative that the true conditions for accepting a paper to a journal be
clearly spelled out and enforced. Transgressions should be prosecuted,
because people expect that refereeing has something to do with ensuring
scientific quality. It should not be possible to abuse the system by
pretending that papers are properly handled when in reality they are not.

One should also note that the refereeing procedure in itself has some
serious flaws, even when it is carried out correctly: in math, typically,
the paper is sent to two referees who are formally supposed to verify its
logical contents and make an objective statement of its scientific value.

But the verifying part takes a lot of time. So in reality, most referees
will take a look at the names of the authors and guess whether the paper
is correct. This does not work for newcomers, which in turn means it is
extremely difficult to establish not only new careers, but also novel
mathematical approaches.

Also, suppose that a paper combines, in a novel way, results from two
different fields. Then there is no single referee who can read the whole
paper, and it is not possible to combine the reports of two referees into
a single judgment. So the paper cannot be published.

This is what happens if one tries to go to the extreme mathematical
frontiers and seek out novel approaches. It will not happen if one writes
up large amounts of fairly standard results.

So the refereeing system as such promotes mass-consumption math, at the
expense of the development of quality, just as we have seen with market
mass consumption elsewhere in society.

Hans Aberg * Anti-spam: remove "remove." from email address.
* Email: Hans Aberg <remove...@member.ams.org>
* Home Page: <http://www.matematik.su.se/~haberg/>
* AMS member listing: <http://www.ams.org/cml/>

Richard Green

Oct 14, 2002, 7:33:10 AM
remove...@matematik.su.se (Hans Aberg) wrote in message news:<remove.haberg-1...@du132-226.ppp.su-anst.tninet.se>...

>
> For example, when Serre, as an editor of a journal, did not want to see a
> submitted paper published, he would write back saying that he would
> pretend the paper had not been submitted at all, but that if the author
> insisted on submitting it, it would not be published anyhow. At the time
> this happened (roughly the seventies to eighties), one did not know that
> this move of Serre's was not motivated by the quality of the submitted
> paper itself, but that Serre was mainly interested in pushing his own
> career interests, so those who were exposed to this method might end up
> with problems in their own careers. (Specifically, people would assume
> that Serre acted on the basis of scientific quality, when in fact this
> was not the case.)

If this is true, I am appalled.

> Some London Mathematical Society journal would not accept a paper after
> refereeing, but instead said that the paper was "preliminary accepted",
> then used the leverage so obtained to demand extensive rewriting of the
> paper.

I can assure you that this is not standard editorial policy for the
London Mathematical Society journals! The author of a paper requiring
extensive changes may be invited to resubmit the paper, but in this
case the resubmitted paper will be in competition with new first time
submissions and there is no guarantee of publication.

I agree with you that there are some bad editors around. There are
two journals whose editors have treated me so badly as an author that
I'll never have anything to do with them again.

In contrast, there are some excellent editors who seem to treat
everyone fairly. (I have in mind David Vogan here.)

Paul R. Chernoff

Oct 14, 2002, 8:22:08 PM
Hans Aberg (hab...@matematik.su.se) has urged me to post the
following. (Names omitted to protect the guilty.)

A friend of mine wrote an extremely nice PhD thesis on
Galois theory. His results on the structure of inseparable
field extensions were quite beautiful. He had some
interaction with X -- an outstanding mathematician --
while he was working on his thesis. X seemed genuinely
interested. So he submitted the paper based on his thesis
to X, one of the editors of the leading journal Y.

X replied that he had lost interest in inseparable
Galois theory, and accordingly rejected the paper.

Brendan McKay

Oct 14, 2002, 9:14:06 PM
Richard Green <rmgr...@yahoo.co.uk> wrote:

> remove...@matematik.su.se (Hans Aberg) wrote:
> > Some London Mathematical Society journal would not accept a paper after
> > refereeing, but instead said that the paper was "preliminary accepted",
> > then used the leverage so obtained to demand extensive rewriting of the
> > paper.
>
> I can assure you that this is not standard editorial policy for the
> London Mathematical Society journals! The author of a paper requiring
> extensive changes may be invited to resubmit the paper, but in this
> case the resubmitted paper will be in competition with new first time
> submissions and there is no guarantee of publication.

I can't see the slightest thing wrong with accepting a paper subject to
extensive rewriting. Usually it means that the editor (presumably in
consultation with the referees) has decided that the mathematical
content is good but the presentation is very bad. Bad presentation
is a reasonable justification for not accepting a paper immediately.
Sometimes referees report something like "the result is great but it
took me days of work to figure out what the result actually was" and
in such a case requesting a rewrite is the correct decision for the
editor to make. Of course the editor has to be open with the author
about why the rewrite is required and what sort of rewrite is required.

In any case, I'm sure most authors would prefer to hear "rewrite the
paper then we'll publish it" rather than "if you rewrite the paper and
send it again, we'll think about it". Of course the author might not
agree that rewriting is required, but then he/she can always
withdraw the paper and send it somewhere else.

Brendan.

Hans Aberg

Oct 15, 2002, 6:30:22 AM
In article <44416928.02101...@posting.google.com>,
rmgr...@yahoo.co.uk (Richard Green) wrote:

...


>If this is true, I am appalled.

I wondered what would happen if I mentioned those names in public -- and
one thing that happened was private emails, further confirming those
things as facts.

So I think it is wholly unlikely that I or anybody else will hear
anything from those concerned -- they have done it extensively as part of
their careers, and they know it.

And if I hear that any of those concerned even attempt to pull strings
against me, I will do likewise with them or with anyone cooperating with
those fellows... :-)

I think it would be good, though, if those matters could be thoroughly
investigated by an independent group (perhaps the AMS can play a role in
this), in order to set the historical record straight and to make sure
similar things do not happen in the future.

>I can assure you that this is not standard editorial policy for the
>London Mathematical Society journals! The author of a paper requiring
>extensive changes may be invited to resubmit the paper, but in this
>case the resubmitted paper will be in competition with new first time
>submissions and there is no guarantee of publication.

My guess is that a few celebrities charged ahead with using extensive
rewrites as an obstruction (competition limiting) and as a way to
establish a pecking order, and others followed suit without thinking
about what they were doing.

Hans Aberg

Oct 15, 2002, 6:30:33 AM
In article <1fk3c3p.5d3dj6172a64gN%b...@cs.anu.edu.au>, b...@cs.anu.edu.au
(Brendan McKay) wrote:

>> > Some London Mathematical Society journal would not accept a paper after
>> > refereeing, but instead said that the paper was "preliminary accepted",
>> > then used the leverage so obtained to demand extensive rewriting of the
>> > paper.

...


>> I can assure you that this is not standard editorial policy for the
>> London Mathematical Society journals! The author of a paper requiring
>> extensive changes may be invited to resubmit the paper, but in this
>> case the resubmitted paper will be in competition with new first time
>> submissions and there is no guarantee of publication.

>I can't see the slightest thing wrong with accepting a paper subject to
>extensive rewriting.

One problem is that no one has made formally explicit the rules for what
constitutes scientifically proper publishing, so editors can bend the
rules pretty much as they please, and nobody will ever say that, because
of that, what they edit is no longer a scientific journal.

The rules, as I take it, are that submitted papers must be properly and
fairly refereed, and that editors should base acceptance on that. The
editors may then ask for minor editing not related to the scientific
presentation as such (though obvious mistakes can of course be pointed
out). If these editorial changes are thought to belong to the style of
the journal (say, typesetting details or spelling according to one or
another English dialect), and they require a great deal of work, then the
editor should assign another editor to make those changes instead of the
author.

There are no provisions in this for "backlogs", "extensive rewrites", or
"rewriting a paper so that it looks like number theory".

> Usually it means that the editor (presumably in
>consultation with the referees) has decided that the mathematical
>content is good but the presentation is very bad.

The problem with admitting rules whereby the editor can ask for extensive
rewriting is that they can be used as an obstruction and to establish
one's authority in a pecking order, which hurts the creativity of the
scientific field. The latter problem is analogous to how enforceable laws
must be in place for a market economy to work well.

> Bad presentation
>is a reasonable justification for not accepting a paper immediately.

I think that the referee has to judge whether he or she can parse the
scientific contents, and if so, make a scientific evaluation of them. If
the referee cannot parse the scientific contents, that amounts to the
referee being unable to do the refereeing, as opposed to the case where
the referee can parse enough of the science to detect that the paper is
scientifically incorrect. If the referee is unable to referee the paper,
then I gather the editor has to judge whether to find another referee. If
this happens several times, perhaps one must accept the fact that it does
not seem possible to referee that paper.

Style as such is not part of the scientific presentation. In my own
experience, experimenting with interscience publications, when the editor
and referee complain about style, it usually means they have no
scientific knowledge of the material involved. -- They simply do not want
to admit their limitations as scientists and humans, and it shows up in
the form of complaints about style.

To make this explicit with an example: if you were to attempt to publish
a paper with elements of pure math in some applied-science journal, then
the conventions used to ensure the logical correctness of the mathematics
would provoke such editor and referee complaints. Pure math is usually
written bottom-up, because that makes it easy to verify the proofs. In
the applied sciences, one often does the reverse, top-down, because
logical accuracy in the mathematical sense is not needed; one is usually
giving an empirical description of some experiments.

If you attend computer science classes, they will say that one should do
top-down software development, because that ensures the code is
structured. But the result is often buggy code, so bottom-up is better if
one can do it, that is, adding the appropriate structure as one goes
along. -- This is much harder, but mathematicians do it all the time.

So, because editors and referees do use such non-scientific style
criteria, interscience publishing is in reality very difficult.

>Sometimes referees report something like "the result is great but it
>took me days of work to figure out what the result actually was" and
>in such a case requesting a rewrite is the correct decision for the
>editor to make. Of course the editor has to be open with the author
>about why the rewrite is required and what sort of rewrite is required.
>
>In any case, I'm sure most authors would prefer to hear "rewrite the
>paper then we'll publish it" rather than "if you rewrite the paper and
>send it again, we'll think about it". Of course the author might not
>agree that rewriting is required, but then he/she can always
>withdraw the paper and send it somewhere else.

The problem with this is that asking for extensive rewrites can be used
as an obstruction. Some referees are known to produce long lists of heavy
complaints about a paper, despite recommending it for publication, just
to slow its publication down. One then gets extra time to work on the
topic before others can, and it also slows down the competition, if the
author is a competitor. Some editors not only can but systematically do
use such publishing tactics in order to keep political control over their
field.

So if you make use of or admit such extensive policies, then either you
are a clever predator, or a dumb copycat sucker. :-) -- It also
contradicts the principles of proper scientific publishing.

One way around this problem is always to file one's preprints quickly in
an official archive, like http://arxiv.org/. For ensuring basic
scientific quality, it would be best to have a refereeing procedure whose
principles of fairness and accuracy are explicitly spelled out as well as
independently verified.

Jeffrey Shallit

Oct 15, 2002, 6:57:26 AM
In article <aofn3g$2gbc$1...@agate.berkeley.edu>,
Paul R. Chernoff <cher...@math.berkeley.edu> wrote:
>Hans Aberg (hab...@matematik.su.se) has urged me to post the
>following. (Names omitted to protect the guilty.)

More refereeing horror stories. Maybe we need an article in the
_Intelligencer_.

1. My first significant result was submitted to an editor of a journal
in the town where I lived. I didn't know much about submitting papers
at the time, and the university I was associated with didn't provide
much help. So when six months went by and I didn't hear anything, I
thought this was normal. After a year went by, though, I wrote to the
editor I had submitted to, and asked about the status of the paper. He
replied he had never received it! So I sent another copy. Six more
months went by. I wrote again. He claimed he had not received it
again! This time I bicycled to his house and handed it to him. Six
months went by, and I still hadn't heard anything. I contacted the
editor-in-chief of the journal, who informed me that it was well-known
that the other editor was suffering from something like senility, and I
should have submitted to another editor. In the meantime the result
was discovered by someone else.

2. I once wrote a paper with about two dozen lemmas and theorems. It
wasn't very deep but it was amusing. It was rejected from a journal
with a referee report that began, "The main result of the paper, Lemma
13, is trivial." Except that Lemma 13 wasn't anywhere near the main
result of the paper. The paper itself said that Theorem 7 was the main
result.

3. Once I submitted a paper to a journal through an editor who was an
acquaintance. After six months went by and I heard nothing I wrote to ask
what was happening. No reply. I waited another month and wrote again.
No reply. This went on for months. I knew the guy was receiving my
inquiries because he was at his university and I sent them by e-mail,
mail, and phone message. After more than a year I contacted the editor-
in-chief, who responded that it was well-known that the other editor never
responded to any inquiries and rarely sent papers out to referee; he just
accepted what he felt like. All other papers were ignored. I asked why
he was still on the editorial board. The editor-in-chief replied that he
was a good friend and did not want to insult him.

4. Once I submitted a paper to a mathematics journal that was rejected
with the notation "This is computer science, not mathematics. Submit to
a computer science journal." Whereupon I took the proffered advice
and submitted it to a CS journal. I got the referee report: "This is
mathematics, not computer science. Submit to a mathematics journal..."

5. I once submitted a paper to a journal, and got an acknowledgement in
the mail, signed by the editor-in-chief: "Your paper has been sent to
a referee, and when we hear a response back, you will be notified..."
After nine months and no reply I asked about the status of the paper.
The managing editor's secretary said it hadn't been sent out yet; they
would do so shortly. I contacted the editor-in-chief, and asked him
about the letter he had signed, telling me my paper had already been sent
out to referee. His reply: "Oh, that's just a form letter."

6. I once submitted a paper to a journal and the editor rejected it,
saying this journal had an (unwritten!) policy that it does not publish
papers on topic X except under unusual circumstances. I responded by
providing a list of about a dozen papers in that journal on that topic.
He responded, "See: there have only been a dozen".

7. I once submitted a paper to a journal and after four and a half
months (!) I got a rejection letter saying they would not ***even send
the paper out to referee*** because it wasn't good enough. (It was later
published in a very decent journal and has led to papers by several
other people.)

8. I once submitted a paper to a conference. In retrospect the paper
admittedly had some flaws, in particular a misleading notation that
made some of the results technically wrong on some exceptional trivial
cases, but we didn't know that when we submitted it. One referee
accused me and my co-author of deliberate dishonesty, in that we
were trying to pass off a false result as a true one.
We revised the paper and it was eventually accepted elsewhere, but
getting accused of lying in a referee report is a bit much.

Greg Kuperberg

Oct 16, 2002, 12:57:14 PM
In article <aogsam$3rr$1...@tabloid.uwaterloo.ca>,
Jeffrey Shallit <sha...@graceland.math.uwaterloo.ca> wrote:
>More refereeing horror stories.

We could spend all day reciting "horror stories" about peer review.
But it would be more useful to conclude that the system of peer review
is in need of reform, which it is, and to make a serious proposal
for reforming it. My hat in the ring is the arXiv article
"Scholarly mathematical communication at a crossroads",

http://www.math.leidenuniv.nl/~naw/serie5/deel03/sep2002/pdf/kuperberg.pdf

It is also in the arXiv as math.HO/0210144. The last section discusses
peer review.

In a nutshell, if we take the arXiv as the first layer of the permanent
mathematical literature (in the areas which have adopted it), the second
layer is the journals and the third layer is Math Reviews. I think that
these two layers should be combined into one, so that journal referees
would write public reviews of arXiv articles. (But you could still
refuse to write a review anonymously.)

Brendan McKay

Oct 16, 2002, 1:27:26 PM
Jeffrey Shallit <sha...@graceland.math.uwaterloo.ca> wrote:
>
> More refereeing horror stories. Maybe we need an article in the
> _Intelligencer_.

I have had many similar experiences, both as author and editor.
In the latter case, a few of the long delays were mostly my fault.

Here is some more of the story.

Some people simply never reply when asked to referee a paper.
Others agree to do it but never do. Both of these circumstances
are a problem for the editor, who doesn't know whether to give
up on a particular referee or to just send another reminder.
In my experience, about 10-20% of papers generate problems
of this or similar nature.

A typical scenario leading to a long delay is that a referee agrees
to write a report then both the referee and the editor promptly
forget about it until the author starts complaining months later.
If the editor then decides to give up on the referee and try
another, the same thing might happen again. Obviously the
editor should be more proactive and the referee should be
more punctual, but then both of them are unpaid volunteers
who get no material assistance from their employers (in most
cases).

Long delays can also be caused by serious mistakes such as
misfiling a paper, not noticing that email to a referee bounced,
etc., etc. Once my spam filter captured something and I didn't
know it existed for several months. Unfortunately a common
situation is that something gets into the "do this next week"
basket and next week never comes. (Both editors and referees
do that.)

I don't believe that deliberate manipulation of delay for personal
gain such as in Dr. Aberg's conspiracy stories is more than a
very marginal and rare phenomenon (and I don't believe that
the moderator should have allowed such libelous accusations
against named people to be made in this group).

Everyone in academia is getting steadily more and more busy.
This is reflected in a steady increase in the difficulty of finding
people willing to referee papers, and a steady increase in the
average time it takes. Those wonderful people who write
prompt competent reports are in danger of being overloaded
with more requests. Editors also have steadily increasing
workloads and less free time. It seems to me that delays are
only going to get worse unless there is some basic change to
the system, but I don't know what that should be.

Brendan.

Hans Aberg

Oct 16, 2002, 2:27:14 PM
In article <KTgr9.3049$US2....@vixen.cso.uiuc.edu>,
gr...@conifold.math.ucdavis.edu (Greg Kuperberg) wrote:

>In a nutshell, if we take the arXiv as the first layer of the permanent
>mathematical literature (in the areas which have adopted it), the second
>layer is the journals and the third layer is Math Reviews. I think that
>these two layers should be combined into one, so that journal referees
>would write public reviews of arXiv articles. (But you could still
>refuse to write a review anonymously.)

It is probably difficult to annul the old culture of journal refereeing.

But one could add a system of refereeing to the arXiv. If it is anonymous,
there should be a system of independent verification of accuracy and
fairness.

It should also be possible for referees to referee only part of a paper,
indicating which parts, which would facilitate interscience publications:
say, for a paper in math and another science, one referee might check
only the mathematical parts.

The latter is in fact common among journal referees: but if a referee
says that he or she can only verify certain portions of a paper, it will
not be published. (I.e., editors will not attempt to piece the parts
together by multiple refereeing.)

In the arXiv, that makes no difference, because the paper is already filed
for public view.

If papers are to be refereed for grant applications, then one should be
allowed to point to the parts that have been successfully refereed; i.e.,
there should be no requirement that the whole paper be refereed, because
the latter can be used as an excuse to stall grant applications, like
complaints over "style", etc.

Thomas B Ward

Oct 17, 2002, 10:27:23 AM

This has been a fascinating discussion, to which
I can contribute nothing but one comment:

On Wed, 16 Oct 2002, Brendan McKay wrote:

> In my experience, about 10-20% of papers generate problems
> of this or similar nature.
>
> A typical scenario leading to a long delay is that a referee agrees
> to write a report then both the referee and the editor promptly
> forget about it until the author starts complaining months later.

The London Math Society have made a huge step forward
on this (in my view). They have a central office for
all their journals. The author(s) submit papers to the
centre, and they then send them on to the appropriate
editorial adviser. He or she finds a referee, and then the
centre corresponds with the referee.

What this means in practice is that various pieces of
the process are doing the things they should be:
an editorial adviser is looking at papers in their field
and locating referees, then trying to interpret what
the referee said. The centre meanwhile is keeping track
of how long papers are with referees, sending reminders
and so on.

This means the burden of keeping track of the administration
of refereeing is lifted from the editor.

Tom


Mario S. Mommer

Oct 17, 2002, 10:27:22 AM
remove...@matematik.su.se (Hans Aberg) writes:
>
> One way around this problem is always to file one's preprints quickly in
> an official archive, like http://arxiv.org/. For ensuring basic
> scientific quality, it would be best to have a refereeing procedure whose
> principles of fairness and accuracy are explicitly spelled out as well as
> independently verified.
>

But wouldn't this (filing one's preprints in the arXiv) have as a
consequence that journals would not accept one's papers, because they
have already been "published"? And then, the day you send your CV along
with a job application, you can only (and pardon the choice of words)
show off papers published in a medium known for publishing basically
anything... Isn't this an issue?

Regards,

Mario S. Mommer

David L. Johnson

Oct 17, 2002, 10:57:42 AM
Brendan McKay wrote:

> I don't believe that deliberate manipulation of delay for personal
> gain such as in Dr. Aberg's conspiracy stories is more than a
> very marginal and rare phenomenon (and I don't believe that
> the moderator should have allowed such libelous accusations
> against named people to be made in this group).

I was wondering when someone would say something to this effect. Well done.

>
> Everyone in academia is getting steadily more and more busy.
> This is reflected in a steady increase in the difficulty of finding
> people willing to referee papers, and a steady increase in the
> average time it takes. Those wonderful people who write
> prompt competent reports are in danger of being overloaded
> with more requests. Editors also have steadily increasing
> workloads and less free time.

Very good analysis that has a lot more truth than what others have reported.
I have been as guilty as most in terms of putting off refereeing (and
worse than most wrt Math Reviews), and one underlying rationalization is
that, once I send in a report, I'll just get more to do. It is a worthwhile
and important thing to do, but is a sometimes difficult task that never has
the immediacy of many other things that need to be done.

--

David L. Johnson

   __o     | And what if you track down these men and kill them, what if you
 _`\(,_    | killed all of us? From every corner of Europe, hundreds,
(_)/ (_)   | thousands would rise up to take our places. Even Nazis can't
           | kill that fast. -- Paul Henreid (Casablanca).

Steve Carlip

Oct 18, 2002, 3:57:14 PM
Jeffrey Shallit <sha...@graceland.math.uwaterloo.ca> wrote:

> 1. My first significant result was submitted to an editor of a journal

> in the town where I lived. ...when six months went by and I didn't hear
> anything [...]

> 3. Once I submitted a paper to a journal through an editor who was an
> acquaintance. After six months went by and I heard nothing I wrote to ask
> what was happening. No reply. I waited another month and wrote again.

> No reply. This went on for months. [...]

> 5. I once submitted a paper to a journal, and got an acknowledgement in
> the mail, signed by the editor-in-chief: "Your paper has been sent to
> a referee, and when we hear a response back, you will be notified..."
> After nine months and no reply I asked about the status of the paper.

> The managing editor's secretary said it hadn't been sent out yet [...]

> 7. I once submitted a paper to a journal and after four and a half
> months (!) I got a rejection letter saying they would not ***even send

> the paper out to referee*** [...]

I wonder whether math journals are much less competitive (with each
other) than physics journals are. In the case of the journal whose
Editorial Board I serve on, Classical and Quantum Gravity, the Board
gets annual reports of average time-to-decision and time-to-publication
of submissions, along with a fairly detailed breakdown. And there's a
systematic effort to reduce these times, with automated reminders to
referees, an online author enquiry service to give authors up-to-date
information about their papers' status, and a system for sending papers
to Board members if the referees are too slow. Certainly not all physics
journals place this much emphasis on such matters, but my impression
is that very few papers run into such long delays, and there is a general
effort to make the process at least slightly transparent.

Steve Carlip

Hans Aberg

Oct 17, 2002, 2:31:09 PM
In article <GdAr9.3447$US2....@vixen.cso.uiuc.edu>, "David L. Johnson"
<david....@lehigh.edu> wrote:

>Brendan McKay wrote:
>
>> I don't believe that deliberate manipulation of delay for personal
>> gain such as in Dr. Aberg's conspiracy stories is more than a
>> very marginal and rare phenomenon (and I don't believe that
>> the moderator should have allowed such libelous accusations
>> against named people to be made in this group).
>
>I was wondering when someone would say something to this effect. Well done.

This sounds rather naive: jumping to conclusions without proper
examination of the facts. I always wonder why scientists do not apply the
methods of proper examination of facts that they use in science to their
social lives as well.

As for the statement about deliberate delay, I followed different authors
who submitted papers to the same editor, and the total publishing time
was exactly the same, three years. The authors complained about extensive
rewrites required by the editor, not themselves understanding why those
rewrites were called for, and noticed that the rewrites did not improve
the quality of the paper. (I should perhaps also note that being asked to
do demeaning things one does not understand the reason for is typical of
a dictatorship, not a democracy: it is typically used to establish a
pecking order.)

In one case there was also an overlap, so that the follow-up paper of
author number two had appeared as a preprint before paper number one had
appeared in print in the journal. This means that if those preprints are
not circulated publicly and one cannot get hold of them by some other
means, it is impossible to participate in that field: The idea of
publishing papers as a form of communication has ceased. Those papers
were produced by the editor in cooperation with another fellow at a
university, who put those authors to work on the topic.

So you may draw your own conclusions as to whether the delays were
deliberate or not: it just happens that the arrangement gave the editor
and those he worked with full control of the field, completely excluding
competition.

What evidence would you require for a proof?

I should note that I noticed those things in the eighties -- I realized
that it would be a waste of time and effort to follow those fields. So I
have no knowledge of what the current state of affairs in those fields
is.

Hans Aberg

Oct 17, 2002, 2:31:23 PM
In article <eNzr9.3434$US2....@vixen.cso.uiuc.edu>, "Mario S. Mommer"
<m_mo...@yahoo.com> wrote:

Well, preprints have always been circulated before publication, so from
that point of view, such requirements are strange.

But it is quite clear that the current system is also a power base for
those established in it, and those established within it will fight
whatever alternatives may threaten that power base. If preprints are not
circulated in public, that benefits those already established, because
people will need to send preprints in large numbers to them, hoping for
favors from the top.

By contrast, if preprints are quickly filed in an archive, that excludes
any manipulation of proper acknowledgment.

-- The problem is again that no official rules have been formally laid
down as to what constitutes a scientific journal. If, say, the AMS did
that, it would be easier to establish such alternatives.

And if it is possible to get papers filed in the arXiv refereed without
going through a journal, does that not make journals unnecessary? -- The
use of printed journals is a historical thing, tied to printing
technology.

One should note that publishing companies are probably not as such
interested in the copyright of math journal articles, simply because
there is not much money to be made from such journals. So such
requirements probably show up because publishing companies have not
thought the economics through properly, or because the editors want it
that way.

The situation is different with textbooks and the like, where one can
expect to make more money.

Such economic facts probably mean that printed journals are destined to
perish anyway.

Steven E. Landsburg

Oct 18, 2002, 5:26:41 PM

A propos of this thread, it is perhaps worth mentioning that
economics journals *pay* referees, and (although I have no
actual data on this) this appears to be a very effective
way to get referee reports in quickly.

The pay scale is usually something on the order of $75 if you
get your referee report in within six weeks, $50 if you get it
in within twelve weeks, and beyond that they'll look for a new
referee. The amounts are small (and are covered by submission
fees from the authors), but enough, it seems, to appeal either
to the self-interest or the sense of obligation of the referees.

And of course the dates on which the fee drops serve as
psychological deadlines for the referees, so refereeing is less
likely to always be the lowest priority item on one's desktop...

--

David L. Johnson

Oct 18, 2002, 7:38:57 PM
Hans Aberg wrote:
> In article <GdAr9.3447$US2....@vixen.cso.uiuc.edu>, "David L. Johnson"
> <david....@lehigh.edu> wrote:
>
>
>>Brendan McKay wrote:
>>
>>
>>>I don't believe that deliberate manipulation of delay for personal
>>>gain such as in Dr. Aberg's conspiracy stories is more than a
>>>very marginal and rare phenomenon (and I don't believe that
>>>the moderator should have allowed such libelous accusations
>>>against named people to be made in this group).
>>
>>I was wondering when someone would say something to this effect. Well done.
>
>
> This sounds rather naive: jumping to conclusions without proper
> examination of the facts. I always wonder why scientists do not apply the
> methods of proper examination of facts that they use in science to their
> social lives as well.

Unsubstantiated claims are also not scientific.

>
> As for the statement about deliberate delay, I followed different authors
> who submitted papers to the same editor, and the total publishing time
> was exactly the same, three years. The authors complained about extensive
> rewrites required by the editor, not themselves understanding why those
> rewrites were called for, and noticed that the rewrites did not improve
> the quality of the paper. (I should perhaps also note that being asked to
> do demeaning things one does not understand the reason for is typical of
> a dictatorship, not a democracy: it is typically used to establish a
> pecking order.)

It may be that the referee found the arguments either unintelligible or
wrong, or simply poorly presented. To suggest that, because authors do
not understand the reasons for requested changes, some vast dictatorial
conspiracy is afoot to squelch serious work is ridiculous. I should also
point out that there is no democracy at work here; being accepted for
publication in a particular journal is not a basic human right. We have
all had papers rejected, or delayed for one of a number of reasons. Some
reasons do not always seem legitimate. But one certainly has the option
of submitting a paper elsewhere. If it is a reasonable paper, it will be
published. If journal after journal rejects it, that suggests not
conspiracy but poor mathematics.

> The idea of publishing
> papers as a form of communication has ceased.

That is hyperbole.

> Those papers were produced by the editor in cooperation with another
> fellow at a university, who put those authors to work on the topic.

There are many claims of stolen results, if that is what you mean by
this, but such claims do need to be substantiated. I am not suggesting it
does not occur, but it is a serious charge. If some sort of evidence can
be produced, it should be, and if not, it amounts to libel.

> What evidence would you require for a proof?

Precisely that, evidence.

--

David L. Johnson

   __o     | And what if you track down these men and kill them, what if you
 _`\(,_    | killed all of us? From every corner of Europe, hundreds,
(_)/ (_)   | thousands would rise up to take our places. Even Nazis can't
           | kill that fast. -- Paul Henreid (Casablanca).

[ Moderator's note: This will be the last word on this part of the thread.
Any further remarks on this subject should be about mathematical
refereeing, editing and journals in general, without any accusations
about particular people or journals. ]

Michael J Hardy

Oct 18, 2002, 9:34:24 PM
Jeffrey Shallit (sha...@graceland.math.uwaterloo.ca) wrote:

> More refereeing horror stories. Maybe we need an article
> in the _Intelligencer_.


Once upon a time an editor got back a referee's report
consisting of a long list of objections to the paper, but which
concluded with a short sentence recommending publication. The
editor didn't think the paper should be published, so he deleted
the last sentence from the report and sent it to the author with
a rejection letter. I heard this story from the editor.

Mike Hardy

Mark

Oct 18, 2002, 5:25:56 PM
remove...@matematik.su.se (Hans Aberg) writes:
>It is probably difficult to annul the old culture of journal refereeing.
>
>But one could add a system of refereeing to the arXiv. If it is anonymous,
>there should be a system of independent verification of accuracy and
>fairness.

The future lies away from human intervention entirely, because of its
inefficiency, its vulnerability to cliquishness, and its tendency to trip
over its own feet and get things mangled. Humans are simply not that
reliable.

The system of communication that supersedes journals is already, and has
long been, in place: USENET and the rest of the Internet. You already
have something that goes far above and beyond "peer review" -- universal
review: everything you submit is subject to review by the entire planet,
almost literally in real time. Peer review, in comparison, doesn't even
come close to this standard and really has to be considered obsolete.

The key ingredient that's missing so far, but that's in store down the
line, is AUTOMATED CERTIFICATION.

Automated certification, especially for papers consisting mostly of
mathematical content, would consist of submitting the actual mathematical
content to proof-verifiers.

It might also be possible to devise a system of automated verification
for papers in empirical sciences, as well, though what manner of
verification that would entail I can't foresee.

A proof-verifier is, itself, not really a major deal to write. What IS a
major deal is writing one that bridges the gap between low-level logic
language and the idiomatic, quasi-formal language used in math texts and
papers. Such a verifier has to be powerful enough to allow user-defined
theorems, proof snippets, and rules of derivation, and so is likely to be
something couched in a suitably powerful, but decidable, subset of
higher-order logic. ("Decidable" meaning, here, that the "proof
verification" problem is decidable, not the "proof existence" problem.)

With these extra facilities, you can then talk about incorporating much
of the body of current mathematical knowledge, proof techniques, etc.
into a library. It is in reference to this library knowledge base that
all your "big steps" and "step-skipping" are expressed. Without the
library, you can't skip steps and work at the level of informal math
language.
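
[ Editorial note: to make the proof-verifier idea concrete, here is a
minimal sketch in Lean 4 -- a modern proof assistant chosen purely for
illustration; the post above names no particular system. The lemma
library behind the "omega" tactic plays the role of the knowledge base
that permits "step-skipping":

    -- Without library support, each step is spelled out by hand:
    theorem add_comm_explicit (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b

    -- With a library, the verifier fills in the routine steps itself,
    -- so the author can reason at the level of an informal paper:
    theorem skip_steps (a b : Nat) : a + b + 1 = 1 + b + a := by
      omega

Checking the proof that "omega" constructs is the decidable "proof
verification" problem, as distinguished from "proof existence". ]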

Toby Bartels

Oct 21, 2002, 3:22:51 AM
David L. Johnson wrote in part:

>Hans Aberg wrote:

>>The idea of publishing papers as a form of communication has ceased.

>That is hyperbole.

I'm not interested in defending Hans' post in general,
but this much is IMO exactly right.

Mathematicians no longer read journals for the latest results
but instead read about them months earlier on the arXiv.
The only reason to get an arXived paper published
is to get people to take your work more seriously
(so that they'll give you tenure, cite your paper, etc).
But journals are no longer used for *communication*.


-- Toby

Hans Aberg

Oct 21, 2002, 8:16:38 AM
In article <H478s...@research.att.com>, sh...@research.att.com (Peter
Shor) wrote:

>... I believe editors
>should be allowed to call for extensive rewrites. Consider the case of
>an author who submits a very important result in an almost completely
>incomprehensible paper, where the incomprehensibility is not due to the
>difficulty of the mathematics but to bad writing on the part of the author
>(I know this has happened, and I don't think it is that uncommon).
>Should the referee and editors

>(1) accept it as is,
>(2) ask for extensive rewrites,
>(3) reject it as being incomprehensible?

I think the key point is that you have been able to determine that the
results are scientifically parsable and important, so (3) is not
applicable for that reason.

Matters of editorial style (as opposed to scientific matters) can
clearly be fixed quickly, either by pointing them out to the author, who
may fix them, or by asking the author's permission to have somebody else
fix them.

As for your remark that
"...the incomprehensibility is not due to the difficulty of the mathematics
but to bad writing on the part of the author...":
we have agreed that the scientific components of the paper are parsable,
and the editorial components can be fixed. So why not let the paper stand
as it is, letting the readership judge what is good or bad about it?

The thing is that honest editors and referees, when confronted with a
wholly new style, may view it as poor style simply because they do not
recognize it. Also, the statement that "the incomprehensibility is not due
to the difficulty of the mathematics" often seems to be a way of avoiding
a loss of personal prestige: one thing a mathematician never can do, it
seems, is admit that something is difficult and intricate.

I gave one example of how this protection against personal loss of
prestige might work in practice, namely an editor in another scientific
field being confronted with pure math: there, the pure-math bottom-up
presentation style (less important lemmas before the important theorems),
which is used to ensure logical correctness, often appears quite
incomprehensible. The point here is that this bottom-up presentation
style is not only common in pure math, but also essential for making
verification possible.

Some papers' style was first judged awful, but after a couple of years
people said they now understood why things were written as they were. So
the paper was right all along; people simply did not understand that at
first. Should editors then be able to intercept and cut off such cases?

I have also encountered editors and others trying to enforce special
writing styles: for example, in one case, a fellow indicated that there
was a special style used by Hilbert, and he felt that a correct math
paper should be written in just that style.

Clearly this is not scientific, if the presentation is not linked to
making sure that the material is scientifically correct and of high
quality.

Also consider this example: somebody has produced an ISO engineering
standard for the typography of tensors, which conflicts with what pure
mathematicians usually use, and there are engineering journals (I am told)
that enforce this standard. There is a simple scientific reason for pure
mathematicians not to use this ISO engineering tensor standard: in pure
math one indicates carefully what the notation stands for, so the standard
is not needed. Also, pure mathematicians are used to pushing the logical
realities quite a bit further than engineers do, so it can be quite
difficult to piece together a paper around such an engineering standard.
But this journal style requirement means that no normal pure-mathematical
paper can ever be published in such a journal, even if its scientific
content fits the journal.

If journals are permitted to impose such stylistic publishing
requirements, it cuts down on the choices that authors have for their
publishing.

I think one has to ask oneself what functions scientific publishing
should fulfill: one is communication; another is to ensure that the
author gets proper credit and can build a career on it. Refereeing should
be used to ensure that basic scientific criteria are met: clearly, the
full evaluation can only be carried out by the full scientific community
reviewing the result, usually over a period of time.

Journals and archives should be designed to support that process,
ensuring that basic scientific criteria are met, while at the same time
making sure that authors are not obstructed in their careers.

Hans Aberg

unread,
Oct 21, 2002, 8:16:43 AM10/21/02
to
In article <aopu94$gin$1...@uwm.edu>, whop...@alpha2.csd.uwm.edu (Mark) wrote:

>>But one could add a system of refereeing to the arXiv. If it is anonymous,
>>there should be a system of independent verification of accuracy and
>>fairness.
>
>The future lies away from human intervention entirely because of its
>inefficiency, vulnerability to cliquishness and tendency to trip over
>its own feet and to get things mangled up. Humans are simply not that
>reliable.

Well, computers are programmed by humans and are thereby limited in
reliability. :-)

>The system of communication that supersedes journals is, and already has
>been, long in place: USENET and the rest of the Internet. You already
>have something that goes far above and beyond "peer review" -- universal
>review: everything you submit is subject to review by the entire planet
>almost literally in real time. Peer review, in comparison, doesn't even
>come close to this kind of standard and really has to be considered
>obsolete.

It is not obsolete, certainly not in science: even in computer science,
where manuscripts have long been put up on the Internet, one still
publishes in refereed journals.

Therefore, I think the system will persist for a long time: even if one
believes it is unnecessary, people will probably want to have it.

On the other hand, paper journals are expensive to produce, paid for
mainly by the libraries. They are therefore moribund.

Taking these parts together, I think it is best if the arXiv is augmented
with the possibility of direct refereeing for those who want it. Those who
browse the archive could then have that additional refereeing information
available in some form.

>The key ingredient that's missing so far, but what's in store down the
>line, is AUTOMATED CERTIFICATION.
>
>Automated certification, especially for papers mainly consisting mostly
>of mathematical content would consist of submitting actual mathematical
>content to proof-verifiers.

This is surely desirable, but at this point in time pretty much science
fiction -- there was a long thread about that recently in this newsgroup,
so there is no point in repeating it.

>A proof-verifier is, itself, not really a major deal to write up. What
>IS a major deal is writing on that bridges the gap between low-level
>logic language and the idiomatic, quasi-formal language used in math texts
>and papers. Such a verifier has to be powerful enough to enable
>user-defined theorems, proof-snippets, rules of derivation, and so is
>likely to be something couched in a suitably powerful, but decideable
>subset of higher order logic. ("Decideable" meaning, here, that the "proof
>verification" problem is decideable, not the "proof existence" problem).

Yes, I know, as I am writing such a system right now, based on
metalogic (I call it a Metalogic Inference language, and it is rigged so
that one cannot pass off an unproved result as a proved result). :-)
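
One way of rigging that is worth sketching (hypothetical Python, not my
actual system, and Python cannot truly hide anything): in the LCF style,
theorem objects can only be manufactured by the trusted inference rules.

    # LCF-style kernel sketch: Theorem objects require a private token,
    # so only the trusted rules below can construct them.
    class Theorem:
        _key = object()                       # private capability token
        def __init__(self, conclusion, key):
            assert key is Theorem._key, "only inference rules make Theorems"
            self.conclusion = conclusion

    def axiom(formula):                       # trusted rule
        return Theorem(formula, Theorem._key)

    def modus_ponens(th_imp, th_ant):         # trusted rule
        op, a, b = th_imp.conclusion          # expects ('->', A, B)
        assert op == '->' and a == th_ant.conclusion
        return Theorem(b, Theorem._key)

    q = modus_ponens(axiom(('->', 'p', 'q')), axiom('p'))
    print(q.conclusion)                       # 'q', obtained via the rules

Anything of type Theorem is then, by construction, a proved result.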

But as there is no system available that bridges the gap between actual
mathematical usage and formal proof verification, it does not fit into
the current discussions about paper publishing.

I do not see any formal obstacles to constructing such a proof
verification system, though, so in the future math papers will probably
be proof-checked automatically, and referees will not have to worry about
that part.

Dale R Worley

unread,
Oct 21, 2002, 10:02:18 AM10/21/02
to
Hans Aberg wrote:
> The authors complained over
> extensive rewrites required by the editor, not themselves understanding
> why those rewrites were called for, and noticed that the rewrites did not
> improve the quality of the paper. (I should perhaps also note that being
> asked to do things demeaning one does not understand the reason for is
> typical of a dictatorship, not a democracy: It is typically used in order
> to establish a pecking order.)

I'm attempting to correlate this with my experiences as a referee.

Many, perhaps most, papers in my field were in need of revision before
they were publishable. "If you think what gets published is stupid,
you should see what gets rejected!" And not infrequently, the authors
never did the revisions. The only times that my decision as a referee
was overruled by an editor, I had rejected the paper.

If done correctly, the reviewer/editor should explain clearly to the
authors why the revisions are needed. However, I expect that what the
referee/editor considers an adequate explanation may leave the author
uninformed. Not only is the author closer to the substance and text of
the paper, and so may be unable to perceive its lack of clarity, but the
referee/editor is also far more familiar with *writing* mathematics by
virtue of his job. And clear writing, alas, is hard to learn.

So when the editor says "You need to put in a lot more detail in the
discussion in section 3" the author may be left at sea as to why.
Every sentence of section 3 makes sense to *him*, and is clearly
deducible from what comes before!

If the author cannot understand what changes are requested or why they
would be useful, he should contact the editor. Indeed, I expect that
this would occur naturally to the author, since getting the paper
published is in his interest.

In regard to the pecking order, by virtue of the roles that the people
are taking in this transaction, yes, the editor is in a superior
position w.r.t. the author.

In a sense, this is a dictatorship: the editor is the dictator of the
journal. The proper solution is to have a sufficient number of journals
that no field is stifled by the dictatorship of one editor. The other
solution would be to add a complex oversight mechanism to supervise the
editor. But unlike a government, we do not have the money to pay for
that.

Dale
--
Dale Worley wor...@theworld.com

ArtflDodgr

unread,
Oct 21, 2002, 10:35:37 PM10/21/02
to
In article <ap0a0b$l9c$1...@glue.ucr.edu>,
Toby Bartels <toby...@math.ucr.edu> wrote:

By which you mean communication of the latest results. But this has
been true for as long as preprints have circulated. The delivery system
is much sleeker now, and you no longer need to be on the mailing lists
of the right people, so long as they post their preprints on the
Internet. Now, as then, journals are long term memory, not short term.

--
A.

Greg Kuperberg

unread,
Oct 22, 2002, 12:32:54 AM10/22/02
to
In article <artfldodgr-E944B...@news.west.cox.net>,

No, Toby has it exactly right in those areas of mathematics that have
adopted the arXiv. The arXiv is permanent. The only sense in which
it is "the latest results" is that it is only 11 years old (or 5 in
mathematics in its comprehensive form). In these areas, the "memory"
function of journals is an inefficient and disorganized alternative.
The remaining role of journals is peer review (for which purpose
they could be made much better, as I keep saying).

But it is too sweeping to say that "mathematicians" instead use the arXiv.
Many do; most don't.

Hans Aberg

unread,
Oct 22, 2002, 7:01:31 AM10/22/02
to
In article <ypv7kgb...@shell01.TheWorld.com>, Dale R Worley
<wor...@shell01.theworld.com> wrote:

>If done correctly, the reviewer/editor should explain clearly to the
>authors why the revisions are needed.

One would expect editors/referees who have complaints to be able to
spell out the reasons in a manner such that the author can make use of
the feedback.

> However, I expect that what the
>referee/editor considers an adequate explanation may leave the author
>uninformed. Not only is the author closer to the substance and text
>of the paper and may be unable to perceive its lack of clarity of
>explanation, the referee/editor is far more familiar with *writing*
>mathematics by virtue of his job. And clear writing, alas, is hard to
>learn.

It is important that the rules about what can be complained about and
what cannot are spelled out, along with the category each complaint
belongs to: scientific, style, etc.

>If the author cannot understand what changes are requested or why they
>would be useful, he should contact the editor. Indeed, I expect that
>this would occur naturally to the author, since getting the paper
>published is in his interest.

Don't bite the hand that feeds you: if the paper is to be published, the
editor is viewed as important, and since the author has no similar
position of power, the editor can ask for pretty much whatever he wants.

This position of power can be abused; and if one knows something about
humanity, one will understand that there will also be cases where this
happens: there is really no need to work through special cases in order
to understand that such things will happen.

>In regard to the pecking order, by virtue of the roles that the people
>are taking in this transaction, yes, the editor is in a superior
>position w.r.t. the author.
>
>In a sense, this is a dictatorship, the editor is the dictator of the
>journal. The proper solution is to have a sufficient number of
>journals that no field is stifled by the dictatorship of one editor.
>The other solution would be to add a complex oversight mechanism to
>supervise the editor. But unlike a government, we do not have the
>money to pay for that.

The best way is to provide alternatives: if there is competition between
journals, that breaks down the power structure that may be tempting to
abuse. If there are just a few journals that authors for some reason feel
compelled to publish in, without an effective control system, one should
expect there to be a lot of abuse of power.

Hans Aberg

unread,
Oct 22, 2002, 7:01:25 AM10/22/02
to
In article <ap0a0b$l9c$1...@glue.ucr.edu>, Toby Bartels
<toby...@math.ucr.edu> wrote:

>>>The idea of publishing papers as a form of communication has ceased.
>
>>That is hyperbole.
>
>I'm not interested in defending Hans' post in general,

Why not. :-)

>but this much is IMO exactly right.
>
>Mathematicians no longer read journals for the latest results
>but instead read about them months earlier on the arXiv.
>The only reason to get an arXived paper published
>is to get people to take your work more seriously
>(so that they'll give you tenure, cite your paper, etc).
>But journals are no longer used for *communication*.

I merely noted that this was already the case at the top level in the
early eighties, when I happened to observe it: people used mostly
preprints, and described how important it was to be able to speak and
describe math over the telephone in order to quickly get hold of the
latest information. This usage clearly dated back even earlier, at least
a few decades.

There is nothing wrong in scientists trying to get hold of the latest
information.

There are two problems though:

One is problem can be studied viewing science as a market: If the system
is such that it favours some actors on the market to get hold of the
resources needed more quickly than others, then this is in effect a market
with limited competition which hurts quality and development. The market
system as such should be redesigned so that this effect is diminished.

One way to achieve this effect is that papers are quickly put into the
arXive. I think that one must achieve some fairly collective agreement, so
that such archiving does not interfere with the conception of publishers
and others about copyright.

One should also be aware that the losers under such a system are those
who now rely heavily on preprints circulated out of general public view,
some of whom are among the most influential in math; this gives an idea
of who might oppose it. It becomes unnecessary to send preprints to
anyone, including those who now get a lot in the mail, so this opens up
and flattens the whole science system.

The other question is the moral/ethical/legal issue: mathematicians are
no better here than, say, physicists, who are no better than, say, CEOs
or the general public. But for some reason, mathematicians seem to
pretend that everybody in the field is one hundred percent upright, which
is wholly unlikely, because it would probably mean that they were not
human.

In the rest of society, one has some conception of morals/ethics and
spells this out in the form of laws which can be enforced effectively
when needed. The reason for doing so, using the market-economy picture
above, is to ensure that the market does not become even more lopsided in
favor of a few, which would hurt quality and development even more.

I just watched a program about the development of the stock market, which
was wholly unregulated until the crash in the 1920s. The politicians
then started to develop legislation, which was of course heavily opposed
by those who benefited from an unregulated stock market -- one of those
who opposed such legislation the most was eventually put in jail for his
role in tampering with stocks.

-- I do not see why there should be an absence of such a system in math or
science in general.

It would be good if one spelled out the rules for what constitutes fair
scientific publishing, and if there were a body that could review whether
those rules are followed. If there is to be a concept of a "scientific
journal", there ought to be a way to determine when a journal is not
scientific.

ArtflDodgr

unread,
Oct 22, 2002, 1:08:06 PM10/22/02
to
In article <ap2kdm$37h$1...@conifold.math.ucdavis.edu>,
gr...@conifold.math.ucdavis.edu (Greg Kuperberg) wrote:

Do you have an idea what fraction of the papers posted in the arXiv
later appear in refereed journals?

Also, what safeguards are in place to ensure the permanence of the
arXiv? Twenty years ago we were TeXless; who knows what we'll be using
twenty years from now. How is "backward compatibility" going to be
handled?

--
A.

Hans Aberg

unread,
Oct 22, 2002, 1:44:29 PM10/22/02
to
In article <ap2kdm$37h$1...@conifold.math.ucdavis.edu>,
gr...@conifold.math.ucdavis.edu (Greg Kuperberg) wrote:

>>Now, as then, journals are long term memory, not short term.
>
>No, Toby has it exactly right in those areas of mathematics that have
>adopted the arXiv. The arXiv is permanent. The only sense in which
>it is "the latest results" is that it is only 11 years old (or 5 in
>mathematics in its comprehensive form). In these areas, the "memory"
>function of journals is an inefficient and disorganized alternative.
>The remaining role of journals is peer review (for which purpose
>they could be made much better, as I keep saying).

One should be aware that all the physical media -- paper, photographs,
electronics, etc. -- that we use to store information are volatile and
thus will be destroyed after a while. Digital electronic media decay
fairly quickly (an estimated half century for CDs), but the advantage is
that the information itself can easily be duplicated onto younger
physical media.

Thus, in mathematics, because there is high traditional value in printed
journals, people tend to think of them as permanent, which in reality
they are not, as the paper they are printed on is perishable. And it will
be expensive to transfer them to new copies: the best way to preserve all
these old printed texts is probably to make digital photos so that they
can be stored in computers. But that, too, is expensive relative to
merely filing a paper already available in an electronic format.

So, if you want to write papers for eternity, it is probably safest to
put them in the arXiv, where duplicates and backup copies can easily be
made.

Hans Aberg

unread,
Oct 22, 2002, 1:50:52 PM10/22/02
to
In article <artfldodgr-6EC4F...@news.fu-berlin.de>,
ArtflDodgr <artfl...@aol.com> wrote:

>Also, what safeguards are in place to ensure the permanence of the
>arXiv?

Good question. Also, is the arXiv geographically safe, so that the stuff
will be preserved if a meteorite or something destroys its computer?

> Twenty years ago we were TeXless; who knows what we'll be using
>twenty years from now. How is "backward compatibility" going to be
>handled?

This is not going to be a big problem, as one can easily keep the old
programs that generate the printed texts. And if needed, one can fairly
easily write new computer programs converting old formats to new ones.

Greg Kuperberg

unread,
Oct 22, 2002, 4:09:34 PM10/22/02
to
>Do you have an idea what fraction of the papers posted in the arXiv
>later appear in refereed journals?

Here is what I said about it a year ago in a mailing list:

The SLAC SPIRES service at Stanford supplies [publication data]
to the arXiv in high-energy physics. For example, if you look at
the first 100 articles in the hep-th archive in December 1998,

http://arxiv.org/list/hep-th/9812

you will see that 77 of the 100 have journal-ref fields. I just did a
cursory review of the other 23. At least 4 of these are published in
journals but were missed by SPIRES. Of the other 19, 11 are labelled
by the authors as conference proceedings or invited lectures, and 2
are Ph.D. theses. Thus at least 94 of the 100 have been blessed by
some form of peer review.

It's harder to check this for math arXiv, but I'm sure that the picture
is similar.
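
As an aside, the tallying itself is trivial once the metadata is at
hand; a toy sketch in Python, with made-up records, of the kind of count
done above:

    # Estimate the fraction of papers carrying a journal-ref field.
    records = [
        {'id': 'hep-th/9812001', 'journal-ref': 'Nucl. Phys. B'},
        {'id': 'hep-th/9812002'},
        {'id': 'hep-th/9812003', 'journal-ref': 'JHEP'},
    ]
    published = sum(1 for r in records if 'journal-ref' in r)
    print("%d of %d have journal-refs" % (published, len(records)))

The hard part, as the SPIRES gaps show, is getting reliable journal-ref
data in the first place.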

Besides, some of the few unpublished arXiv papers may well be just as
good as papers that are published. One of the most important papers ever
in quantum algebra is q-alg/9709040, by Maxim Kontsevich. It's not
published.

>Also, what safeguards are in place to ensure the permanence of the
>arXiv? Twenty years ago we were TeXless; who knows what we'll be using
>twenty years from now. How is "backward compatibility" going to be
>handled?

Necessity is the mother of invention. The arXiv has 200,000 papers,
including 20,000 in mathematics, and on the order of 50,000 readers.
The community is not just going to let this vital literature fade away.
If a new format comes along, the arXiv will be pressed to adopt it long
before the old format disappears. For instance PDF came along and the
arXiv adopted it. PDF is supposed to replace Postscript and it might
be replacing Postscript. If one day people no longer care to read the
Postscript, then the arXiv might no longer generate it. Since it is
an open system, any institution that wants the papers can step in to
provide service.

The question you should ask is what safeguards are in place to ensure
the permanence of published papers. That they are printed? That is
disappearing. That the publishers have said so? Since their data is
proprietary, there is no public oversight. That the publishers have
a financial incentive? If the financial underpinning ever disappears,
which it might well, then there will be no safeguards left. If they feel
like it, they could just keep copyright and refuse to distribute the papers.
I don't promise that they will be that nasty, but that is exactly what
many publishers do with out-of-print and unpublished books.
They sit on these books just in case they can make some money out of them
in the future.

Ilya Zakharevich

unread,
Oct 22, 2002, 7:22:59 PM10/22/02
to
[A complimentary Cc of this posting was sent to
Hans Aberg
<remove...@matematik.su.se>], who wrote in article <remove.haberg-2...@du130-226.ppp.su-anst.tninet.se>:

> > Twenty years ago we were TeXless; who knows what we'll be using
> >twenty years from now. How is "backward compatibility" going to be
> >handled?

Yes, this is a very serious problem, and I'm afraid that the
maintainers of the archive may have a little too rosy an impression
of how simple it is.

> This is not going to be a big problem, as one can easily keep the old
> programs that generates the printed texts.

Anybody who has worked with old programs knows that it does not work as
simply as that -- if at all (on a time span longer than 5-10 years).
Programs rot: they assume features of hardware that are no longer
available; they assume particular compilers that are no longer available;
etc.

But the problem of preserving the archive has nevertheless a cheap and
easy solution:

a) Convert those texts in the archive which are still TeXable with
the current version of (La)TeX to Postscript *now*; keep the
postscript for permanent storage - together with the source of
the paper.

    [The claims of backward compatibility of LaTeX upgrades are
    grossly overstated. But most of the papers will use only simple
    constructs, and thus will have a very good chance of being
    convertible as is. Those which need older versions of the programs
    may require manual intervention -- but most of them will again work
    as long as one uses a (I hope, preserved somewhere) setup from
    several years ago.]

 b) Postscript, being frozen, has a much better chance of surviving
    5-10 more years; by that time, disk space will be dirt cheap.
    At that point, store 600dpi (or maybe even 1200dpi) *bitmaps* of
    the pages in some format which is

    1) widely published (better on paper, to be extra sure),

    2) *really trivial* to convert to a printable form in any
       programming language (to ensure forward compatibility with a
       possible dumbing-down of programming languages).

> And if needed, one can fairly
> easily write new computer programs, converting old formats to new ones.

Depends on the old format. One cannot "fairly easily" write a program
which converts from PostScript to GIF... But ghostscript should survive
for 5-10 years, and later part 'b' will take care of everything.
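
A sketch of step 'a' plus the later bitmap fallback, as it might look
today (Python driving the usual command-line tools; it assumes latex,
dvips and ghostscript are on the PATH, and the flags and filenames are
illustrative, not a recipe):

    import subprocess, pathlib

    def archive_copy(tex_path):
        stem = pathlib.Path(tex_path).stem
        subprocess.run(["latex", tex_path], check=True)      # .tex -> .dvi
        subprocess.run(["dvips", stem + ".dvi",
                        "-o", stem + ".ps"], check=True)     # .dvi -> .ps
        # frozen-format fallback: 600dpi page bitmaps via ghostscript
        subprocess.run(["gs", "-dBATCH", "-dNOPAUSE", "-r600",
                        "-sDEVICE=pnggray",
                        "-sOutputFile=" + stem + "-%03d.png",
                        stem + ".ps"], check=True)

    archive_copy("paper.tex")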

Hope this helps,
Ilya


P.S. If the current funding of the archive makes "a" prohibitive now,
one should at least keep the non-upgraded software/hardware
combination for as long as possible - until "a" becomes possible.

Dale R Worley

unread,
Oct 22, 2002, 11:31:38 PM10/22/02
to
gr...@conifold.math.ucdavis.edu (Greg Kuperberg) writes:
> The question you should ask is what safeguards are in place to ensure
> the permanence of published papers.

The multiple copies on high-quality paper stored in a thousand
libraries. (Of course, to the degree that traditional journals now
exist only in electronic form, that protection is fading.)

The arXiv should print copies on some paper of known durability, to be
stored in a few cooperating libraries scattered around the world. It
could probably be done at a very reasonable cost.

Dale

Hans Aberg

unread,
Oct 23, 2002, 6:03:10 AM10/23/02
to
In article <ap4mkj$ppe$1...@agate.berkeley.edu>, Ilya Zakharevich
<nospam...@ilyaz.org> wrote:

>[A complimentary Cc of this posting was sent to
>Hans Aberg
><remove...@matematik.su.se>]

Please do not send me complimentary email, as I am on this list.

, who wrote in article
<remove.haberg-2...@du130-226.ppp.su-anst.tninet.se>:
>> > Twenty years ago we were TeXless; who knows what we'll be using
>> >twenty years from now. How is "backward compatibility" going to be
>> >handled?

I did not write this; someone else did. Please be careful with quoting.

>> This is not going to be a big problem, as one can easily keep the old
>> programs that generates the printed texts.
>
>Anybody who has worked with old programs knows that this does not work
>as simple as that - if any (on a time span larger than 5-10 year).
>Programs rot: they assume some features of hardware not available
>anymore; they assume some particular compilers not available anymore etc.

I think it would go too far to discuss here how to update it, but one
main component is of course that humans remain around who bother to do it.

Otherwise, TeX itself is not changing, there are variants of C, and C
compilers will be around for quite some time. When computers become more
powerful, it is easy to make emulators of old ones, so one can run old
programs in such an emulation mode as well, if really needed.

>But the problem of preserving the archive has nevertheless a cheap and
>easy solution:
>
> a) Convert those texts in the archive which are still TeXable with
> the current version of (La)TeX to Postscript *now*; keep the
> postscript for permanent storage - together with the source of
> the paper.

And, as you suggest, one can convert to other formats if one so likes.
-- But perhaps PDF will be preferred over PS.

Hans Aberg

unread,
Oct 23, 2002, 6:03:25 AM10/23/02
to
In article <ypv3cqx...@shell01.TheWorld.com>, Dale R Worley
<wor...@shell01.TheWorld.com> wrote:

>> The question you should ask is what safeguards are in place to ensure
>> the permanence of published papers.
>
>The multiple copies on high quality paper stored in a thousand
>libraries.

Printed paper is not as permanent as one would think, because modern
paper may contain acids (not the kind one gets high on!) which break it
down. So modern books will probably not last as long as older books.

Also, both maintaining a library and storing it are very expensive
relative to storing the same amount of information electronically. Books
are best stored at even temperature and humidity, and that adds
additional costs to storage.

And there is a principle in computer science that computer memory doubles
every 12 to 18 months at about the same cost (even lowering a bit).
Lately it has been 12 months, and ten doublings make a factor of about
1000 (2^10 = 1024) in a decade. Curiously enough, we already have the
technology for this development to continue for decades.

So in a few decades' time, you can have all the world's literature ever
produced on your own computer. :-)

> (Of course, to the degree that traditional journals now
>exist only in electronic form, that protection is fading.)
>
>arXive should be printing copies in some paper form of known
>durability and stored in a few cooperating libraries scattered around
>the world. It could probably be done at a very reasonable cost.

Any attempt to transfer it all to paper will be prohibitively expensive.
Also, paper formats are not very usable: it takes time to dig a volume
out of the library, and the text is not electronically searchable,
whereas in computer formats that can be done automatically at low cost.

What people do in order to ensure permanence is electronic backups at
safe locations, say a mountain room built to withstand atom bomb hits.
Businesses already do such safe offsite backups: I saw that such a site
opened here in Sweden, and I think there are such sites at some locations
in the US as well. The cost of setting up such a site is not very high,
as one makes use of places already available, and merely needs to draw in
a cable and put in some computers.

This is very undramatic: one merely has Internet access to the site, and
the computer system does the backup automatically. You can then do
backups from, say, the US to Sweden or vice versa.

One should note that apart from making backups, it is a common technique
to make mirrors: a backup which can also be used directly. In this model,
a Swedish mirror could be used by Swedes or Europeans instead of fetching
the data from the US.

This is done in order to decrease long-distance network traffic. In the
case of the arXiv, I figure the traffic is not so high, so that model
might not be needed. It is used for CTAN (the Comprehensive TeX Archive
Network), though.

One other advantage of this model is that if the originating site for
some reason goes down, one can still use a mirror site; the mirrors will
merely not be updated until it comes up again.

So, summing up, there are many techniques already available that might be
used to ensure the permanence and reliability of an archive like the
arXiv:

There are really no practical problems in ensuring the permanence of the
arXiv; the question is whether one has done so.
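
The mirroring itself is nearly a one-command affair with today's tools.
A minimal sketch, where the host, module and local path are made up and
rsync is just one common choice:

    # Pull down only what changed; delete local files the master dropped.
    import subprocess
    subprocess.run(["rsync", "-avz", "--delete",
                    "mirror.example.org::arxiv/", "/srv/arxiv-mirror/"],
                   check=True)

Run periodically from cron, that is essentially all a mirror is.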

George Russell

unread,
Oct 23, 2002, 7:45:57 AM10/23/02
to
Ilya Zakharevich wrote:
[snip]

> > > Twenty years ago we were TeXless; who knows what we'll be using
> > >twenty years from now. How is "backward compatibility" going to be
> > >handled?
>
> Yes, this is a very serious problem, and I'm afraid that the
> maintainers of the archive may have a little bit too rosy impression
> about the simplicity of this.

Although I appreciate that the problem has to be taken seriously, it is
not insuperable. I have here an ordinary desktop Unix/Linux computer.
This computer has eqn/tbl/roff, like most other Unix/Linux computers.
eqn is TeX's predecessor and so is 20 years (or more) behind the times.
Yet it survives, mainly (I suppose) because roff is still used for
manual pages, and once you've got roff you might as well implement eqn.

>
> > This is not going to be a big problem, as one can easily keep the old
> > programs that generates the printed texts.
>
> Anybody who has worked with old programs knows that this does not work
> as simple as that - if any (on a time span larger than 5-10 year).
> Programs rot: they assume some features of hardware not available
> anymore; they assume some particular compilers not available anymore etc.

I think it depends on the program. The key component in modern
mathematical typesetting is TeX. TeX is written in something called WEB,
but WEB itself is simply a preprocessed version of Pascal. Pascal is
itself a very restricted language, but TeX deliberately confines itself
to an even more restricted subset. It does use a very few
extensions to the language which are provided by almost every Pascal
compiler (for example, it needs a way of opening a file by name).
I think that writing a Pascal implementation today sufficient
to run TeX, starting from any reasonable high-level language you like,
would take a competent programmer a year or possibly much less.
(If it had to run fast it would take longer of course.) The same
will probably be true in the year 3000 AD, provided there are still
programmers who know enough English to read the standard.
This means you could get TeX running in that time. You could also
compile TeX fonts (using METAFONT) since that uses the same dialect of
Pascal. Your remaining problem would be to convert TeX's DVI files and
METAFONT's GF and TFM files (containing the TeX fonts) into retinal
displays or whatever they are using then, but this again should not be
too difficult. (Equivalent projects have been completed by talented
undergraduates.) So I'm not pretending you can do it with the wave of a
magic wand, but altogether it's not hard, and if there is any interest at
all in 3000 AD in what mathematicians now are doing, it will happen.
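
To give a sense of how approachable DVI is, here is a sketch that reads
a DVI file's preamble (the opcode and the standard num/den constants are
from Knuth's published DVI specification; the rest is illustrative):

    import struct

    def read_dvi_preamble(path):
        with open(path, "rb") as f:
            opcode, version = f.read(1)[0], f.read(1)[0]
            assert opcode == 247, "not a DVI file"    # 247 = "pre"
            num, den, mag = struct.unpack(">III", f.read(12))
            k = f.read(1)[0]                          # comment length
            comment = f.read(k).decode("ascii", "replace")
        return version, num, den, mag, comment

    # On a typical file this returns something like
    # (2, 25400000, 473628672, 1000, ' TeX output ...').

The rest of the format is a similar stream of one-byte opcodes with
fixed-size arguments -- tedious, but nothing a future programmer could
not reimplement from the specification.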

[snip]


> a) Convert those texts in the archive which are still TeXable with
> the current version of (La)TeX to Postscript *now*; keep the
> postscript for permanent storage - together with the source of
> the paper.

Postscript has a number of serious disadvantages in my opinion. In
particular it has changed quite a lot in the last few decades, and
good public-domain interpreters do not seem to exist.

>
> [The claims of backward compatibility of LaTeX upgrades are
> grossly overstated. But most of the papers will use only simple
> constructs, thus will have a very good chance of be convertable
> as is. Those which needs the older versions of programs, may
> need to require a manual intervention - but most of them will
> again work as far as one uses a (I hope, preserved somewhere)
> setup of several years ago].

The problem of backward incompatibility of LaTeX upgrades is easily
addressed by keeping old copies of the LaTeX sources. Since LaTeX is
written in TeX, you will be able to run them once you have TeX going.


>
> b) Postscript, being frozen, has a much better chance to survive
> 5-10 more years; at this moment, disk space becomes dirt cheap.
> At this moment, store 600dpi (or maybe even 1200dpi) *bitmaps* of
> the pages in some format which is
>
> 1) widely published (better on paper, to be extra sure),
>
> 2) *really trivial* to convert to a printable form in any
> programming language (to ensure forward compatibility with
> possible dumbing down of programming languages).

I don't really see any point now in making bitmaps of all the LaTeX
sources. LaTeX and TeX are not going to suddenly vanish without warning.
At worst, their use will gradually dwindle.
[snip]

Hans Aberg

unread,
Oct 23, 2002, 1:34:29 PM10/23/02
to
In article <3DB68BF5...@tzi.de>, George Russell <g...@tzi.de> wrote:

>I think it depends on the program. The key component in modern
>mathematical typesetting is TeX. TeX is written in something called WEB,
>but WEB itself is simply a preprocessed version of Pascal.

Even though the original TeX WEB sources expand to Pascal, somebody made
a Web2C translator, thus making TeX compilable under C. -- I would think
that most current TeX implementations rely on that, as good Pascal
compilers are starting to become scarce. (So this is one example of how
to update computer programs.)

C is a language originally developed to help implement the operating
system UNIX (it is the successor of a language called B). In effect, most
computers from PCs and up are switching to something like UNIX --
certainly Macs have it nowadays (a version called BSD). There is even a
UNIX unification project. There is a new C standard called C99.

So one can be pretty sure that as far as the original TeX is concerned, it
will last for a long time without changes. (When Knuth dies, they say his
wish is that the version number of TeX should be changed to pi, and bugs
should be renamed to be features. :-) )

TeX is becoming outdated for another reason, though: the emergence of
Unicode.

Unicode has been extended to contain all the math characters available
via TeX. Unicode even has mathematical semantic character styles
(actually at my suggestion, I think).

Unicode makes use of 21-bit characters in its current form, which is also
expected to be final. Original TeX used only 7-bit characters, and was
later extended to what an expert called "7 bit that pretends to be 8
bit". And current TeX is not at all suited to coping with Unicode:

For example, this showed up as a problem in the development of LaTeX3.
The current LaTeX2e approach with input encodings seemed to take a lot of
time to develop, stealing time from proper typesetting development. TeX
is not well suited to dealing with multiple input encodings (I skip the
details). Perhaps LaTeX3 is better off being based on a TeX successor.

So if one should move ahead with this stuff, there seems to be the need
for some kind of universal TeX successor that can cope both with Unicode
and the multilingual problems connected with that. -- There are several
candidates.

So there probably will be some TeX successor for such reasons.
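
To see concretely why 16 bits is not enough for math, a small sketch
(the code point is from Unicode's published Mathematical Alphanumeric
Symbols block, which sits entirely above U+FFFF):

    # MATHEMATICAL SCRIPT CAPITAL A lives at U+1D49C -- beyond any
    # 16-bit character range, so UTF-16 needs a surrogate pair for it.
    script_A = chr(0x1D49C)
    print(hex(ord(script_A)))                 # 0x1d49c: needs 17 bits
    print(script_A.encode("utf-8"))           # b'\xf0\x9d\x92\x9c'
    print(len(script_A.encode("utf-16-be")))  # 4 bytes: surrogate pair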

Dale R Worley

unread,
Oct 23, 2002, 1:59:53 PM10/23/02
to
remove...@matematik.su.se (Hans Aberg) writes:
> Printed paper is not as permanent as one would think, because modern paper
> may contain acids (not the kind one gets high on!) which breaks it down.
> So modern books will probably not last as long as older books.

Indeed. One has to use the de-acidified "archival paper". Or better,
use 100% rag paper. Actually, the thing to do is find the oldest
intact paper books and use whatever they are made of.

> What people do in order to ensure permanence is to do electronic backups
> on safe locations, say at a mountain room built to be safe against atom
> bomb hits.

The problem with these strategies is that they depend on continuous
maintenance. Many computer media have lifetimes measured in decades, and
many would require a lot of specialized hardware to be reconstructed in
order to read the medium. Compare this to books, where one can find and
read (immediately!) a book that is 1,000 years old in the library of a
monastery.

Dale

David Madore

unread,
Oct 23, 2002, 7:02:17 PM10/23/02
to
Hans Aberg in litteris
<remove.haberg-2...@du128-226.ppp.su-anst.tninet.se>
scripsit:

> TeX is outdating for another reason, though, the emergence of Unicode:
>
> Unicode has been extended to contain all the math characters available via
> TeX. Unicode even has mathematical semantics character styles (actually by
> my suggestion, I think).
>
> Unicode makes use of 21-bit characters in its current form, which is also
> expected to be final. Original TeX used only 7 bit characters, and was
> later extended to what an expert called "7 bit that pretends to be 8 bit".
> And current TeX is not at all suited for coping with Unicode:

There is something called Omega, which smells like TeX, tastes like
TeX and works very much like TeX, in fact, it comes from mostly the
same sources, but it has been extended to use 16 bits, if not full 31,
for characters, and to be programmable using finite state machines for
input. This means it can handle Unicode (or at least the basic
multilingual plane) as well as the fonts will allow it. There is also
Lambda, which is LaTeX-over-Omega. Many standard TeX distributions,
such as teTeX (which comes, for example, with RedHat Linux) include
Omega and Lambda.

Omega's main drawback, however, is its almost complete lack of
documentation. Still, it does not seem unreasonable to think that it can
be made to work, if not now then at least in the foreseeable future, to
mix, say, English, Russian, Greek and Japanese with mathematical formulae
all in the same document, without going through the crazy hacks of Babel.

Personally I would rather see TeX die and MathML replace it (with XSLT
to perform the automated preprocessing). MathML, being based on XML,
fully allows the use of Unicode on those systems that support it. It
remains, however, a terrible pain to write by hand, especially if
attention is paid to page formatting (through CSS or XSL).

As for the risk of us not being able to read (La)TeX documents in the
future, I think the risk mainly comes from the fonts rather than the
documents themselves. TeX is pretty solid and stable and will not
bit-rot further than it already has, and the Computer Modern fonts are
also fixed, but more unusual fonts could cause a problem, I believe.
Even in PS or PDF files in which they are supposedly embedded, I have
seen strange problems of all sorts (mainly because MetaFont produces
bitmap fonts and not vector PostScript Type1 fonts).

MathML has the advantage over TeX that it is a format and not a
program: formats tend to bit-rot much slower than programs, especially
when, as is the case of MathML, they have well-written and readable
standards and documentations.

--
David A. Madore
(david....@ens.fr,
http://www.eleves.ens.fr:8080/home/madore/ )

David L. Johnson

unread,
Oct 23, 2002, 8:10:08 PM10/23/02
to
George Russell wrote:

> Although I appreciate that the problem has to be taken seriously, it is
> not insuperable.

Over the past 18 years, I have used everything from Jim Milgram's
Tekprint (sp?) and Volkswriter Scientific to a few others and TeX. I have
old papers and fragmentary results in some of these old formats that are
unrecoverable -- at least by me. The oldest stuff is salvageable,
primarily because it was a marked-up ASCII format, so at least the text,
and some of the equations, could be coaxed out. This is less true of the
more proprietary file formats, which I believe are to be avoided at all
costs.

Any mathematical typesetting program which saves files in marked-up
ASCII will be recoverable -- with some work -- for quite some time. But
even that is limited. Vector-graphics formats like PostScript will be
totally
dependent upon the availability of specialized reading software, and as such
are no guarantee of long-term readability. Even some postscript recently
produced causes problems for some postscript readers [Microsoft-produced
postscript has been unreadable on non-MS readers such as ghostscript]. PDF
is much worse, since it depends upon specialized and proprietary compression
software. Many articles are produced in Word, which seems to have a file
format lifetime of only a couple years. While they claim backward
compatibility, it will be a challenge to read 20-year old Word documents in
the future, and impossible to run the old software that produced them except
on an equally old machine, which will be unrepairable.

There is an interesting article about data archiving in the latest
Technology Review. The author is a bit alarmist about the timeframe, but
has some interesting points.

Ronald Bruck

unread,
Oct 23, 2002, 10:00:17 PM10/23/02
to
[[ This message was both posted and mailed: see
the "To," "Cc," and "Newsgroups" headers for details. ]]

In article <ap79pp$1gc9$1.re...@nef.ens.fr>, David Madore
<david....@ens.fr> wrote:
...


>
> MathML has the advantage over TeX that it is a format and not a
> program: formats tend to bit-rot much slower than programs, especially
> when, as is the case of MathML, they have well-written and readable
> standards and documentations.

Hmmm, yes, one would think that formats bit-rot more slowly.

But I have some spreadsheets from the days of CP/M, written by one of
the leading spreadsheet programs of that day, which are now totally
unreadable. Of course, in those days the number of personal computers
was numbered in the thousands, if not mere hundreds, instead of
millions.

And dvi is a format too, isn't it? One whose "device-independence" is
a wry joke, if you don't have the fonts.

--Ron Bruck

Eric Behr

unread,
Oct 23, 2002, 10:09:10 PM10/23/02
to
In article <ap79pp$1gc9$1.re...@nef.ens.fr>,
David Madore <david....@ens.fr> wrote:
>
>Personally I would rather see TeX die and MathML replace it (with XSLT
>preprocessing to perform the automated preprocessing).

Oh, no, please, NO!

>As for the risk of us not being able to read (La)TeX documents in the
>future, I think the risk mainly comes from the fonts rather than the
>documents themselves. TeX is pretty solid and stable and will not
>bit-rot further than it already has, and the Computer Modern fonts are
>also fixed, but more unusual fonts could cause a problem, I believe.

[parenthetically, how has TeX "bit-rotten" already?]

I wouldn't blame the program where the formats (rendering of symbols in
a given font etc.) are at fault. The only significant snags I've seen
where TeX is concerned had to do with styles; these are quite arbitrary
renderings of meaning to begin with.

One of the major strengths of TeX is, in my opinion, that anyone with
the ability to read ASCII files, a modest command of English, and some
degree of mathematical experience will be able to comprehend
\sum_{n=1}^{\infty} a_n
even when all the fonts, LaTeX implementations, and DVI previewers die
out completely, and even if he hasn't heard of TeX before. I doubt the
same holds for MathML. I find even slightly complicated MathML source
code impossible for a human to deal with. When you remove the
more-or-less direct link between the meaning and the code, you introduce
a new layer of complexity and indirection, which is precisely the problem
in "bit-rot" and long-term archiving.

I claim that in 2087 even a total computer ignoramus will be able to
understand, perhaps with some effort and wrong margins, a TeX or LaTeX
manuscript produced today. Barring some disaster in which ASCII and C
(or its successors) and PostScript (or its successors) disappear from
the face of the Earth, he will even be able to get a good printout with
correct margins. So when it comes to permanence, I don't see what is
broken and needs fixing.

>MathML has the advantage over TeX that it is a format and not a
>program: formats tend to bit-rot much slower than programs, especially
>when, as is the case of MathML, they have well-written and readable
>standards and documentations.

I have a 20 year old program (vi) that isn't very well documented, but
reads and renders ASCII accurately. I'm going through some of my old but
important personal Word 4, Word 6, Clarisworks, Appleworks files and I'm
simply rescuing what I can by converting the information that matters to
ASCII, because that has been my only faithful friend. I know that the next
iteration of my "personal productivity" [sic!] software will make these
old files pretty much useless without expensive converters.

In this context the "formats die slower" remark sounds suspicious. Yes,
they die more slowly than abandoned programs, as long as these formats
are based on _accepted, commonly accessible lowest common denominators,
and convey the meaning in a straightforward way_. In the reality I see,
and the near future I can predict, MathML and/or Unicode don't qualify.

--
Eric Behr | NIU Mathematical Sciences | (815) 753 6727
be...@math.niu.edu | http://www.math.niu.edu/~behr/ | fax: 753 1112

Hans Aberg

unread,
Oct 24, 2002, 5:10:57 AM10/24/02
to
In article <ap79pp$1gc9$1.re...@nef.ens.fr>, david....@ens.fr (David
Madore) wrote:

>There is something called Omega, which smells like TeX, tastes like
>TeX and works very much like TeX, in fact, it comes from mostly the
>same sources, but it has been extended to use 16 bits, if not full 31,
>for characters, and to be programmable using finite state machines for
>input. This means it can handle Unicode (or at least the basic
>multilingual plane) as well as the fonts will allow it. There is also
>Lambda, which is LaTeX-over-Omega. Many standard TeX distributions,
>such as teTeX (which comes, for example, with RedHat Linux) include
>Omega and Lambda.

I didn't go into the details of successors, because that is discussed
extensively at other places. -- I followed some of it in the LaTeX3
development list, but there are other places where TeX and specific
successors are discussed.

Just some remarks that may be of interest to mathematicians:

Omega is a TeX successor mainly geared toward multilingual support.

Frank Mittelbach, one of the main developers of LaTeX over the past couple
of years, had some complaints over using Omega straight off as a
successor: One reason was that it was not stable enough, I think.

I think that Omega may only be 16-bit, which would be insufficient for
full Unicode in general and for math in particular: the Unicode
mathematical semantic styles all lie outside the 16-bit range. (Speaking
about n-bit Unicode is really an abuse of language: Unicode only assigns
numbers to symbols. When one squeezes these numbers into n-bit computer
representations, one gets various encodings, like UTF-n. Unicode is,
however, I am told, now 21-bit, and will never be extended beyond that.
This includes provisions for adding user characters, which may be needed
for specialty purposes.)

Therefore, I felt that the best way forward might be to extract a stable
portion of say Omega and some other successors that would be suitable for
LaTeX3. This might then become the Unicode TeX successor.

-- But I do not know what they in actuality will do. I haven't heard much
about it for some time.

>As for the risk of us not being able to read (La)TeX documents in the
>future, I think the risk mainly comes from the fonts rather than the
>documents themselves. TeX is pretty solid and stable and will not
>bit-rot further than it already has, and the Computer Modern fonts are
>also fixed, but more unusual fonts could cause a problem, I believe.
>Even in PS or PDF files in which they are supposedly embedded, I have
>seen strange problems of all sorts (mainly because MetaFont produces
>bitmap fonts and not vector PostScript Type1 fonts).

I had a discussion with the maintainer of the arXiv a couple of years
ago, and he said then that they are saving all the TeX definition files
and fonts they can get hold of. Thus the stuff needed to compile and read
the papers is in the arXiv, and should be there forever, as long as the
arXiv exists.

However, I think he said they in general only save one version of every
format. This means that articles using older formats may indeed become
outdated and unreadable.

But at that point, he said it had been easy to fix those problems with all
old papers: It is fairly easy to run through the whole archive from time
to time to see if it compiles.

So there are some updating problems, but these seem to be minor, not
major, hurdles.

George Russell

unread,
Oct 24, 2002, 8:17:03 AM10/24/02
to
Ronald Bruck wrote:
[snip]

> Hmmm, yes, one would think that formats bit-rot more slowly.
>
> But I have some spreadsheets from the days of CP/M, written by one of
> the leading spreadsheet programs of that day, which are now totally
> unreadable. Of course, in those days the number of personal computers
> was numbered in the thousands, if not mere hundreds, instead of
> millions.
Hint: open-source software. When you use a closed-source "leading
spreadsheet program" and the company that owns the source stops supporting
it, you're in trouble. TeX would not have this problem.

>
> And dvi is a format too, isn't it? One whose "device-independence" is
> a wry joke, if you don't have the fonts.
You can build your own copies of the TeX fonts, using METAFONT (which
is compilable in the same way TeX is, using pretty well any Pascal
compiler) and their sources. This is a lot for one individual to do,
but it's perfectly reasonable to expect someone to do it in 3000 AD, if
there is much interest at all in mathematical papers written now. It
certainly requires a lot less work than has been put, say, into
maintaining and making available copies of the manuscript of Beowulf.
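
Today that rebuild is close to a one-liner; a sketch (the "mf" command
and the mode name are the conventional ones in current TeX
distributions, and should be treated as assumptions):

    # Regenerate a Computer Modern font from its METAFONT source;
    # produces cmr10.tfm (metrics) plus a GF bitmap such as cmr10.600gf.
    import subprocess
    subprocess.run(["mf", r"\mode=localfont; input cmr10"], check=True)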

George Russell

unread,
Oct 24, 2002, 8:25:03 AM10/24/02
to
"David L. Johnson" wrote:
>
> George Russell wrote:
>
> > Although I appreciate that the problem has to be taken seriously, it is
> > not insuperable.
>
> Over the past 18 years, I have used everything from Jim Milgram's Tekprint
> (sp?), Volkswriter Scientific, a few others, and TeX. I have old papers and
> fragmented results in some of these old formats that are unrecoverable -- at
> least by me. The oldest stuff is salvageable, primarily because it was a
> marked-up ascii format, so at least the text, and some of the equations,
> could be coaxed out. This is less true of the more proprietary file
> formats, which are I believe to be avoided at all costs.
Quite. TeX is not proprietary, and nor is the DVI format it produces
(which is deliberately designed to be fairly simple to interpret).
The only format you really need to understand is the restricted dialect
of standard Pascal TeX relies on, but it would not be difficult to write an
interpreter for this now, and shouldn't be difficult in 1000 years.

[snip]


> Many articles are produced in Word, which seems to have a file
> format lifetime of only a couple years. While they claim backward
> compatibility, it will be a challenge to read 20-year old Word documents in
> the future, and impossible to run the old software that produced them except
> on an equally old machine, which will be unrepairable.

It certainly does not surprise me if Word software is totally hopeless,
not just for archiving.
[snip]

Russell Easterly

unread,
Oct 24, 2002, 3:43:40 PM10/24/02
to

"George Russell" <g...@tzi.de> wrote in message
news:3DB7E4BF...@tzi.de...

> Ronald Bruck wrote:
> [snip]
> > Hmmm, yes, one would think that formats bit-rot more slowly.
> >

This is not really true. I have found that the life expectancy
of a format is directly proportional to how widely adopted it is.

> > But I have some spreadsheets from the days of CP/M, written by one of
> > the leading spreadsheet programs of that day, which are now totally
> > unreadable. Of course, in those days the number of personal computers
> > was numbered in the thousands, if not mere hundreds, instead of
> > millions.
> Hint: open-source software. Where you use a closed-source "leading
> spreadsheet program" and the company that owns the source stops supporting
> it, you're in trouble. TeX would not have this problem.
> >

See above. A popular format will be around for decades.
An unpopular one might be gone in months.

> > And dvi is a format too, isn't it? One whose "device-independence" is
> > a wry joke, if you don't have the fonts.

There is an old computer science joke:
Device independent means it doesn't run on any known device.

> You can build your own copies of the TeX fonts, using METAFONT (which
> is compilable in the same way TeX is, using pretty well any Pascal
> compiler) and their sources. This is a lot for one individual to do,
> but it's perfectly reasonable to expect someone to do it in 3000AD, if
> there is much interest at all in mathematical papers written now. It's
> certainly requires a lot less work than has been put in, say, to
> maintaining and making available copies of the manuscript of Beowulf.

I was involved in a project at the Microsoft Museum. We had copies of
some of the original paper tape code written by Bill Gates for the
Altair microcomputer. We wanted to write an emulator for the Altair. It
took several developers weeks to complete the project. One problem was
trying to make the emulator run as slowly as the original Altair. So,
unless you are Bill Gates, trying to emulate obsolete hardware and
operating systems can be very expensive and time consuming.

I do a lot of "conversions". If I wanted something to last a long time I
would put it into a text file. "Human readable" is better than any machine
only type format. Word documents are machine only documents (until the next
release). XML is human readable, barely. Since XML is being widely adopted,
it may be the best bet for long term compatibility.

Perhaps the best way to preserve your words for posterity is to post them
on Usenet. I believe there will be financial incentives for businesses to
archive Usenet and the accumulated knowledge it represents. A hundred years
from now, people will still be able to "google ur post".


Russell
- 2 many 2 count


RL

unread,
Oct 24, 2002, 7:33:36 PM10/24/02
to
"Russell Easterly" <logi...@attbi.com> writes:

>
> I do a lot of "conversions". If I wanted something to last a long
> time I would put it into a text file. "Human readable" is better
> than any machine only type format. Word documents are machine only
> documents (until the next release). XML is human readable,
> barely. Since XML is being widely adopted, it may be the best bet for
> long term compatibility.
>

The .tex source for LaTeX'd articles is a human-readable text file.
As is the source code for TeX itself (and indeed for the compiler needed
to compile TeX).

Paul R. Chernoff

unread,
Oct 24, 2002, 8:59:06 PM10/24/02
to

I've heard from someone who seems to know what he's talking
about that it's quite dangerous to store data on CDs, as is
more & more the case. It seems that CD data decays significantly
within 25 years.

Old-fashioned rag paper is very good; most reliable -- baked
clay tablets.

--
# Paul R. Chernoff cher...@math.berkeley.edu #
# Department of Mathematics # 3840 #
# University of California "Against stupidity, the gods themselves #
# Berkeley, CA 94720-3840 struggle in vain." -- Schiller #

George Russell

unread,
Oct 25, 2002, 6:22:42 AM10/25/02
to
Russell Easterly wrote:
[snip]

> I was involved in a project at the Microsoft Museum.
> We had copies of some of the original paper tape code written by Bill Gates
> for the Altair microcomputer. We wanted to write an emulator for the Altair.
> It took several developers weeks to complete the project.
> One problem was trying to make the emulator run as slowly as the original
> Altair. So, unless you are Bill Gates, trying to emulate obsolete hardware
> and operating systems can be very expensive and time consuming.
[snip]
Fortunately for TeX you don't need to emulate hardware or operating
systems, just a basic version of Pascal, plus some trivial extensions
to do file IO. If someone would bet me sufficient money I would be
tempted to undertake to get TeX and METAFONT running (slowly) inside a
month starting with some completely different language like Haskell.
Actually doing something with the output (like turning it into the 3000AD
equivalent of PostScript) might be trickier, but shouldn't be harder than
it has been to write any of the (numerous) DVI drivers around today.

George Russell

unread,
Oct 25, 2002, 6:34:23 AM10/25/02
to
"Paul R. Chernoff" wrote:
[snip]

> I've heard from someone who seems to know what he's talking
> about that it's quite dangerous to store data on CDs, as is
> more & more the case. It seems that CD data decays significantly
> within 25 years.
>
> Old-fashioned rag paper is very good; most reliable -- baked
> clay tablets.
[snip]
Yes, any electronic archive needs to be regularly
copied from one medium to another. I did wonder about using rag
paper tape once, but I think that might be impractical, for various
reasons.

Paper also has disadvantages though; it is flammable and expensive to
copy. This is why there aren't many surviving Shakespeare manuscripts,
and one reason why subscriptions to paper journals are hideously
expensive.

I think the best way of ensuring permanence of the arXiv is to mirror
it across several sites and continents (as now happens) so that even if
some political or natural catastrophe should wipe it out in America, it
will still be maintained in Europe.

Hans Aberg

unread,
Oct 25, 2002, 1:29:53 PM10/25/02
to
In article <3DB91B72...@tzi.de>, George Russell <g...@tzi.de> wrote:

>Fortunately for TeX you don't need to emulate hardware or operating
>systems, just a basic version of Pascal, plus some trivial extensions
>to do file IO. If someone would bet me sufficient money I would be
>tempted to undertake to get TeX and METAFONT running (slowly) inside a
>month starting with some completely different language like Haskell.

I already pointed out that there is something called Web2C which
translates the original Knuthian WEB sources to the programming
language C instead of Pascal. Also, the Haskell compiler GHC produces C
sources as output, so unless one is a professional computing fakir, it
would be useless to translate it to Haskell: one can get TeX to work
together with Haskell code by simpler means, if one so wishes.

Note that, on the one hand, there are no problems in making sure that the
current TeX version will be available for a long time. So mathematicians
need not worry that people won't look into their papers for that reason.
:-)

On the other hand, there is a need for a TeX successor, even though it is
not exactly clear what this successor might look like. One requirement for
a successor would be to be able to handle Unicode, and by that also
international typesetting.

TeX as a language is excellent for what it was originally intended for,
but if one tries to go beyond that, it has some serious limitations,
making programming difficult. One such limitation, in my view, is the lack
of generality, stemming from the fact that its core is a simple macro
language. Some such TeX limitations and problems are deeply wired into the
program's core, and are not easily fixable.

So beyond the immediate need, getting a TeX version that can work with
Unicode and international typesetting, down the road one might perhaps
write a wholly new program with a more powerful kernel. -- And it would be
nice if such a program could support some nicer input syntaxes -- for
example, more math-like ones when needed.

As for math typesetting, on the one hand there is no need to worry that
TeX should suddenly become unavailable. But on the other hand, I think one
should not praise the program to the skies so as to exclude needed
successors.

George Russell

unread,
Oct 25, 2002, 4:42:54 PM10/25/02
to
Hans Aberg wrote:
[snip]

> I already pointed out that there is something called Web2C which
> translates the original Knuthian Web sources to the computer programming
> language C instead of Pascal. Also the Haskell compiler GHC produces C
> sources as output, so unless one is a professional computing fakir, it
> would be useless to translate it to Haskell: One can get TeX work together
> with Haskell code by simpler means, if one so wishes.
I know. But the point is that in 3000AD they may have neither C nor
Pascal. We can be sure, however, that (assuming the continued existence
of civilisation) they will either have ways of programming computers,
or at least of getting computers to program themselves. Also, it is
reasonable to assume these ways will be at least as efficient as the ways
we have now. So getting TeX going should be easy enough for them.
[snip]
> On the other hand, there is a need for a TeX successor ...
I agree entirely. People have been talking about the need for a TeX
successor for the last 10 years at least, but none of the attempts I
have heard of seem to have come to much. They can be divided into
superficial (if important) attempts to modify TeX to allow Unicode
(and so on), and completely new systems which, however, lack TeX's
power or portability.

I don't know what the solution is, but I fear it requires someone like
Donald E. Knuth, only 20 years younger, to spend a couple of years
designing and implementing the darn thing. And even then it would be
a big job persuading the mathematicians of the world to convert.
For most mathematical typesetting problems TeX does a pretty good job,
and not many are going to want to learn another typesetting language
just so that they can have a fancy GUI allowing them to construct
arrow diagrams without trial-and-error and graph-paper, or get slightly
better handling of skylines and rivers. It's hard to see how anyone
could come up with a new mathematical typesetting system now which would
represent as big an advance over TeX as TeX was over its predecessors.
So it doesn't seem impossible to me that mathematicians may still be
using TeX in 100 years.

Brendan McKay

unread,
Oct 25, 2002, 10:26:42 PM10/25/02
to
George Russell <g...@tzi.de> wrote:

> I think the best way of ensuring permanence of the arXiv is to mirror
> it across several sites and continents (as now happens) so that even if
> some political or natural catastrophe should wipe it out in America, it
> will still be maintained in Europe.

It isn't quite so simple. A set of mirrors which regularly communicate
form a single system and can be subject to a single cause of failure.
For example, a major corruption of the main site might be mirrored to
all the other sites before it is noticed. This can be largely avoided
if the mirroring software never overwrites anything previously copied,
but it is not possible to positively identify all things that might go
wrong. Another possibility is some type of virus/worm that gets into
one mirror then spreads to the others via the mirroring software.
That is impossible to rule out, as we know from hard experience.

A better approach is to make regular snapshots that are stored away
from the main archives, preferably on write-once media. This has
other problems such as the need to refresh the data regularly to
avoid the physical decay of the media.
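As a concrete sketch of that discipline -- copy, never overwrite, and
record checksums -- here is a minimal version in Python; the directory
layout and manifest format are merely assumptions for illustration:

import hashlib, shutil
from pathlib import Path

def snapshot(archive_dir, snap_dir):
    # Copy new files from the live archive into a snapshot tree,
    # refusing to overwrite anything copied earlier, and append one
    # checksum line per file so that later corruption of the main
    # archive cannot silently propagate into the snapshot.
    snap = Path(snap_dir)
    snap.mkdir(parents=True, exist_ok=True)
    copied = 0
    with open(snap / "MANIFEST", "a") as manifest:
        for src in sorted(Path(archive_dir).rglob("*")):
            if not src.is_file():
                continue
            dst = snap / src.relative_to(archive_dir)
            if dst.exists():
                continue  # never overwrite a previous copy
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            digest = hashlib.sha256(dst.read_bytes()).hexdigest()
            manifest.write("%s  %s\n" % (digest, dst.relative_to(snap)))
            copied += 1
    return copied

Burning the snapshot tree to write-once media and checking it later
against the manifest would then be a separate, offline step.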

Brendan.

David C. Ullrich

unread,
Oct 29, 2002, 10:06:09 AM10/29/02
to
On 23 Oct 2002 23:02:17 GMT, david....@ens.fr (David Madore) wrote:

[...]


>
>Personally I would rather see TeX die and MathML replace it (with XSLT
>preprocessing to perform the automated preprocessing). MathML, being
>based on XML, fully allows the use of Unicode on those systems that
>support it.

Yes.

>It remains, however, a terrible pain to write by hand,
>especially if attention is paid to page formatting (through CSS or
>XSL).
>

[...]


>
>MathML has the advantage over TeX that it is a format and not a
>program: formats tend to bit-rot much slower than programs, especially
>when, as is the case of MathML, they have well-written and readable
>standards and documentations.

It has (or can have, if done right) the further advantage that parts
of the file have labels attached explaining what they are. Getting
MathML documents into a searchable database should be much easier
than with TeX documents, and should allow much smarter and more extensive
searches, for example.


David C. Ullrich

David C. Ullrich

unread,
Oct 29, 2002, 10:19:05 AM10/29/02
to
On Wed, 23 Oct 2002 19:00:17 -0700, Ronald Bruck <br...@math.usc.edu>
wrote:

>[[ This message was both posted and mailed: see
> the "To," "Cc," and "Newsgroups" headers for details. ]]
>
>In article <ap79pp$1gc9$1.re...@nef.ens.fr>, David Madore
><david....@ens.fr> wrote:
>...
>>
>> MathML has the advantage over TeX that it is a format and not a
>> program: formats tend to bit-rot much slower than programs, especially
>> when, as is the case of MathML, they have well-written and readable
>> standards and documentations.
>
>Hmmm, yes, one would think that formats bit-rot more slowly.
>
>But I have some spreadsheets from the days of CP/M, written by one of
>the leading spreadsheet programs of that day, which are now totally
>unreadable. Of course, in those days the number of personal computers
>was numbered in the thousands, if not mere hundreds, instead of
>millions.

Yeah, but those spreadsheet files are in an opaque binary format.
On the other hand the logical structure of an XML document is
immediately apparent from looking at the document in a text
editor - in a hundred years a programmer who had never heard of
XML could easily write a program to parse XML documents just
by looking at a few and noting how they're put together:

<example status="snipped" reason="too long">
<theorem>
<name>Pythagorean</name>
<statement> If a, b, c are the sides of a right triangle then
etc
</statement>
<proof>
etc
</proof>
</theorem>
</example>

No fun to write or even read by hand, but it's immensely clear what
it all means and trivial to write a program to read it.
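As a sketch of how mechanical that is, here is the fragment above taken
apart with Python's standard xml.etree library (an arbitrary choice; any
XML library would do, and a from-scratch parser for this small subset
would only be a page or two longer):

import xml.etree.ElementTree as ET

doc = """\
<example status="snipped" reason="too long">
<theorem>
<name>Pythagorean</name>
<statement>If a, b, c are the sides of a right triangle then etc</statement>
<proof>etc</proof>
</theorem>
</example>"""

root = ET.fromstring(doc)
for thm in root.iter("theorem"):
    # The labels are self-describing: finding the statement of a
    # theorem requires no format documentation at all.
    print(thm.findtext("name"), "-", thm.findtext("statement").strip())

This prints "Pythagorean - If a, b, c are the sides of a right triangle
then etc".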

>And dvi is a format too, isn't it? One whose "device-independence" is
>a wry joke, if you don't have the fonts.
>
>--Ron Bruck


David C. Ullrich

David C. Ullrich

unread,
Oct 29, 2002, 10:22:40 AM10/29/02
to
On Thu, 24 Oct 2002 19:43:40 GMT, "Russell Easterly"
<logi...@attbi.com> wrote:

>
[...]


>
>I do a lot of "conversions". If I wanted something to last a long time I
>would
>put it into a text file. "Human readable" is better than any machine only
>type format.
>Word documents are machine only documents (until the next release).
>XML is human readable, barely.

It may be only barely human readable, but it's readable enough to
make it trivial to write a program to read it 100 years from now
when everyone's forgotten how it works.

>Since XML is being widely adopted,
>it may be the best bet for long term compatibility.


David C. Ullrich

David C. Ullrich

unread,
Oct 29, 2002, 10:43:20 AM10/29/02
to
On Tue, 15 Oct 2002 12:30:33 +0200, remove...@matematik.su.se
(Hans Aberg) wrote:

>In article <1fk3c3p.5d3dj6172a64gN%b...@cs.anu.edu.au>, b...@cs.anu.edu.au
>(Brendan McKay) wrote:
>
>>> > Some London Society of mathematics journal would not accept a paper after
>>> > refereeing but instead saying that the paper was "preliminary accepted",
>>> > then using the leverage so obtained for demanding extensive rewriting of
>>> > the paper.
>...
>>> I can assure you that this is not standard editorial policy for the
>>> London Mathematical Society journals! The author of a paper requiring
>>> extensive changes may be invited to resubmit the paper, but in this
>>> case the resubmitted paper will be in competition with new first time
>>> submissions and there is no guarantee of publication.
>
>>I can't see the slightest thing wrong with accepting a paper subject to
>>extensive rewriting.
>
[...]
>
>>Sometimes referees report something like "the result is great but it
>>took me days of work to figure out what the result actually was" and
>>in such a case requesting a rewrite is the correct decision for the
>>editor to make. Of course the editor has to be open with the author
>>about why the rewrite is required and what sort of rewrite is required.
>>
[...]
>
>The problem with this is that asking for extensive rewrites can be used as
>an obstruction. So some referees are known to make long lists and heavy
>complaints on a paper, despite recommending it to be published, just to
>slow its publication down. Then one gets some extra time to work with the
>paper before others can do it, and it also slows down the competition, if
>the author is a competitor. Some editors not only can but systematically
>do use such publishing tactics in order to keep political control
>over their field.

The first time I submitted a paper to a journal, the report said that
they'd publish the paper if I rewrote it - the result was fine but
the writing was awful. I couldn't see any problems with the writing -
I was very irritated. The paper went through several versions before
finally being accepted.

These days when I look at the version that was finally published I'm
embarrassed at how awfully written it is! Throughout the entire paper
I see things that are just barely English sentences, and which
really don't say at all what I meant; I have to figure out what I
_must_ have meant when I wrote it, instead of just reading what's
written. I shudder to think what the first version I submitted must
have looked like.

It may be that in some cases requesting extensive rewriting is
done for inappropriate reasons as you suggest. But it seems to
me that it's often _perfectly_ appropriate. (It's happened that
_I_ have been a referee who said the result was fine but the
writing was awful. This was not for any of the reasons you
suggest, it was done because the paper was very hard to read
because of the poor writing.)

>So if you are making use of or admitting such extensive policies, then either
>you are a clever predator, or a dumb copycat sucker. :-)

I don't believe I'm either of those. Nor do I think the referee
and editor for my first paper was either of those.

>-- It also
>contradicts the principles of appropriate scientific publishing.
>
>One way around this problem is to always file one's preprints quickly in
>an official archive, like http://arxiv.org/. To ensure a
>basic scientific quality, it would be best to have a refereeing procedure
>whose principles of fairness and accuracy are explicitly spelled out as
>well as independently verified.


>
> Hans Aberg * Anti-spam: remove "remove." from email address.
> * Email: Hans Aberg <remove...@member.ams.org>
> * Home Page: <http://www.matematik.su.se/~haberg/>
> * AMS member listing: <http://www.ams.org/cml/>


David C. Ullrich

David desJardins

unread,
Oct 29, 2002, 1:12:53 PM10/29/02
to
David C. Ullrich writes:
> Yeah, but those spreadsheet files are in an opaque binary format.
> On the other hand the logical structure of an XML document is
> immediately apparent from looking at the document in a text
> editor - in a hundred years a programmer who had never heard of
> XML could easily write a program to parse XML documents just
> by looking at a few and noting how they're put together:

Parsing arbitrary XML is far from trivial. Sure, the easy cases are
easy. But there are a lot of complicated or obscure cases. You also
have the problem that some people may give you XML that's defective
according to the official standard, but you don't really want to throw
up your hands and do nothing at all. A lot more people use existing XML
parsers than go to all the trouble of writing their own.

My employer both generates and parses XML (and HTML) and we certainly
make mistakes in both. And not for lack of trying hard or being skilled
at it.

It's true that you might easily be able to write a subset parser that
could parse a particular set of XML documents generated in a particular
way using only a relatively nice subset of XML.

David desJardins

David Madore

unread,
Oct 29, 2002, 3:10:31 PM10/29/02
to
David desJardins in litteris <vohlm4h...@blue3.math.berkeley.edu>
scripsit:

> Parsing arbitrary XML is far from trivial. Sure, the easy cases are
> easy. But there are a lot of complicated or obscure cases. You also
> have the problem that some people may give you XML that's defective
> according to the official standard, but you don't really want to throw
> up your hands and do nothing at all.

Quite the contrary: you _want_ to throw up your hands and do nothing
at all. More than that: the XML Standard _requires_ that, if a
conforming XML parser encounters a document that is not well-formed,
the parser should _not_ attempt to correct any errors, but abort
normal processing and report errors to the user (see the definition of
"fatal error" in the XML Norm, <URL: http://www.w3.org/TR/REC-xml >).
This is to avoid the situation of HTML, where the leniency of parsers
has led to a proliferation of bad syntax, which in turn requires
horrible bugware to parse.
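For instance, here is a sketch with Python's standard XML parser (an
arbitrary choice; any conforming parser behaves the same way on this
point):

import xml.etree.ElementTree as ET

try:
    # Not well-formed: <b> is never closed, so </p> mismatches.
    ET.fromstring("<p>unclosed <b>tag</p>")
except ET.ParseError as err:
    print("fatal error:", err)  # reported, not silently repaired

A browser-style HTML parser would instead guess that the author meant
</b></p> and carry on.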

This is getting off-topic for this group, however. To give an idea of
the readability (ahem) of MathML for the benefit of those who have
never seen it, I give a sample that I wrote by hand. In comparison,
here is the (plain) TeX version:

--- cut after ---
{\bf A little test of MathML}\par
In a letter to Godfrey Harold Hardy, S\b{r}\={\i}\b{n}iv\={a}sa
R\={a}m\={a}\b{n}uja\b{n} Aiya\.{n}k\={a}r asserts that
${\textstyle 1\over\textstyle 1+
{\textstyle e^{-2\pi\sqrt{5}}\over\textstyle 1+
{\textstyle e^{-4\pi\sqrt{5}}\over\textstyle 1+
{\textstyle e^{-6\pi\sqrt{5}}\over\textstyle\ddots}}}}
\penalty-100=\penalty-100\left(
{\textstyle\sqrt{5}\over\textstyle
1+\root5\of{5^{3/4}\left({\sqrt{5}-1\over 2}\right)^{5/2}-1}}
-{\textstyle\sqrt{5}+1\over\textstyle 2}
\right)e^{2\pi/\sqrt{5}}$
\bye
--- cut before ---

and here is the MathML version:

--- cut after ---
<?xml version="1.0" encoding="us-ascii"?>

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1 plus MathML 2.0//EN"
"http://www.w3.org/TR/MathML2/dtd/xhtml-math11-f.dtd"
[
<!ENTITY htmlns "http://www.w3.org/1999/xhtml">
<!ENTITY mathmlns "http://www.w3.org/1998/Math/MathML">
]>

<html xmlns="&htmlns;" xml:lang="en">

<head>
<title>A little test of MathML</title>
<meta http-equiv="Content-Type" content="text/xml; charset=us-ascii" />
<meta http-equiv="Content-Language" content="en" />
</head>


<body>

<h1>A little test of MathML</h1>

<p>In a letter to Godfrey Harold Hardy,
S&#x1e5f;&#x12b;&#x1e49;iv&#x101;sa
R&#x101;m&#x101;&#x1e49;uja&#x1e49; Aiya&#x1e45;k&#x101;r asserts that
<math xmlns="&mathmlns;">
<mfrac>
<mstyle scriptlevel="0">
<mn>1</mn>
</mstyle>
<mstyle scriptlevel="0">
<mrow>
<mn>1</mn>
<mo>+</mo>
<mfrac>
<mstyle scriptlevel="0">
<msup>
<mi>&ExponentialE;</mi>
<mrow>
<mo>-</mo>
<mrow>
<mn>2</mn>
<mo>&InvisibleTimes;</mo>
<mi>&pi;</mi>
<msqrt>
<mn>5</mn>
</msqrt>
</mrow>
</mrow>
</msup>
</mstyle>
<mstyle scriptlevel="0">
<mrow>
<mn>1</mn>
<mo>+</mo>
<mfrac>
<mstyle scriptlevel="0">
<msup>
<mi>&ExponentialE;</mi>
<mrow>
<mo>-</mo>
<mrow>
<mn>4</mn>
<mi>&pi;</mi>
<msqrt>
<mn>5</mn>
</msqrt>
</mrow>
</mrow>
</msup>
</mstyle>
<mstyle scriptlevel="0">
<mrow>
<mn>1</mn>
<mo>+</mo>
<mfrac>
<mstyle scriptlevel="0">
<msup>
<mi>&ExponentialE;</mi>
<mrow>
<mo>-</mo>
<mrow>
<mn>6</mn>
<mi>&pi;</mi>
<msqrt>
<mn>5</mn>
</msqrt>
</mrow>
</mrow>
</msup>
</mstyle>
<mstyle scriptlevel="0">
<mi>&dtdot;</mi>
</mstyle>
</mfrac>
</mrow>
</mstyle>
</mfrac>
</mrow>
</mstyle>
</mfrac>
</mrow>
</mstyle>
</mfrac>
<mo>=</mo>
<mfenced>
<mrow>
<mfrac>
<mstyle scriptlevel="0">
<msqrt>
<mn>5</mn>
</msqrt>
</mstyle>
<mstyle scriptlevel="0">
<mrow>
<mn>1</mn>
<mo>+</mo>
<mroot>
<mrow>
<mrow>
<msup>
<mn>5</mn>
<mrow>
<mn>3</mn>
<mo>/</mo>
<mn>4</mn>
</mrow>
</msup>
<mo>&InvisibleTimes;</mo>
<msup>
<mfenced>
<mfrac>
<mrow>
<msqrt>
<mn>5</mn>
</msqrt>
<mo>-</mo>
<mn>1</mn>
</mrow>
<mn>2</mn>
</mfrac>
</mfenced>
<mrow>
<mn>5</mn>
<mo>/</mo>
<mn>2</mn>
</mrow>
</msup>
</mrow>
<mo>-</mo>
<mn>1</mn>
</mrow>
<mn>5</mn>
</mroot>
</mrow>
</mstyle>
</mfrac>
<mo>-</mo>
<mfrac>
<mstyle scriptlevel="0">
<mrow>
<msqrt>
<mn>5</mn>
</msqrt>
<mo>+</mo>
<mn>1</mn>
</mrow>
</mstyle>
<mstyle scriptlevel="0">
<mn>2</mn>
</mstyle>
</mfrac>
</mrow>
</mfenced>
<msup>
<mi>&ExponentialE;</mi>
<mrow>
<mn>2</mn>
<mo>&InvisibleTimes;</mo>
<mi>&pi;</mi>
<mo>/</mo>
<msqrt>
<mn>5</mn>
</msqrt>
</mrow>
</msup>
</math>
</p>

</body>
</html>
--- cut before ---
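For readers who would rather not decode either sample in their head, the
identity that both of them typeset is, transcribed from the plain TeX
source above into display LaTeX (assuming amsmath for \cfrac):

\[
\cfrac{1}{1+\cfrac{e^{-2\pi\sqrt{5}}}{1+\cfrac{e^{-4\pi\sqrt{5}}}{1+\cfrac{e^{-6\pi\sqrt{5}}}{\ddots}}}}
= \left( \frac{\sqrt{5}}{1+\sqrt[5]{5^{3/4}\bigl(\frac{\sqrt{5}-1}{2}\bigr)^{5/2}-1}}
  - \frac{\sqrt{5}+1}{2} \right) e^{2\pi/\sqrt{5}}
\]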

And, to remain fully on topic, I ask: has this remarkable statement by
Ramanujan ever been proven rigorously? And, if so, how complicated is
it?

Hans Aberg

unread,
Oct 29, 2002, 2:19:43 PM10/29/02
to
In article <unatrug1jie5gnq4b...@4ax.com>,
ull...@math.okstate.edu wrote:

>It may be that in some cases requesting extensive rewriting is
>done for inappropriate reasons as you suggest. But it seems to
>me that it's often _perfectly_ appropriate. (It's happened that
>_I_ have been a referee who said the result was fine but the
>writing was awful. This was not for any of the reasons you
>suggest, it was done because the paper was very hard to read
>because of the poor writing.)

Well, assuming that the paper is parsable from the scientific point of
view, do you view those remarks as _scientific_?

And if those remarks are merely style comments, can they be used as a
basis for rejecting the paper?

What happens if the author says, well, the referee has those opinions, but
those are clearly not scientific, and therefore decides to do nothing about
them? Can the paper then be rejected?

After all, the paper is published in a scientific journal, which is called
so not because it should be a competition in style, but in scientific
results. -- If an author is simply not interested in the style
competition, why should he or she be punished by the views of the
editors and referees on the subject?

If style is a major issue, there should probably be special "stylistic
journals". :-)

-- One can also note that editors and referees who have a lot of opinions
on style do not necessarily have a great deal of insight into the subject.
Often they merely enforce a style they happen to be acquainted with and
like. In some cases they do not know how to write good papers themselves;
they just happen to have a lot of opinions on the matter. :-)

This is very far from the idea that one should promote the communication
of scientific facts.

Nathan Dunfield

unread,
Oct 29, 2002, 7:26:19 PM10/29/02
to
In article <remove.haberg-2...@du136-226.ppp.su-anst.tninet.se>,

Hans Aberg <remove...@matematik.su.se> wrote:
>>It may be that in some cases requesting extensive rewriting is
>>done for inappropriate reasons as you suggest. But it seems to
>>me that it's often _perfectly_ appropriate.
...

>Well, assuming that the paper is parsable from the scientific point of
>view, do you view those remarks as _scientific_?

I didn't write the original passage that Hans was responding to, but
in my experience a request for extensive rewriting means:

"The paper was parsable from the scientific point of view, but parsing
it took _way too long_. With a good rewrite, the comprehension time
could probably be cut by, say, half or more."

It seems reasonable to me for a journal to request extensive rewriting
in such circumstances. Overall, I think mathematicians could do a lot
better in the exposition department; I think a lot of us (myself
included) write papers until they are logically correct but lacking
enough guideposts to make them easy to understand.

Nathan Dunfield

--
Dept. of Mathematics email: nat...@math.harvard.edu
Harvard University web: http://www.math.harvard.edu/~nathand/
One Oxford St. phone: (617) 495 - 5340
Cambridge, MA 02138 fax: (617) 495 - 5132

Hans Aberg

unread,
Oct 30, 2002, 5:47:14 AM10/30/02
to
In article <apn8vb$akv$1...@news.fas.harvard.edu>, nat...@math.harvard.edu
(Nathan Dunfield) wrote:

>>Well, assuming that the paper is parsable from the scientific point of
>>view, do you view those remarks as _scientific_?

...


>in my experience a request for extensive rewriting means:

>"The paper was parsable from the scientific point of view, but parsing
>it took _way too long_. With a good rewrite, the comprehension time
>could probably be cut by, say, half or more."

Well, the question is whether that view has anything to do with the quality
of the scientific contents or not:

It may take a long time to parse a paper, but it may also take a long time
to rewrite the paper.

One should remember that we are not speaking about cases where the
editor and/or referee has some opinions that are fixable by the author
within, say, a few months, but about rewrites that take several years and
are thereby a major intrusion into the author's career.

It is also deceptive to request elegant papers, because it is easy to write
an elegant paper when the foundations have already been laid out and one
merely steps in, doing things within well-known domains. If one needs an
elegant paper on some type of result, then it is probably better to write
another one, once the new scientific facts have been established, rather
than to waste time rewriting the original.

>It seems reasonable to me for a journal to request extensive rewriting
>in such circumstances. Overall, I think mathematicians could do a lot
>better in the exposition department; I think a lot of us (myself
>included) write papers until they are logically correct but lacking
>enough guideposts to make them easy to understand.

I think that the main point is that we deal here with journals that are
called "scientific", and by that label are supposed to fulfill a special
purpose, namely, the communication of scientific facts.

Clearly, if you have just any journal, without the label "scientific",
then you can demand whatever you want in the matter of style. But then
people do not have the expectation that the main function of the journal
is to be scientific.

You may have the opinion that mathematicians overall could do a lot better
in the area of exposition, but as an editor or referee of a scientific
journal, does that opinion have anything to do with the quality of the
scientific result?

And what says that your personal opinions about style and exposition are
in any way universally correct, the way scientific facts are thought to
be? -- There is a problem if one applies to opinions the same principles
that are used in determining scientific facts:

If the editor says that the sky is green, and you go out and look at it
and it looks blue, does that mean that the sky is green? -- Some editors and
referees will impose their opinions that way, not only about style and
exposition, but also about the scientific contents, and the author
realizes that in order to get the paper published, there is no point in
arguing; one merely complies.

This happens especially when the editor and/or referee are exposed to
facts and styles they haven't encountered before: for example, if one has
to learn new things then, quite clearly, it takes a long time to parse the
stuff, but this will show up as complaints about style, as though the
author could do something about the editor/referee not doing their
homework. If the style of the exposition is motivated by the type of
scientific facts presented, but the editor or referee does not know about
it, then that will also show up as complaints over style, even if changing
the style would make the scientific parsing impossible. -- This is most
apparent in interdisciplinary publishing.

Thus, editorship and refereeing may even decrease the scientific
quality of what is published. Is that how it should be?

Hiu Chung Law

unread,
Oct 30, 2002, 10:18:25 AM10/30/02
to
Hans Aberg <remove...@matematik.su.se> wrote:

[ snip ]

> I think that the main point is that we deal here with journals that are
> called "scientific", and by that label are supposed to fulfill a special
> purpose, namely, the communication of scientific facts.

^^^^^^^^^^^^^

If rewriting can facilitate communication, IMHO it is worth doing. It is
common that one cannot afford to spend a lot of time reading a
paper -- particularly when the paper is outside the researcher's key area.
If a paper is written more clearly, more people are willing to read it,
and it is more likely that the paper can have a larger impact.

However, since I am from Computer Science, I do not know whether this
applies to mathematics or not.

[ snip ]

David C. Ullrich

unread,
Oct 30, 2002, 10:05:34 AM10/30/02
to
On Tue, 29 Oct 2002 20:19:43 +0100, remove...@matematik.su.se
(Hans Aberg) wrote:

>In article <unatrug1jie5gnq4b...@4ax.com>,
>ull...@math.okstate.edu wrote:
>
>>It may be that in some cases requesting extensive rewriting is
>>done for inappropriate reasons as you suggest. But it seems to
>>me that it's often _perfectly_ appropriate. (It's happened that
>>_I_ have been a referee who said the result was fine but the
>>writing was awful. This was not for any of the reasons you
>>suggest, it was done because the paper was very hard to read
>>because of the poor writing.)
>
>Well, assuming that the paper is parsable from the scientific point of
>view, do you view those remarks as _scientific_?
>
>And if those remarks are merely style comments, can they be used as a
>basis for rejecting the paper?

I don't think the distinction is clear. The example that springs
to mind comes from sci.math instead of from research papers:

"How does one prove that there are infinite primes?"

Now in fact there is no problem here with the style or the
syntax - the question is perfectly clear, and the answer is
that one does not prove that there are infinite primes,
because in fact all primes are finite. But what the writer
actually meant was

"How does one show that there are infinitely many primes?"

Suppose a paper contains a proof of this theorem: There are
infinite primes. That's _not_ just a "style" error, the
theorem stated in the paper is _false_. But it's also not
a problem with the theorem that the author meant to state;
the theorem the author had in mind is true and the proof
is correct.

So rejecting the paper because the theorem is false is
not appropriate, because the _intended_ theorem is true.
Accepting the paper is inappropriate, because the _stated_
theorem is false - the reader has to figure out what the
author meant to say instead of just reading what he wrote
(and in an analogous example in the real world figuring
out what the author meant to say may be much harder
than in this example.) So the paper should be accepted,
conditional on rewriting.

This is the sort of bad writing I was referring to; it's
much more serious than just bad _style_. (This is the
sort of bad writing that embarrasses me when I look at
that old paper of mine, and the sort that came up when
I refereed a paper and said it should be accepted after
rewriting.)

>What happens if the author says, well, the referee has those opinions, but
>those are clearly not scientific, and therefore decides to do nothing about
>them? Can the paper then be rejected?

If the author insists that "there exist infinite primes" is an
acceptable way to state the theorem then yes, I would not
recommend that the paper be accepted.

>After all, the paper is published in a scientific journal, which is called
>so not because it should be a competition in style, but in scientific
>results. -- If an author is simply not interested in the style
>competition, why should he or she be punished by the views of the
>editors and referees on the subject?
>
>If style is a major issue, there should probably be special "stylistic
>journals". :-)

But I was talking about bad writing as above, not bad style.

>-- One can also note that editors and referees who have a lot of opinions
>on style do not necessarily have a great deal of insight into the subject.
>Often they merely enforce a style they happen to be acquainted with and
>like. In some cases they do not know how to write good papers themselves;
>they just happen to have a lot of opinions on the matter. :-)
>
>This is very far from the idea that one should promote the communication
>of scientific facts.
>
> Hans Aberg * Anti-spam: remove "remove." from email address.
> * Email: Hans Aberg <remove...@member.ams.org>
> * Home Page: <http://www.matematik.su.se/~haberg/>
> * AMS member listing: <http://www.ams.org/cml/>


David C. Ullrich

David Eppstein

unread,
Oct 30, 2002, 10:49:45 AM10/30/02
to
In article
<remove.haberg-3...@du131-226.ppp.su-anst.tninet.se>,
remove...@matematik.su.se (Hans Aberg) wrote:

> >"The paper was parsable from the scientific point of view, but parsing
> >it took _way too long_. With a good rewrite, the comprehension time
> >could probably be cut by, say, half or more."
>
> Well, the question is whether that view has anything to do with the quality
> of the scientific contents or not:

...


> I think that the main point is that we deal here with journals that are
> called "scientific", and by that label are supposed to fulfill a special
> purpose, namely, the communication of scientific facts.
>
> Clearly, if you have just any journal, without the label "scientific",
> then you can demand whatever you want in the matter of style. But then
> people do not have the expectation that the main function of the journal
> is to be scientific.
>
> You may have the opinion that mathematicians overall could do a lot better
> in the area of exposition, but as an editor or referee of a scientific
> journal, does that opinion have anything to do with the quality of the
> scientific result?

...


> Thus, editorship and refereeing may even decrease the scientific
> quality of what is published. Is that how it should be?

You speak a lot about "scientific quality" as if it's an absolute that
can be measured independent of writing quality. I don't think so.
Certainly, for math papers, correctness is an absolute (although often
bad writing can obscure even that) but a correct paper can still be
unreadable or uninteresting. Quality to me involves work that makes an
impact, that convinces a lot of people that they should care about the
results or that enables them to build their own results on yours.
Sometimes other people have done the convincing for you already (you're
working on a well known problem and don't care about reaching people who
haven't already worked on it) so you can get away with dry writing and
inelegant proofs, but in general you can't convince with bad writing.

--
David Eppstein UC Irvine Dept. of Information & Computer Science
epps...@ics.uci.edu http://www.ics.uci.edu/~eppstein/

Larry Hammick

unread,
Oct 30, 2002, 11:18:44 AM10/30/02
to

"Nathan Dunfield"
> "Hans Aberg"

> >>It may be that in some cases requesting extensive rewriting is
> >>done for inappropriate reasons as you suggest. But it seems to
> >>me that it's often _perfectly_ appropriate.
> ...
> >Well, assuming that the paper is parsable from the scientific point of
> >view, do you view those remarks as _scientific_?
>
> I didn't write the original passage that Hans was responding to, but
> in my experience a request for extensive rewriting means:
>
> "The paper was parsable from the scientific point of view, but parsing
> it took _way too long_. With a good rewrite, the comprehension time
> could probably be cut by, say, half or more."
>
> It seems reasonable to me for a journal to request extensive rewriting
> in such circumstances. Overall, I think mathematicians could do a lot
> better in the exposition department; I think a lot of us (myself
> included) write papers until they are logically correct but lacking
> enough guideposts to make them easy to understand.

Some journals tabulate "tips for authors", consisting for the most part of
taboos from English 101. But it might be well for a journal, or some bigger
organization such as the AMS, to draw up a real style manual for authors. Some
newspapers have such manuals for their reporters. OT, you may have heard
that the great Hemingway praised the style manual of his onetime employer,
the Kansas City Star.

Offhand it looks as if, in writing mathematics, nothing matters except
rigour and clarity. Surely clarity does matter, whether that's scientific or
not. But on closer look, there are a lot of other things a journal might
well object to, such as (to cite a mild, unpolitical example) the
introduction of unnecessary new terminology.

Larry Hammick
Vancouver


George Russell

unread,
Oct 30, 2002, 11:47:21 AM10/30/02
to
Hans Aberg wrote:
[snip]

> After all, the paper is published in a scientific journal, which is called
> so not because it should be a competition in style, but in scientific
> results. -- If an author is simply not interested in the style
> competition, why should he or she be punished by the views of the
> editors and referees on the subject?
[snip]

I think what you have to remember is that scientific journals exist in a free
market. The editor has a legitimate interest in ensuring that the articles
are readable, as otherwise people will not read the articles, libraries will not
stock the journal, and the journal will go bust. If the main result of a paper
cannot be precisely described in an abstract (and often it cannot), it is going to
deter readers if they have to torture their brain through 30 pages to get to it,
since at page 29 they will still not know if the result will be useful to them or not.

Of course since it is a free market, it is open to you to start up a new journal,
announcing publicly that it pays no attention to the style of the articles. If there
are papers with good mathematics and bad style that aren't getting published, you should
be able to find enough authors. Whether you will find readers is another question.

Dan Grayson

unread,
Oct 31, 2002, 9:12:02 AM10/31/02
to

This open-ended discussion seems to have strayed gradually from the topic the
newsgroup is devoted to (discussions of current mathematical research), so
we've stopped accepting followups on this thread.

-- the moderator
