
pdflatex or dvipdf ?


Didier Verna

Apr 5, 2004, 12:34:34 PM

Hi !

When I have to produce both pdf and ps formats, I usually build my documents
directly: one version with latex and the other with pdflatex (always in
different directories, to avoid problems with aux files that are different in
both cases, most notably when using hyperref).

Now, let's assume I want the exact same document in pdf and ps. Is there any
reason to prefer one of pdflatex, dvipdf, or even pstopdf over the others?
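
(For the record, a rough sketch of what I mean -- directory and file
names are only placeholders:

    mkdir -p out-ps out-pdf
    cp doc.tex out-ps/  && (cd out-ps  && latex doc.tex && dvips doc.dvi -o)
    cp doc.tex out-pdf/ && (cd out-pdf && pdflatex doc.tex)

so each run keeps its own aux files.)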

--
Didier Verna, did...@lrde.epita.fr, http://www.lrde.epita.fr/~didier

EPITA / LRDE, 14-16 rue Voltaire Tel.+33 (1) 44 08 01 85
94276 Le Kremlin-Bicêtre, France Fax.+33 (1) 53 14 59 22 did...@xemacs.org

LEE Sau Dan

Apr 5, 2004, 1:06:02 PM
>>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:

drv> Hi !

drv> When I have to produce both pdf and ps formats, I usually
drv> build my documents directly: one version with latex and the
drv> other with pdflatex (always in different directories, to
drv> avoid problems with aux files that are different in both
drv> cases, most notably when using hyperref).

You can still get mismatched .pdf and .ps files. Read the pdftex
manual. It tells you that pdftex does not necessarily generate the same
layout as TeX. pdftex allows more flexibility in adjusting the
character spacing, etc., and hence may break lines differently than
Knuth's TeX. It doesn't happen that often, though.

Moreover, some LaTeX packages load different code (drivers) depending
on whether you're using TeX or pdftex. The hyperref and graphic[sx]
packages are typical examples. That again means you may get
different typesetting results from pdflatex and latex.
(e.g. \url{a_very_very_long_url} cannot be broken across lines
when using latex, but can be when using pdflatex.)

drv> Now, let's assume I want the exact same document in pdf and
drv> ps. Are there any reason why I should prefer either pdflatex
drv> or dvipdf (or even pstopdf) ?

Owing to the reasons given above, my preferred way to achieve that goal
is:

latex --> dvi --[1]--> ps --[2]--> pdf

where
[1] = dvips -Ppdf -G0 -z doc.dvi -o
[2] = ps2pdf doc.ps


I also "\usepackage[ps2pdf]{hyperref}" so that the resulting PDF has
hyperlinks for the internal cross references. That package also gives
the \url command, which is handy. I also
"\usepackage[dvips]{geometry}" to get papersize specials, so that
dvips knows what paper size the output is expected to fit. The paper
size is specified as an option ([a4paper]) to \documentclass.
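
A minimal sketch of what I have in mind ("doc" is only a placeholder
name). Preamble:

    \documentclass[a4paper]{article}
    \usepackage[dvips]{geometry}   % papersize specials for dvips
    \usepackage[ps2pdf]{hyperref}  % hyperlinks that survive ps2pdf

and then:

    latex doc.tex
    dvips -Ppdf -G0 -z doc.dvi -o
    ps2pdf doc.ps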

Of course, an alternative is:

latex --> pdf --> ps

You can convert PDF to PS using Acrobat Reader or pdftops (xpdf) or
pdf2ps (Ghostscript).


But I prefer the former method, because I still like using EPS
figures, which pdfTeX does not support. (However, METAPOST *is*
supported by pdfTeX.)


--
Lee Sau Dan 李守敦

E-mail: dan...@informatik.uni-freiburg.de
Home page: http://www.informatik.uni-freiburg.de/~danlee

Guy Worthington

Apr 6, 2004, 8:14:39 AM
LEE Sau Dan wrote:

> [snipped]

Hello Lee,

The way your text editor justifies text makes
my teeth ache. Like anyone reading in their
native tongue, I don't read words, but instead
read chunks of text. And I can't do that with
your compositions.

Because your text editor is completely
fucking the visual flow.

David Kastrup

Apr 6, 2004, 8:34:38 AM
nothung...@spamgourmet.com (Guy Worthington) writes:

Yes, justification of texts that are supposed to be read
by other humans (or even computers, for what it's worth)
is a bad idea. Consider a language such as Python where
leading spaces are significant. Now try to imagine what
nuisance it would be if the right border of program text
would also carry some meaning. Programming would become
a really awful task: you'd have to try finding some more
or less obscure wording in order to make your program do
the desired task. Justification on plain text terminals
really is something that should never be done. It fails
to be worth the inflicted pain on both writer as well as
reader.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
UKTUG FAQ: <URL:http://www.tex.ac.uk/cgi-bin/texfaq2html>

Robin Fairbairns

Apr 6, 2004, 9:21:37 AM
David Kastrup <d...@gnu.org> writes:
>Yes, justification of texts that are supposed to be read
>by other humans (or even computers, for what it's worth)
>is a bad idea. Consider a language such as Python where
>leading spaces are significant. Now try to imagine what
>nuisance it would be if the right border of program text
>would also carry some meaning. Programming would become
>a really awful task: you'd have to try finding some more
>or less obscure wording in order to make your program do
>the desired task. Justification on plain text terminals
>really is something that should never be done. It fails
>to be worth the inflicted pain on both writer as well as
>reader.

no imagination necessary. fortran is (or was) just so. however, the
1re was no need to line up the right hand edge of the text because f
2ortran card punches did the lining-up for you and also inserted the
3 necessary continuation marks. sadly, we never had card punches --
4all our programs were prepared on punched tape. still, odd to have
5 a language designed for the convenience of card punches.

(mind you, i suspect my memory of the correct column numbers. i'm
fairly sure "nothing beyond 72" is right, but the label column and the
continuation mark column numbers seem to have slipped my memory.)
--
Robin (http://www.tex.ac.uk/faq) Fairbairns, Cambridge

David Kastrup

Apr 6, 2004, 9:27:46 AM
r...@cl.cam.ac.uk (Robin Fairbairns) writes:

Continuation mark column was fine, but you went beyond column 72
(columns start at 1).

As to "odd to have a language designed for card punches": practically
all terminals nowadays have converged on 80 characters per line, the
metric of punch cards.

A Kerr-Munslow [Mad Hatter]

Apr 6, 2004, 10:06:18 AM
> David Kastrup <d...@gnu.org> writes:
> (mind you, i suspect my memory of the correct column numbers. i'm
> fairly sure "nothing beyond 72" is right, but the label column and the
> continuation mark column numbers seem to have slipped my memory.)

<delurking>
As a person who is probably technically* a "fortran programmer" I can
reassure you that you are right when you say "nothing beyond 72". In
fact, you can have stuff beyond column 72, but it is ignored, often
without a message. To refresh your memory: C, ! or * in column one
indicates that the rest of the line is a comment. Statement numbers go
in columns 1-5. The continuation mark goes in column 6.

Amanda
*I'm an atmospheric physicist.

--
White Rabbit: I'm so late! I'm so very, very late!
Mad Hatter: Well no wonder you're late...Why, this clock is exactly
two days slow!
-- Lewis Carroll

Scott Pakin

Apr 6, 2004, 1:32:57 PM
David Kastrup wrote:
> Consider a language such as Python where
> leading spaces are significant. Now try to imagine what
> nuisance it would be if the right border of program text
> would also carry some meaning.

There's no need to continue imagining what it would be like if the
right border also carried meaning. And the left border. And the
absence of anything in between:

http://compsoc.dur.ac.uk/whitespace/index.php

Enjoy,
-- Scott

David Kastrup

Apr 6, 2004, 1:52:33 PM
Scott Pakin <scot...@pakin.org> writes:

There is no perversion that mankind is incapable of.

Dan Luecking

Apr 6, 2004, 1:54:03 PM
On 06 Apr 2004 14:34:38 +0200, David Kastrup <d...@gnu.org> wrote:

>nothung...@spamgourmet.com (Guy Worthington) writes:
>
>Yes, justification of texts that are supposed to be read
>by other humans (or even computers, for what it's worth)
>is a bad idea. Consider a language such as Python where
>leading spaces are significant. Now try to imagine what
>nuisance it would be if the right border of program text
>would also carry some meaning. Programming would become
>a really awful task: you'd have to try finding some more
>or less obscure wording in order to make your program do
>the desired task. Justification on plain text terminals
>really is something that should never be done. It fails
>to be worth the inflicted pain on both writer as well as
>reader.

Now I'm curious: how long did it take you to write this
paragraph?


Dan

--
Dan Luecking Department of Mathematical Sciences
University of Arkansas Fayetteville, Arkansas 72701
To reply by email, change Look-In-Sig to luecking

David Kastrup

Apr 6, 2004, 2:05:36 PM
Dan Luecking <Look-...@uark.edu> writes:

> On 06 Apr 2004 14:34:38 +0200, David Kastrup <d...@gnu.org> wrote:
>
> >nothung...@spamgourmet.com (Guy Worthington) writes:
> >
> >Yes, justification of texts that are supposed to be read
> >by other humans (or even computers, for what it's worth)
> >is a bad idea. Consider a language such as Python where
> >leading spaces are significant. Now try to imagine what
> >nuisance it would be if the right border of program text
> >would also carry some meaning. Programming would become
> >a really awful task: you'd have to try finding some more
> >or less obscure wording in order to make your program do
> >the desired task. Justification on plain text terminals
> >really is something that should never be done. It fails
> >to be worth the inflicted pain on both writer as well as
> >reader.
>
> Now I'm curious: how long did it take you to write this
> paragraph?

Not more than twice as long as it would without artificial
constraints. English offers an awful lot of redundancy if
you need it. I think there is some PostScript code flying
around on the net that does block justification by changes
in phrasing only. Haven't seen it myself, though.

Robin Fairbairns

Apr 6, 2004, 3:54:09 PM
David Kastrup <d...@gnu.org> writes:

>Scott Pakin <scot...@pakin.org> writes:
>> http://compsoc.dur.ac.uk/whitespace/index.php
>
>There is no perversion that mankind is incapable of.

i wonder if we can write a web->whitespace compiler (using 8bit
integers in place of characters)? tex in whitespace would be quite
something. for once, the tangled code would look less opaque than the
woven.

David Kastrup

Apr 6, 2004, 4:53:26 PM
r...@cl.cam.ac.uk (Robin Fairbairns) writes:

> David Kastrup <d...@gnu.org> writes:
> >Scott Pakin <scot...@pakin.org> writes:
> >> http://compsoc.dur.ac.uk/whitespace/index.php
> >
> >There is no perversion that mankind is incapable of.
>
> i wonder if we can write a web->whitespace compiler (using 8bit
> integers in place of characters)? tex in whitespace would be quite
> something. for once, the tangled code would look less opaque than the
> woven.

Depends on whether you load the `color' package.

Brian Blackmore

Apr 6, 2004, 5:43:09 PM

I have to say I was indeed happy to see that you were using two
spaces after your periods. That got me thinking about how many
TeX users do that and how many adhere to the contemporary style
of only putting one space after the period. I think I may have
to add that to my list of projects, i.e., `Determine the number
of TeX users that use one/two spaces after the period'. I will
add that to the list for August, I think. Yes, August 1, 2053.

--
Brian Blackmore
blb8 at po dot cwru dot edu

David Kastrup

Apr 6, 2004, 6:26:25 PM
Brian Blackmore <bl...@po.cwru.edu> writes:

> David Kastrup <d...@gnu.org> wrote:
>
> > Not more than twice as long as it would without artificial
> > constraints. English offers an awful lot of redundancy if
> > you need it. I think there is some PostScript code flying
> > around on the net that does block justification by changes
> > in phrasing only. Haven't seen it myself, though.
>
> I have to say I was indeed happy to see that you were using two
> spaces after your periods. That got me thinking about how many
> TeX users do that and how many adhere to the contemporary style
> of only putting one space after the period. I think I may have
> to add that to my list of projects, i.e., `Determine the number
> of TeX users that use one/two spaces after the period'. I will
> add that to the list for August, I think. Yes, August 1, 2053.

The default Emacs convention sets sentence-end-double-space to t.
Since it then does not break lines after just a single space, you
get into the habit of typing double spaces in every of the plenty
of text modes Emacs provides. As maintainer of AUCTeX, I had for
a few days set this variable to nil in default TeX modes. But as
this caused Emacs to remove all my own double spaces when filling
a text, it made me quite unhappy, so I removed this setting again
even though TeX has no qualms breaking a line after single spaces
following periods. So I provide no different default, and people
who want otherwise, can override the setting easily enough.

Tom Micevski

Apr 7, 2004, 2:48:27 AM
A Kerr-Munslow [Mad Hatter] wrote:
>>David Kastrup <d...@gnu.org> writes:
>>(mind you, i suspect my memory of the correct column numbers. i'm
>>fairly sure "nothing beyond 72" is right, but the label column and the
>>continuation mark column numbers seem to have slipped my memory.)
>
>
> <delurking>
> As a person who is probably technically* a "fortran programmer" I can
> reassure you that you are right when you say "nothing beyond 72". In
> fact, you can have stuff beyond line 72 but it is ignored, often without a
> message. To refresh your memory, C, ! or * in line one indicates the rest
> of the line is a comment. Statement numbers are in columns 1-5.
> Continuation mark is column 6.

that is correct for fortran 77 ("FORTRAN"). but fortran has moved on:
"Fortran" (f90, f95, and soon f2003) has free form source, with a
maximum line width of 132 characters.

Guy Worthington

Apr 7, 2004, 3:31:33 AM
David Kastrup <d...@gnu.org> wrote in message news:<x5ekr1z...@lola.goethe.zz>...

> nothung...@spamgourmet.com (Guy Worthington) writes:
>
> > LEE Sau Dan wrote:
> >
> > > [snipped]
> >
> > [snipped]

> Yes, justification of texts that are supposed to be read
> by other humans (or even computers, for what it's worth)
> is a bad idea. Consider a language such as Python where
> leading spaces are significant. Now try to imagine what
> nuisance it would be if the right border of program text
> would also carry some meaning. Programming would become
> a really awful task: you'd have to try finding some more
> or less obscure wording in order to make your program do
> the desired task. Justification on plain text terminals
> really is something that should never be done. It fails
> to be worth the inflicted pain on both writer as well as
> reader.

And they marvelled at him.

Didier Verna

Apr 7, 2004, 3:38:58 AM
David Kastrup <d...@gnu.org> wrote:

> Scott Pakin <scot...@pakin.org> writes:
>>
>> http://compsoc.dur.ac.uk/whitespace/index.php
>
> There is no perversion that mankind is incapable of.


While we're at it, you might be interested in reading Stroustrup about
whitespace overloading: http://www.research.att.com/~bs/whitespace98.pdf

Giuseppe Bilotta

Apr 7, 2004, 8:44:51 AM
David Kastrup wrote:
> r...@cl.cam.ac.uk (Robin Fairbairns) writes:
>
> > David Kastrup <d...@gnu.org> writes:
> > >Scott Pakin <scot...@pakin.org> writes:
> > >> http://compsoc.dur.ac.uk/whitespace/index.php
> > >
> > >There is no perversion that mankind is incapable of.
> >
> > i wonder if we can write a web->whitespace compiler (using 8bit
> > integers in place of characters)? tex in whitespace would be quite
> > something. for once, the tangled code would look less opaque than the
> > woven.
>
> Depends on whether you load the `color' package.

But would it be transparent to the user?

--
Giuseppe "Oblomov" Bilotta

Can't you see
It all makes perfect sense
Expressed in dollar and cents
Pounds shillings and pence
(Roger Waters)

LEE Sau Dan

Apr 7, 2004, 8:32:50 AM
>>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:

>> There is no perversion that mankind is incapable of.

drv> While we're at it, you might be interested in reading
drv> Stroustrup about whitespace overloading:
drv> http://www.research.att.com/~bs/whitespace98.pdf

That's crazy! When will they allow comment-overloading so that
comments become treated as function calls with the comment contents as
arguments?

Operator overloading already creates a lot of confusion for many C++
programmers (because of those automatically (and autotragically)
inserted function calls to certain automatically created functions
(default/copy constructors and the like)). Whitespace overloading
would certainly make matters worse.


--
Lee Sau Dan 李守敦

David Kastrup

Apr 7, 2004, 8:57:41 AM
LEE Sau Dan <dan...@informatik.uni-freiburg.de> writes:

> >>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
>
> >> There is no perversion that mankind is incapable of.
>
> drv> While we're at it, you might be interested in reading
> drv> Stroustrup about whitespace overloading:
> drv> http://www.research.att.com/~bs/whitespace98.pdf
>
> That's crazy! When will they allow comment-overloading so that
> comments become treated as function calls with the comment contents as
> arguments?
>
> Operator overload already creates a lot of confusions to many C++
> programmers (because of those automatically (and autotragically)
> inserted function calls to certain automatically created functions
> (default/copy constructors and the like)). Whitespace over-loading
> would certainly worsen the matter.

The paper was released 6 years and 6 days(!) ago, so it is old news.
No use getting all upset about spilled milk.

Scott Pakin

Apr 7, 2004, 12:01:40 PM
LEE Sau Dan wrote:
> >>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
>
> >> There is no perversion that mankind is incapable of.
>
> drv> While we're at it, you might be interested in reading
> drv> Stroustrup about whitespace overloading:
> drv> http://www.research.att.com/~bs/whitespace98.pdf
>
> That's crazy!

Yep. Note that the paper refers to a technical report from April 1st
and ends by referring to "Project April Fool". It's a joke.

-- Scott

LEE Sau Dan

Apr 7, 2004, 9:36:24 AM
>>>>> "David" == David Kastrup <d...@gnu.org> writes:

>> That's crazy! When will they allow comment-overloading so
>> that comments become treated as function calls with the comment
>> contents as arguments?

David> The paper was released 6 years and 6 days(!) ago, so it is
David> old news.

Maybe, it's 6 years, 6 days, 6 minutes.... something evil? :D


David> No use getting all upset about spilled milk.

Spoilt milk should be wiped away quickly, or it rots there! :)


--
Lee Sau Dan 李守敦

Dan Luecking

Apr 7, 2004, 5:01:36 PM
On 06 Apr 2004 19:52:33 +0200, David Kastrup <d...@gnu.org> wrote:

>Scott Pakin <scot...@pakin.org> writes:
>
>> David Kastrup wrote:
>> > Consider a language such as Python where
>> > leading spaces are significant. Now try to imagine what
>> > nuisance it would be if the right border of program text
>> > would also carry some meaning.
>>
>> There's no need to continue imagining what it would be like if the
>> right border also carried meaning. And the left border. And the
>> absense of anything inbetween:
>>
>> http://compsoc.dur.ac.uk/whitespace/index.php
>
>There is no perversion that mankind is incapable of.

I don't see why linefeeds are needed. Other languages get along
perfectly well with only 0s and 1s...

Dan Luecking

Apr 7, 2004, 5:06:04 PM
On 07 Apr 2004 14:32:50 +0200, LEE Sau Dan
<dan...@informatik.uni-freiburg.de> wrote:

>>>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
>
> >> There is no perversion that mankind is incapable of.
>
> drv> While we're at it, you might be interested in reading
> drv> Stroustrup about whitespace overloading:
> drv> http://www.research.att.com/~bs/whitespace98.pdf
>
>That's crazy! When will they allow comment-overloading so that
>comments become treated as function calls with the comment contents as
>arguments?
>
>Operator overload already creates a lot of confusions to many C

>programmers (because of those automatically (and autotragically)
>inserted function calls to certain automatically created functions
>(default/copy constructors and the like)). Whitespace over-loading
>would certainly worsen the matter.

Note the uneven whitespace in the above. This is a VIRUS written
in whitespace! Whatever you do DO NOT read or even download this
message or any message that quotes it. It will cause your machine
to freez

Robin Fairbairns

Apr 7, 2004, 5:22:17 PM
Scott Pakin <scot...@pakin.org> writes:
>LEE Sau Dan wrote:
>>>>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
>> >> There is no perversion that mankind is incapable of.
>>
>> drv> While we're at it, you might be interested in reading
>> drv> Stroustrup about whitespace overloading:
>> drv> http://www.research.att.com/~bs/whitespace98.pdf

>>
>> That's crazy!
>
>Yep. Note that the paper refers to a technical report from April 1st
>and ends by referring to "Project April Fool". It's a joke.

but like all the best jokes, it's perfectly believable. after all, if
any language was going to do anything so daft as that, it would have
to be c++ (does c## inherit c++'s inherent insanity?).

Scott Pakin

Apr 7, 2004, 6:19:53 PM
Robin Fairbairns wrote:
> (does c## inherit c++'s inherent insanity?).

No, C# is, for all intents and purposes, Java. The key difference is
that Microsoft controls it.

-- Scott

Robin Fairbairns

Apr 8, 2004, 10:23:39 AM

i was invited to look at some electronic book experiments at micro$oft
research, a while back. (rather nice, even though the box they used
was engineered by tearing a laptop in half and then sticking it back
together rather differently).

the (very senior) researcher who was my host seemed to want me to
offer an opinion about his view that java's control by sun was a real
problem, because _he_ knew how to put it "right". i declined to
comment (my experience of java is limited to writing a few library
routines for our research os, and reading the first ed of the nutshell
book).

George N. White III

Apr 8, 2004, 1:52:40 PM
On Mon, 5 Apr 2004, LEE Sau Dan wrote:

> >>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
>
> drv> Hi !
>
> drv> When I have to produce both pdf and ps formats, I usually
> drv> build my documents directly: one version with latex and the
> drv> other with pdflatex (always in different directories, to
> drv> avoid problems with aux files that are different in both
> drv> cases, most notably when using hyperref).
>
> You can still get unmatching .pdf and .ps files. Read the pdftex
> manual. It tells you that pdftex does not necessary generate the same
> layout as TeX. pdftex allows more flexibility in adjusting the
> character spacing, etc., and hence may break lines differently than
> Knuth's TeX. It doesn't occur that often, though.

Pdftex can produce visually more even margins (by allowing some glyphs to
protrude), which in turn allows you to use slightly narrower gutters in
multi-column layouts. Not only does this save trees, it also gives
effectively longer lines and so reduces the number of bad breaks, rivers,
etc. This is especially helpful if you are trying to use a CM-based font
in a layout originally intended for Times-Roman.
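
For the record, a minimal sketch of how this is switched on in pdftex
(the amounts below are only illustrative):

    \pdfprotrudechars=2      % enable margin kerning
    \rpcode\font`\.=200      % let full stops protrude into the margin
    \rpcode\font`\-=150      % likewise for hyphens

Character expansion is controlled similarly via \pdfadjustspacing and
\efcode.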

> [ ... discussion of latex --> dvi --> [ps|pdf] ]
>
> Of course, an alternative is:
>
> latex --> pdf --> ps
> You can convert PDF to PS using Acrobat Reader or pdftops (xpdf) or
> pdf2ps (Ghostscript).
>
> But I prefer the former method, because I still like using EPS figures,
> which pdfTeX does not support. (However, METAPOST *is* supported by
> pdfTeX.)

Here are the arguments in favor of using "(pdf)latex --> pdf --> ps"
over methods that use dvi files:

1. Unless you have a tightly controlled source of eps figures, the
conversion of eps to pdf is a tricky step, and can require tweaks (and
even bug fixes to the conversion tool) to deal with the idiosyncrasies in
individual files. This is much easier to get right and to debug if you
convert each eps to pdf separately than if you have to track down
problems in a document-level conversion (see the sketch after this list).

2. TeX has information that gets discarded in the dvi file but
which can be used by pdftex.

3. pdf-->ps conversions are needed by many more people than use TeX,
while conversions involving dvi files are only useful to a limited
audience. There are more and better tools for pdf-->ps than for dvi-->X.
As a case in point, the most common tool for dvi-->ps is dvips, which
is based on a raster graphics model and so can have problems (even when
using scalable outline fonts) if the ps file is scaled.

4. If all the programs with 'dvi' in their names stopped working, a few
mathematicians would be annoyed but would soon learn to use pdf. If all
the programs that work with 'pdf' files stopped working, CNN would cover
the disaster 7/24. If we all stop using dvi files, a big whack of TeX
code can be discarded and the people who have been maintaining programs
with 'dvi' in the names can get back to solving more important problems.
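
As a sketch of the per-figure approach mentioned in point 1 (file names
are only placeholders):

    epstopdf figure1.eps     # writes figure1.pdf next to the .eps
    epstopdf figure2.eps

and then \includegraphics{figure1} without an extension, so that latex
picks up the .eps and pdflatex the .pdf.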

--
George N. White III <aa...@chebucto.ns.ca>
Head of St. Margarets Bay, Nova Scotia, Canada

Stefan Ulrich

Apr 8, 2004, 2:05:15 PM
George N White <aa...@chebucto.ns.ca> writes:

> Here are the arguments in favor of using "(pdf)latex --> pdf --> ps"
> over methods that use dvi files:

> 1. Unless you have a tightly controlled source of eps figures, the
> conversion of eps to pdf is a tricky step, and can require tweaks (and
> even bug fixes to the conversion tool) to deal with the idiosyncracies in
> individual files. This is much easier to get right and to debug if you
> convert each eps to pdf separately than if you have problems with a
> document level conversion.

> 2. TeX has information that gets discarded in the dvi file but
> which can be used by pdftex.

> 3. pdf-->ps conversions are needed by many more people than use TeX,
> while conversions involving dvi files are only useful to a limited
> audience. There are more and better tools for pdf-->ps than for dvi-->X.
> As a case in point, the most common tool for dvi-->ps is dvips, which
> is based on a raster graphics model and so can have problems (even when
> using scalable outline fonts) if the ps file is scaled.

> 4. If all the programs with 'dvi' in their names stopped working, a few
> mathematicians would be annoyed but would soon learn to use pdf. If all
> the programs that work with 'pdf' files stopped working, CNN would cover
> the disaster 7/24. If we all stop using dvi files, a big whack of TeX
> code can be discarded and the people who have been maintaining programs
> with 'dvi' in the names can get back to solving more important problems.

5. People could stop bashing DVI and the people maintaining the related
programs and get back to caring about more important matters.

--
Stefan Ulrich

David Kastrup

Apr 8, 2004, 2:30:45 PM
"George N. White III" <aa...@chebucto.ns.ca> writes:

> On Mon, 5 Apr 2004, LEE Sau Dan wrote:
>
> > >>>>> "drv" == Didier Verna <did...@lrde.epita.fr> writes:
> >
> > drv> When I have to produce both pdf and ps formats, I usually
> > drv> build my documents directly: one version with latex and the
> > drv> other with pdflatex (always in different directories, to
> > drv> avoid problems with aux files that are different in both
> > drv> cases, most notably when using hyperref).
> >
> > You can still get unmatching .pdf and .ps files. Read the pdftex
> > manual. It tells you that pdftex does not necessary generate the same
> > layout as TeX. pdftex allows more flexibility in adjusting the
> > character spacing, etc., and hence may break lines differently than
> > Knuth's TeX. It doesn't occur that often, though.
>
> Pdftex can produce visually more even margins (by allowing some glyphs to
> protrude), which in turn allows you to use slightly narrower gutters in
> multi-column layouts. Not only does this save trees, it also gives
> effectively longer lines and so reduces the number of bad breaks, rivers,
> etc. This is especially helpful if you are trying to use a CM-based font
> in a layout originally intended for Times-Roman.

All of the above is completely irrelevant for the decision whether to
use DVI or PDF since it is all available by using

pdflatex "\pdfoutput=0" "\input whateverfile"

Mike Oliver

Apr 8, 2004, 2:35:38 PM
George N. White III wrote:

> Here are the arguments in favor of using "(pdf)latex --> pdf --> ps"
> over methods that use dvi files:

I'd love to use pdflatex, if only it would support eepic.

I have trouble understanding, though, why people still
want PS files if they already have the PDF. PS files are
bulkier, the tools for viewing them are not as good, and
they're more likely to send a printer's poor little processor
into tilt and make it consume half an hour per printed
page.

H. S.

Apr 8, 2004, 2:59:31 PM
Mike Oliver wrote:

> they're more likely to send a printer's poor little processor
> into tilt and make it consume half an hour per printed
> page.
>

I do all my work on Linux machines. Your comment above raises a doubt
in my mind. Isn't it the case that in Unix/Linux, when you print a PDF
file from acroread, the file is first converted to a ps file and only
then sent to the printer? If so, your comment above would be wrong.

Just wanted to clarify this.
thanks,
->HS
--
(Remove all underscores,if any, from my email address to get the correct
one. Apologies for the inconvenience but this is to reduce spam.)

Mike Oliver

Apr 8, 2004, 3:12:13 PM
H. S. wrote:
> Mike Oliver wrote:
>
>> they're more likely to send a printer's poor little processor
>> into tilt and make it consume half an hour per printed
>> page.
>>
>
> I do all my work on Linux machines. Your above comment created this
> doubt in my mind. Isn't is the case that in Unix/Linux, when you print
> to a printer from a PDF file in acroread, the file is first converted to
> a ps file and only then sent to the printer? If so, your comment above
> would be wrong.

I believe that this is the case, but no, it doesn't make the comment
wrong. Remember that PostScript is a programming language and
can code an arbitrary Turing machine. It's entirely possible that
acroread converts the PDF into PostScript -- but simple PostScript
that's easy for your printer's processor to deal with, whereas other
PostScript might be arbitrarily complex.

David Kastrup

Apr 8, 2004, 3:23:36 PM
"H. S." <g_reate_...@yahoo.com> writes:

> Mike Oliver wrote:
>
> > they're more likely to send a printer's poor little processor
> > into tilt and make it consume half an hour per printed
> > page.
> >
>
> I do all my work on Linux machines. Your above comment created this
> doubt in my mind. Isn't is the case that in Unix/Linux, when you
> print to a printer from a PDF file in acroread, the file is first
> converted to a ps file and only then sent to the printer? If so,
> your comment above would be wrong.

No, since a PDF file is converted into a PostScript file known to be
computationally inexpensive.

If I convert a program into English, I don't get Shakespeare, but
Cobol. A subset of English fit for morons and guaranteed not to make
use of its expressive powers.

H. S.

Apr 8, 2004, 3:30:13 PM
Mike Oliver wrote:

>
> I believe that this is the case, but no, it doesn't make the comment
> wrong. Remember that PostScript is a programming language and
> can code an arbitrary Turing machine. It's entirely possible that
> acroread converts the PDF into PostScript -- but simple PostScript
> that's easy for your printer's processor to deal with, whereas other
> PostScript might be arbitrarily complex.
>

That clarifies it. Thanks.

Michele Dondi

Apr 9, 2004, 3:01:02 AM
On 7 Apr 2004 21:22:17 GMT, r...@cl.cam.ac.uk (Robin Fairbairns) wrote:

>>> drv> http://www.research.att.com/~bs/whitespace98.pdf


>>
>>Yep. Note that the paper refers to a technical report from April 1st
>>and ends by referring to "Project April Fool". It's a joke.
>
>but like all the best jokes, it's perfectly believable. after all, if
>any language was going to do anything so daft as that, it would have

^^^^^^^^^^^^^


>to be c++ (does c## inherit c++'s inherent insanity?).

^^^^^^^^^


What about Perl?!?


Michele
--
>It's because the universe was programmed in C++.
No, no, it was programmed in Forth. See Genesis 1:12:
"And the earth brought Forth ..."
- Robert Israel on sci.math, thread "Why numbers?"

George N. White III

Apr 9, 2004, 8:26:28 AM
On Thu, 8 Apr 2004, David Kastrup wrote:

> "H. S." <g_reate_...@yahoo.com> writes:
>
> > Mike Oliver wrote:
> >
> > > they're more likely to send a printer's poor little processor
> > > into tilt and make it consume half an hour per printed
> > > page.
> > >
> >
> > I do all my work on Linux machines. Your above comment created this
> > doubt in my mind. Isn't is the case that in Unix/Linux, when you
> > print to a printer from a PDF file in acroread, the file is first
> > converted to a ps file and only then sent to the printer? If so,
> > your comment above would be wrong.
>
> No, since a PDF file is converted into a PostScript file known to be
> computationally inexpensive.
>
> If I convert a program into English, I don't get Shakespeare, but
> Cobol. A subset of English fit for morons and guaranteed not to make
> use of its expressive powers.

A final note of clarification: PostScript Level 3 supports PDF with
minimal translation. Older printers with Level 1 interpreters often choke
on PS files created from PDF, and there are sometimes problems with Level
2 printers. In some circles PDF has a bad reputation based on bugs in
early software and problems rendering PDF using old rasterizers. When
a PDF file is translated to PS, the driver generally just loads PS code
to define the PDF primitives. With current rasterizers this PS code is
fairly simple, but with older rasterizers the code is considerably more
complex and almost sure to give problems under stress.

H. S.

Apr 9, 2004, 3:56:57 PM
Apparently, _Mike Oliver_, on 04/08/04 15:12, typed:

>
> I believe that this is the case, but no, it doesn't make the comment
> wrong. Remember that PostScript is a programming language and
> can code an arbitrary Turing machine. It's entirely possible that
> acroread converts the PDF into PostScript -- but simple PostScript
> that's easy for your printer's processor to deal with, whereas other
> PostScript might be arbitrarily complex.

Forgive me if this question doesn't belong here, but I am hazarding a
guess that maybe the behaviour I am observing on my Debian machine may
be related to this. I had posted this earlier to this newsgroup, but got
no response.

When I convert a dvi document to ps using "dvips -Ppdf -t letter -z -G0
infile.dvi -o infile.ps" and then try to magnify an area in the
resulting ps file to a large magnification, it takes quite a while for
gv to render the magnified area. But if I leave the "-t letter" option
out, the magnified area renders fine. Does the driver have something to
do with this (which may be affected by the "-t letter" option)? Also,
can somebody try this and see whether it can be reproduced?

I am running Debian Sarge (2.4.24), tetex-base 2.0.2-6, gv 3.5.8-31,
gs 7.07-1 and dvips(k) 5.92b.

thanks,
->HS

--
(Please remove all underscores from my email address to get the correct
one. Apologies for the inconvenience, but this is to reduce spam.)

Patrick TJ McPhee

Apr 11, 2004, 10:23:33 PM
In article <bg7b70ptenbstjltq...@4ax.com>,
Michele Dondi <bik....@tiscalinet.it> wrote:
% On 7 Apr 2004 21:22:17 GMT, r...@cl.cam.ac.uk (Robin Fairbairns) wrote:
%
% >>> drv> http://www.research.att.com/+AH4-bs/whitespace98.pdf
% >>
% >>Yep. Note that the paper refers to a technical report from April 1st
% >>and ends by referring to "Project April Fool". It's a joke.
% >
% >but like all the best jokes, it's perfectly believable. after all, if
% >any language was going to do anything so daft as that, it would have
% ^^^^^^^^^^^^^
% >to be c++ (does c## inherit c++'s inherent insanity?).
% ^^^^^^^^^
%
%
% What about Perl?!?

Perl would have to do it twice, getting it wrong the first time.
--

Patrick TJ McPhee
East York Canada
pt...@interlog.com

Brooks Moses

Apr 11, 2004, 11:52:55 PM

I thought that was TeX, and only if the whitespace characters were made
active and were used to encode references to counters.

- Brooks


--
The "bmoses-nospam" address is valid; no unmunging needed.

LEE Sau Dan

Apr 12, 2004, 5:45:08 AM
>>>>> "Mike" == Mike Oliver <mike_s...@verizon.net> writes:

Mike> ... but simple PostScript that's easy for your printer's
Mike> processor to deal with, whereas other PostScript might be
Mike> arbitrarily complex.

Yes, they can be arbitrarily complex. That's the point of Postscript,
and it is why writing Postscript by hand is fun!

But out of 1000 Postscript print jobs, perhaps fewer than 5 contain
really complex Postscript code. Most application programs generate
simple Postscript. Only hand-crafted code would reach the complexity
that you're imagining. And how many people spend time hand-crafting
Postscript code for fun, as I do (see the ray-tracer code on my
homepage)?


So, your argument in <c5461m$2o4boe$1...@ID-136402.news.uni-berlin.de>

they're more likely to send a printer's poor little processor
into tilt and make it consume half an hour per printed page.

is only applicable once in a blue moon (not once in a blue screen).

LEE Sau Dan

Apr 12, 2004, 5:48:12 AM
>>>>> "H" == H S <g_reate_...@yahoo.com> writes:

H> When I convert a dvi document to ps using: dvips -Ppdf -t letter -z
H> -G0 infile.dvi -o infile.ps, and if I try to magnify an area in the
H> resulting ps file to a large magnification, it takes quite a while for
H> gv to render the magnified area. But if I leave the "-t letter" option
H> out, and magnified area rendering is okay. Does the driver have
H> something to do with this (which may be affected by "-t letter"
H> option)? And also, can somebody try this and see if this can be
H> reproduced?

That's mysterious to me. What does 'diff' say about the Postscript
files generated with and without "-t letter"?
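
Something along these lines would show where the two outputs differ
(a rough sketch):

    dvips -Ppdf -t letter -z -G0 infile.dvi -o letter.ps
    dvips -Ppdf -z -G0 infile.dvi -o plain.ps
    diff letter.ps plain.ps | head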

Does your 'gv' always show all the text and graphics, or are they
somehow chopped off (because they're outside the paper margin that
'gv' assumes)? That can make a difference, because clipped text and
graphics are not rendered and hence don't consume much processing
time. (The rendering code still needs to decide that they're clipped.)

LEE Sau Dan

Apr 12, 2004, 5:40:57 AM
>>>>> "Mike" == Mike Oliver <mike_s...@verizon.net> writes:

Mike> George N. White III wrote:
>> Here are the arguments in favor of using "(pdf)latex --> pdf --> ps"
>> over methods that use dvi files:

Mike> I'd love to use pdflatex, if only it would support eepic.

That's also the biggest barrier for me. The lack of support for literal
Postscript code and EPS figures (yes, I know epstopdf) is irritating.
I'm switching most of my drawings, etc. to METAPOST for its elegance,
and it's good news to learn that pdfTeX can include METAPOST figures
directly (as long as I don't insert literal Postscript with the
'special' command in METAPOST).

I like the elegance of Postscript, and I like writing Postscript
directly. Too bad that the EPS->PDF conversion means a loss of this
elegance. Compact, repetitive code gets expanded, and hence file size
gets inflated. That's ugly.


Mike> I have trouble understanding, though, why people still
Mike> want PS files if they already have the PDF.

lpr -Ppostscript_printer thefile.ps

and I get a printed copy. Or to save trees:

a2ps --sides=2 -4 -Ppostscript_printer thefile.ps


That's much easier with PS files than PDF.


Mike> PS files are bulkier,

only if you don't know how to gzip them.


Mike> the tools for viewing them are not as good,

I find them better. I hate Acrobat Reader (on X11). It doesn't
accept the standard Xlib parameters "-fg" and "-bg" for me to specify
the default BG and FG colours of the window. Its keyboard shortcuts
are not configurable. And it doesn't have as many keyboard shortcuts
as 'gv'. Etc. etc.

Moreover, 'gv' can display gzipped Postscript files directly. (I
think 'gsview' on Windows can do the same.)


Mike> and they're more likely to send a printer's poor little
Mike> processor into tilt and make it consume half an hour per
Mike> printed page.

This is simply FUD. I've been using Postscript for 10 years. Most of
the time, the printer's processor is waiting for the sheet-feeding
machinery. Only occasionally does the sheet-feeding machinery stop to
wait for the processor for a couple of seconds during the print job.

LEE Sau Dan

Apr 12, 2004, 5:32:36 AM
>>>>> "George" == George N White <aa...@chebucto.ns.ca> writes:


George> Here are the arguments in favor of using "(pdf)latex -->
George> pdf --> ps" over methods that use dvi files:

One big disadvantage for "advanced" Postscript programmers:

Compact Postscript code (such as fractals) will be expanded in this
final Postscript file, thanks to PDF's Turing-incompleteness. This
means an inflated final file size.

George> 2. TeX has information that gets discarded in the dvi file but
George> which can be used by pdftex.

Such as? hyperref can handle most of it already. You just need to
use the correct options of dvips.


George> 3. pdf-->ps conversions are needed by many more people
George> than use TeX, while conversions involving dvi files are
George> only useful to a limited audience. There are more and
George> better tools for pdf-->ps than for dvi-->X.

dvi->X? You mean X11, or X->anything?

Tools for dvi->ps conversion are very good, stable and versatile.
(e.g. the embedded T1 fonts contain only the glyphs actually used in
the document.)


George> As a case in point, the most common tool for dvi-->ps is
George> dvips, which is based on a raster graphics model and so
George> can have problems (even when using scalable outline fonts)
George> if the ps file is scaled.

You're wrong. With -Pcmz or -Ppdf in tetex, you get Type1 fonts
instead of rasterized fonts.


George> 4. If all the programs with 'dvi' in their names stopped
George> working, a few mathematicians would be annoyed but would
George> soon learn to use pdf.

The antecedent of this statement is so unrealistic. If that ever
happened, pdftex could also stop working.

Mike Oliver

Apr 13, 2004, 10:36:12 AM
LEE Sau Dan wrote:

> So, your argument in <c5461m$2o4boe$1...@ID-136402.news.uni-berlin.de>
>
> they're more likely to send a printer's poor little processor
> into tilt and make it consume half an hour per printed page.
>
> is only applicable once in a blue moon (not once in a blue screen).

It's not a theoretical argument; it's what I've observed. But
admittedly on a rather small sample size -- it could be a coincidence.

Michele Dondi

Apr 13, 2004, 11:25:55 AM
On Mon, 12 Apr 2004 04:23:33 +0200 (MEST), pt...@interlog.com (Patrick
TJ McPhee) wrote:

>% >but like all the best jokes, it's perfectly believable. after all, if
>% >any language was going to do anything so daft as that, it would have
>% ^^^^^^^^^^^^^
>% >to be c++ (does c## inherit c++'s inherent insanity?).
>% ^^^^^^^^^
>%
>%
>% What about Perl?!?
>
>Perl would have to do it twice, getting it wrong the first time.

Huh?!? Most probably, if it were to do it, it would allow one to
do it in at least ten different ways...


Michele
--
#!/usr/bin/perl -lp
BEGIN{*ARGV=do{open $_,q,<,,\$/;$_}}s z^z seek DATA,11,$[;($,
=ucfirst<DATA>)=~s x .*x q^~ZEX69l^^q,^2$;][@,xe.$, zex,s e1e
q 1~BEER XX1^q~4761rA67thb ~eex ,s aba m,P..,,substr$&,$.,age
__END__

George N. White III

Apr 14, 2004, 11:05:43 AM
On Mon, 12 Apr 2004, LEE Sau Dan wrote:

> >>>>> "George" == George N White <aa...@chebucto.ns.ca> writes:
>
>
> George> Here are the arguments in favor of using "(pdf)latex -->
> George> pdf --> ps" over methods that use dvi files:
>
> One big disadvantage for "advanced" Postscript programmers:
>
> Compact Postscript code (such as fractals) will be expanded in this
> final Postscript file, thanks to PDF's Turing-incompleteness. This
> means an inflated final file size.

Yes, even simple things like a scatter-plot with symbols suffer this
inflation. The problem is that the inflation will still occur when the
file is rendered. PDF files tend to have more predictable rendering times
than PS files, so typesetter operators avoid PS files that aren't created
by well-known applications (Photoshop, Illustrator) which produce flat PS
code similar to PDF.

> George> 2. TeX has information that gets discarded in the dvi file but
> George> which can be used by pdftex.
>
> Such as? hyperref can handle most of it already. You just need to
> use the correct options of dvips.

Information available to TeX macros can be put into \specials for
dvips, but pdftex can also get information from TeX's internals.

> George> 3. pdf-->ps conversions are needed by many more people
> George> than use TeX, while conversions involving dvi files are
> George> only useful to a limited audience. There are more and
> George> better tools for pdf-->ps than for dvi-->X.
>
> dvi->X? You mean X11, or X->anything?

$X$, the variable meaning whatever device or file format someone wants to
use. The common cases are $X$ = "Windows GDI", X11, PS, or PDF.

> Tools for dvi->ps conversion are very good, stable and versatile.
> (e.g. the embedded T1 fonts contain only the glpyhs actually used in
> the document.)
>
> George> As a case in point, the most common tool for dvi-->ps is
> George> dvips, which is based on a raster graphics model and so
> George> can have problems (even when using scalable outline fonts)
> George> if the ps file is scaled.
>
> You're wrong. With -Pcmz or -Ppdf in tetex, you get Type1 fonts
> instead of rasterized fonts.

Sure, but dvips lays out the page using a raster grid determined
by the resolution you specify. -Ppdf sets a high resolution, but
if you need to scale a PS file created with dvips this causes problems.

Y&Y's (commercial) dvipsone does produce scalable PS.

> George> 4. If all the programs with 'dvi' in their names stopped
> George> working, a few mathematicians would be annoyed but would
> George> soon learn to use pdf.
>
> The antecedant of this statement is so unrealistic. When that occurs,
> pdftex could also stop to work.

No, because pdftex is important enough that it would be fixed. The real
crunch will come for useful programs (tex4ht) that rely on .dvi
technology. At some point it becomes easier to change tex4ht, etc. than
to maintain the dvi-->X tools. People who use TeX mostly rely on
volunteer efforts, which means that the future development of TeX will
tend to follow a path of least effort. New tools will be created to solve
problems and old tools that no longer solve problems (because the best new
tools also solve older problems) will not always be maintained.

LEE Sau Dan

Apr 14, 2004, 5:14:41 PM
>>>>> "George" == George N White <aa...@chebucto.ns.ca> writes:

George> 2. TeX has information that gets discarded in the dvi file but
George> which can be used by pdftex.
>>
>> Such as? hyperref can handle most of it already. You just need to
>> use the correct options of dvips.

George> Information available to TeX macros can be put into \specials for
George> dvips, but pdftex can also get information from TeX's internals.

You still haven't specified which particular \specials are causing
problems. I have been using the hyperref package for some time. With
this package, I can insert document info such as author, title, etc.
(displayed in Acrobat Reader when you pop up the Document Info window
(Ctrl-D in some versions)). The dvips driver of hyperref inserts the
appropriate pdfmark operators so that ps2pdf can put that info into the
final PDF file. When you use pdflatex instead (thus using the pdftex
driver of hyperref), the macros are defined in such a way that the
same info is written to the output PDF file directly.

In either case, the document info is there in the final PDF. The
same is true for hyperlinks, cross-reference links, PDF form entry
fields, etc. Also thumbnails and bookmarks.
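
For example, with the ps2pdf driver a minimal sketch looks like this
(the strings are placeholders):

    \usepackage[ps2pdf]{hyperref}
    \hypersetup{pdfauthor={A. Author}, pdftitle={Some Title}}

With pdflatex you only change the driver option to [pdftex]; the rest
stays the same.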


So, what special \special's do you need, that aren't already covered
by some LaTeX packages?

(I'm even restricting myself to what tetex gives me: hyperref, color,
graphicx, etc. I hate installing things from CTAN, because that
decreases the portability of my documents.)


George> $X$, the variable meaning whatever device or file format
George> someone wants to use. The common cases are $X$ = "Windows
George> GDI", X11, PS, or PDF.

Unfortunately, X is also a short form for "X11", "the X window system".


George> Sure, but dvips lays out the page using a raster grid
George> determined by the resolution you specify. -Ppdf sets a
George> high resolution, but if you need to scale a PS file
George> created with dvips this causes problems.

Digital technology has quantization errors anyway.


George> Y&Y's (commercial) dvipsone does produce scalable PS.

Well... I haven't used dvipsone. But it's still limited by the
*finite* precision of floating point numbers. It may create better PS,
but how often do people really need that?


George> No, because pdftex is important enough that it would be
George> fixed. The real crunch will come for useful programs
George> (tex4ht) that rely on .dvi technology.

I believe there are more tools that rely on Postscript technology than
DVI. pstricks, EPS diagrams, etc. come to mind. (Yes, epstopdf is
helpful. But how about pstricks? I sometimes do \special{"{some
Postscript code}"} for some special effects that wouldn't be achieved
easily otherwise.) Until pdftex can support Postscript specials, many
users would stay with DVI+EPS. But that would be a big project.
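
(A tiny example of the kind of thing I mean, assuming the dvips
literal-Postscript special -- the coordinates are arbitrary:

    \special{" newpath 0 0 moveto 72 0 lineto stroke}

draws a one-inch rule with raw Postscript, which pdftex ignores with a
warning.)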


George> At some point it becomes easier to change tex4ht,
George> etc. than to maintain the dvi-->X tools. People who use
George> TeX mostly rely on volunteer efforts, which means that the
George> future development of TeX will tend to follow a path of
George> least effort. New tools will be created to solve problems
George> and old tools that no longer solve problems (because the
George> best new tools also solve older problems) will not always
George> be maintained.

True. That's why I believe in the philosophy of free software. Free
programs undergo natural selection. Only the best traits can survive
through the generations (== versions).


--
Lee Sau Dan 李守敦
