On 27/03/12 19:51, Gabriel wrote:
> With this little ranting about LaTeX & Co, I would just like to
> spark the discussion as to where Latex & Co should go.
It's always good to start a rant, unless you're a troll :-)
> I have long hesitated whether or not I should release these
> ramblings into the public, because I didn't want to tread on
> anybody's toes, or frustrate anybody (or get flamed ;-) ).
This is Usenet. Since when did anyone worry about that? :-)
> [...] I'm afraid that won't happen unless some development efforts
> for TeX/LaTeX et al. are focused on the "real" problems.
Have you looked at the LaTeX3 Project?
> In a nutshell, how do you convince someone that they should use
> LaTeX? (instead of Framemaker, InDesign, Word, etc.) Especially
> someone, whose documents do *not* contain math (or very little)?
> Yeah, right, there is the superior typographic quality of TeX's
> paragraph typesetting algorithm -- but read on ...
Automation. I use LaTeX as an API for creating PDFs. All the other
systems have some degree of pattern-following ability, but none of them
are actually typographic programming languages. Frame, ID, Quark, etc
are fine if you want each page to be designed individually: magazines,
for example. 3B2 (or whatever it's called now) has the programmability
(originally based on TeX) but without the widespread community support.
30 years of GUIs has left the majority of computer users believing that
direct-intervention synchronous typographic interfaces (what people
inaccurately call WYSIWYG) are the *only* interface in existence. Even
some so-called "IT professionals" are entirely ignorant of things like
command lines and character-cell editors.
The result is that users assume that you have to see a document
displayed in its final form on the screen in order to do anything with
it. I have even had Linux users (!) open a PDF in Acrobat Reader or
equivalent, just in order to print it, instead of using the lp or lpr
command.
If you believe that you have to work that way, and your pages are
individually designed, you are probably happier with one of the standard
commercial page-design programs. Once you start finding that you are
doing the same task over and over again, you *might* discover macros and
keystroke-recorders, and at that stage you are ripe for discovering
[La]TeX. But many users dislike learning, and prefer to continue to do
stuff the long way round by hand instead; and many organisations are so
IT-illiterate that they tolerate this approach.
> 1) Insufficient figure placement algorithm.
This is a known defect. It should give a *much* higher preference to
placing them [h] before considering [t], [b], or [p]. In fact, judicious
re-setting of the page-fraction values can often fix this.
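A minimal sketch of that re-setting, for the preamble (the values here are illustrative, not canonical; tune per document):

```latex
% Loosen LaTeX's default float-placement thresholds so that
% [h] and [t] placements succeed more often.
\renewcommand{\topfraction}{0.85}       % max fraction of a page for top floats
\renewcommand{\bottomfraction}{0.70}    % max fraction for bottom floats
\renewcommand{\textfraction}{0.15}      % min fraction that must remain text
\renewcommand{\floatpagefraction}{0.66} % min fill of a float-only page
```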
> The idea could be to allow figures to move forward and backward while
> being attached to the first reference (or some specially designated
> "anchor" point). The attachment to the anchor would be a "rubber
> band". Then, the user could have a few optimization parameters,
> like: "stiffness" of the rubber bands, number of figures allowed on a
> page, etc.
That sounds like a useful start to a new algorithm.
> In addition, Latex should swap floats if that allows for better
> placement in the sense of the above sketched optimization task.
If that means changing the order, then I think that's A Bad Idea. An
author finding the figure or table she expected to be 1 coming out as 2
is going to ditch LaTeX pretty fast.
> Maybe, figures & images should even become "first-class citizens" in
> TeX, not just Latex.
That would mean implementing the concept of a float in TeX, and I don't
think that's going to happen.
> This might (I guess) also help with other figure-related problems,
> such as wrapping figures.
I have long considered that floats should have a controllable width, so
that small (narrow) figures and tables could be wrapped in text. This
would mean that they need to be re-instantiated as equivalent to a
character, like a tabular environment is, and that the presence of such
an object in mid-textstream (suitably labelled for l/c/r positioning)
should make it wrap automatically.
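Something close to this already exists at the macro level: the wrapfig package wraps text around a narrow float, though it is fragile near page breaks. A sketch (the image filename is hypothetical):

```latex
\usepackage{wrapfig}
% ...
\begin{wrapfigure}{r}{0.4\textwidth} % r = flush right; second arg = wrap width
  \centering
  \includegraphics[width=0.38\textwidth]{narrow-diagram}
  \caption{A narrow figure with text wrapped around it.}
\end{wrapfigure}
```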
> 2) Stagnant typographic quality.
>
> It seems to me that there are no major advancements of TeX/LaTeX in
> the typographic direction. Please let me know if I'm wrong.
The work on microtype adjustments in pdflatex is one example.
> For instance, what about the HZ algorithms?
I think someone has implemented this in LuaTeX.
> What about hanging punctuation?
Do-able by hand from the start, but I think the microtype adjustments in
pdflatex include it.
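For anyone who wants it without hand-work, loading microtype is a one-liner (character protrusion gives the hanging punctuation; expansion needs the pdftex engine):

```latex
% Protrusion hangs punctuation into the margin; expansion
% micro-adjusts glyph widths to improve line breaking.
\usepackage[protrusion=true,expansion=true]{microtype}
```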
> What about context-sensitive kerning?
Difficult when the engine (TeX) only knows the height and width of the
character. Doesn't XeTeX implement this?
> What about full support for OpenType (including features such as
> Titling Alternates, Superscripts/Subscripts, or Fractions)? What
> about 3-letter kerning?
XeTeX. Coming to a distribution near you.
> But let's face it, how many users of LaTeX (who just want to get
> their job done) know about CTAN, let alone consider installing an
> extra package to improve the typographic quality?
Most of the commonly-used ones are preinstalled in modern distributions,
so it's just the effort of typing \usepackage -- and the knowledge of
what each of the 4,233 packages currently listed at
http://ctan.org/pkg
can do.
Jim Hefferon maintains a page at
http://ctan.org/edit_keywords/front/
where you can suggest additional characterisations for a package (for
example "typographic").
> 4) TeX's native graphical capabilities are pitiful.
>
> I believe this is one of the major reasons why graphics never really
> has been integrated with TeX/LaTeX.
Right. But it's a typesetter, not a drawing package. Line art, and even
more, halftone art, has no place in a typesetter. I think what you are
looking for is an editing interface which lets you add artwork
seamlessly as if it was native. That's perfectly do-able (even Word
manages it, rather crudely). It just needs someone to write the code.
> The same holds for bitmap graphics. TeX has never been able to import
> bitmap graphics, which is a problem in terms of smooth workflow.
I'm not clear what this means. \includegraphics has no problems with
bitmaps, either EPS or PDF/JPG/PNG depending on your output engine.
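For the record, the standard incantation; graphicx picks the right file for the engine if you leave the extension off:

```latex
\usepackage{graphicx}
% ...
% pdflatex accepts PDF/JPG/PNG; classic latex+dvips wants EPS.
\includegraphics[width=0.8\textwidth]{photo} % extension omitted deliberately
```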
If you mean you want the bitmap binary code embedded in the source, then
you'd need a kludge like Word originally tried with the first cut at
OOXML (Word 2003), where it embedded the Base64 in the XML. Even they
saw the light and removed it to a media subdirectory when they
implemented the .docx zip format.
> However, the world is becoming more and more graphical. Why shouldn't
> bitmap and line art graphics be incorporated into TeX and/or LaTeX
> native?
Technically they could, of course, using Base64 or equivalent. I just
don't want to have to scroll past that lot when editing in a
non-typographic editor.
> IMHO, TeX & Co., will eventually become extinct if they won't be
> able to import and understand the common graphics file formats
> natively(!), like SVG, EMF, JPEG, PNG, maybe even some proprietary
> binary formats like CorelDraw or Visio.
SVG should be no technical problem as it's XML. EMF is a pain in the
butt, and easy to convert to PDF, but I take your point. In either case
someone just needs to write the code. JPG and PNG have already worked
for years in pdflatex. Proprietary binary formats usually need a license
fee paying, and may be encumbered with legal restrictions. I'm
unconvinced about that.
I'm still worried about your use of the word "import", though. It
implies that the image would become part of the .tex file, which is
probably A Bad Idea (see note about .docx files above).
> I believe, so long as TeX doesn't understand line graphics natively,
> including annotated line art will always be cumbersome and just not
> quite 100% functional.
Again, this is an interface problem. There is no reason why someone
can't write an editor that does precisely this, so that callouts can be
attached and will remain with the image when you move it.
> Especially, if you want to exchange documents together with drawings
> and bitmaps across platforms and TeX distributions.
Perhaps a .texx format would be a good idea: a zip file like .docx and
.odt, containing the .tex file, any non-CTAN classes or packages, and
any images or other data.
> The PGF package might be a good starting point. That way, other
> software would have a standard way to export graphics to LaTeX (such
> as Gnuplot, Matlab, drawing editors, etc. etc.).
The nice thing about standards is that there are so many to choose from.
I don't use PGF but I believe it is very good. My concern would be that
it is the format du jour, and in 2020 will need to be scrapped in favour
of something else, thereby invalidating millions of documents.
There are already robust and well-supported vector formats in EPS and
PDF, which are exportable and importable from graphics packages like
Corel Draw and Inkscape. I can't speak for Gnuplot and Matlab, but
surely the mathematicians and engineers of the world could come up
with something.
> And, in the long term, it would even be possible to integrate a
> drawing editor into a "LaTeX IDE". (I am envisioning something like:
> I double click on a \begin{drawing} environment, and up pops a
> drawing editor, and when I close the drawing program, the latex code
> reflecting the drawing is changed in the original Latex source ...)
As I said, this is an editor interface problem, solvable by someone just
writing it.
> 5) Aging of TeX:
>
> Today, many more optimizations could be performed (possibly
> optionally switched on by the user).
I think both XeTeX and pdflatex do some of this already.
> In addition, some typographic points might not have been known to
> Knuth, or they might have emerged only in the recent past. For
> instance, hanging punctuation.
That has been around since the early days of printing.
> (TeX's age is probably also the reason why it never has gotten
> powerful graphical capabilities --- in an era when X and Postscript
> etc. were a long time away, such things were just not practically
> possible...)
TeX: 1978
PS: 1982
X: 1984
> Another problem resulting from this "cast-in-stone" "license"
This really is the core of it, and I think it's the principal reason
behind NTS and LaTeX3.
> Hyperref [...] But, in the 21st century, hyperlinks should be an
> integral part of any kind of document preparation system, shouldn't
> it?
Yep. Like \includegraphics{http://www.foo.bar/image.png}
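Meanwhile hyperref gets you most of the way today (the URLs here are placeholders):

```latex
\usepackage{hyperref} % load last, as its documentation advises
% ...
\href{http://www.foo.bar/}{the foo.bar site} % clickable text link
\url{http://www.ctan.org/}                   % displayed, clickable URL
```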
> Another issue is the extreme "backwards compatibility" (I would like
> to say "paranoia"). Here is a quote that expresses this quite nicely
> with the example of the bitmapped fonts: "Bitmapped fonts are to
> typesetting what punch-card machines are to digital storage. They
> were necessary at one time, when no other viable technology was
> available, but they have long since been made obsolete. That they
> are still the default ... is at best a sign of laziness and
> conservatism among the latex crowd, or at worst an inept expression
> of adoration for Knuth."
I think this demonstrates a fairly fundamental misunderstanding: TeX
neither knows nor cares how your output driver instantiates a glyph. All
it wants is the height and width, and some ancillary information like
kerning, and that comes from the .tfm file.
It's the output driver that worries about font file formats, and as I
understand it, all the current distributions of pdflatex come with Type
1 outlines as standard, and the generation of bitmaps hasn't been the
default for some considerable time.
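And anyone still seeing bitmapped Computer Modern in their PDFs can switch to outlines in two lines:

```latex
\usepackage[T1]{fontenc} % 8-bit font encoding
\usepackage{lmodern}     % Latin Modern: Type 1 outline clones of CM
```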
> The same is true for the 8.3 naming scheme everywhere.
Nothing in TeX requires an 8.3 filename as far as I know. It probably
*does* assume a dot between the filename and the extension, though.
What does need stamping on hard, though, is spaces in filenames :-)
> 6) Excessive diversification:
>
> Another consequence of the "cast in stone" license of TeX
> seems to be an excessive diversification.
>
> There is TeX/LaTeX, and then there are all these other derivatives
> like Omega, eTeX, pdfTeX, ConTeXt, ant, Alpha, and what not.
This is experimentation. It's endemic to free software. The successful
implementations survive, and the unsuccessful die off.
> 7) Fonts:
>
> Over the years, I have literally spent several man weeks in total to
> install fonts for Latex.
I'm sorry to hear that. I install new fonts frequently, according to
customer requirements. My bash font installer for Type 1 fonts (cdvf)
has been available for years, and I wrote about it in TUGboat. OK, so
it's Linux only, and it's based on the ancient Bitstream 500-font
CD-ROM, but that's a couple of changes to directory names in the code.
This not only makes the font files and installs them, but writes the
.sty and .fd files, and updates the font cache.
> Yes, there is the excellent fontinst manual and macros.
>
> Still, how many people succeed in installing a new font?
Very few.
> And of those that do, how many would succeed without comp.text.tex?
> ;-) Looking at comp.text.tex, it doesn't seem too many, IMHO.
> [Footnote: Actually, I think, without all these lots of helpful and
> kind people on comp.text.tex, LaTeX et al. would be pretty much dead
> by now ...]
Yep.
>
> Is it really necessary to read and understand 30+ pages of a manual,
> just to install a font? I shouldn't think so ....
No. Just typing cdvf <foundry> <fontname> <filename>,<filename>,...
should be enough.
> Or, as another example, take the font naming scheme: why on earth
> should one rename fonts and font files?! Just because of the 8.3 file
> name length limit?
It has absolutely nothing whatsoever to do with the 8.3 filename
restriction (which I agree should be trashed). It's to do with the idea
of embedding the font attributes in the font filename, instead of in Yet
Another Ancillary File (eg .info).
Karl would be the first to admit that one letter for the foundry (maker)
and two letters for the typeface is probably inadequate now.
No problem: write an updated spec, instantiate it in code, and post it
somewhere for people to try.
But bear in mind that XeTeX simply makes this problem go away.
> I think it would be about time for LaTeX/TeX/xdvi to use and support
> some standards natively, like afm/pfb, ttf, OpenType, and all the
> fonts that are installed in my platform (Mac, Windows, Linux) already.
> The latter, in particular, should be supported by default.
You do appear to be unaware of XeTeX.
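Under XeTeX, any system-installed OpenType or TrueType font is available by name via fontspec, features included. A sketch (the font name is an example; use whatever is installed on your system):

```latex
% Compile with xelatex.
\usepackage{fontspec}
\setmainfont[Ligatures=TeX, Numbers=OldStyle]{Linux Libertine O}
```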
> 8) Umlauts.
>
> We are well in the 21st century, and I really think, it is high time
> that LaTeX, TeX, Bibtex, et al., use Unicode (UTF-8) once and for
> all and everywhere and from the ground up.
Yes, I couldn't agree more. Unfortunately it's only recently that some
operating systems have deigned to use UTF8 instead of their native crap
(I mention no names, MacRoman8 and Windows-1252 :-) and a large number
of computer scientists are only vaguely aware of anything other than
US-ASCII.
Knuth was very accepting of the plea to go to 8bit, way back in the dawn
of time (Exeter Conference?). Perhaps an approach to move to native
multibyte would also work. But then, NTS and LaTeX3 will have a view on
this already.
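In the meantime, UTF-8 input already works in standard LaTeX for most Western scripts:

```latex
\usepackage[utf8]{inputenc} % pdflatex: decode UTF-8 input
\usepackage[T1]{fontenc}    % so ü, ß, etc. come out as real glyphs
% XeTeX and LuaTeX read UTF-8 natively; no packages needed there.
```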
> 9) "Standard" packages.
>
> One of the big strengths of Latex is CTAN, its maintainers,
> and, in particular, all the wonderful people who contribute packages
> and help on comp.text.tex!
>
> However, for a newbie it is pretty time-consuming to find the package
> needed for the problem at hand.
The latest changes to the CTAN interface make it much easier.
But as I said, all current full distributions include pretty much
everything that most users want.
> Most of the time, that newbie has no idea how to install a package
> (where *is* that "suitable place where latex can find the .sty
> file"?).
This is unaccountably missing from most documentation. Now that TDS has
stopped moving around so much, it's time all documenters agreed:
Unix and GNU/Linux: ~/texmf/tex/latex/<packagename>
Mac OS X: ~/Library/texmf/tex/latex/<packagename>
Windows: C:\texmf\tex\latex\<packagename>
On Unix-based systems (Linux and OS X) there is no need to run texhash
afterwards. On Windows, I know MiKTeX requires the tree to be added to
its config, and updated with the FNDB. I assume TeX Live does not
require this.
> Unless you write plain math
Why math? I never use the stuff.
> texts with no figures and tables and the
> likes, you always need a handful of other packages. So why not just
> maintain a set of packages, that are integrated into every Latex
> format?
They are. What on earth are you using?
> 10) Acceptance among the young.
>
> As a Latex user, one suddenly has to deal with things no one ever has
> to deal with when using another word processor: going to CTAN to find
> a package that offers a certain functionality, or to try to find the
> documentation for a certain feature that is provided by a certain
> package.
On the contrary, I know Word and OO/LO users who do this regularly,
installing plugin after plugin...
> Now, if that is so difficult, what is the future of LaTeX among the
> young generation, who has grown up with iPhone and video games? They
> take this kind of ease-of-use for granted.
This is why the auto-download-and-install was implemented in MiKTeX, and
now as tlmgr in TeX Live. You do seem to be rather out of date with
your information.
> 11) No decent "IDE".
Agreed. You just need to write one. LyX is clever, but exposes far too much.
> Summary:
*plonk*
///Peter