TeX memory limits

Anthony Kramer
Jul 18, 2002, 2:08:11 AM
Hi,

I have run into one of TeX's memory limits and am unable to raise it.
This occurs with some LaTeX documentation generated by doxygen. The
error I got was:
! TeX capacity exceeded, sorry [grouping levels=255].
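
For reference, here is a minimal file (a sketch, not the doxygen output
itself) that reproduces the same error by opening groups that are never
closed:

  \documentclass{article}
  \begin{document}
  % each call opens a group that is never closed, so the grouping
  % depth grows until TeX stops with the same [grouping levels=255]
  \def\nest{\begingroup\nest}
  \nest
  \end{document}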


Anthony Kramer
Jul 18, 2002, 2:12:23 AM
I have looked in my texmf.cnf file (/usr/share/texmf/web2c/texmf.cnf)
and can't find any setting corresponding to grouping levels.

Thanks in advance,
Anthony Kramer

"Anthony Kramer" <Anthony...@btinternet.com> wrote in message
news:ah5m0b$aop$1...@knossos.btinternet.com...

Robin Fairbairns
Jul 18, 2002, 6:58:27 AM

this is such an unusual error message that the natural assumption is
that there's an error in the document: i don't know about doxygen (i
never need doxies) so i don't know about its reliability.

it might be instructive to run the document under elatex (latex
compiled under e-tex) with \tracinggroups=1 (or higher).
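
something like this in the preamble would do (assuming an e-tex-based
latex; \tracingonline is optional and just echoes the trace to the
terminal as well as the log):

  \tracinggroups=1
  \tracingonline=1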

this will give you lots of stuff in the log about how you get to this
enormous level of grouping. my guess is that you'll see an obvious
pattern -- some class of doxy object produces one more "begin group"
thing than it produces "end group" things. in such a situation, you
would need to increase the number of grouping levels according to the
length of your document, which is plainly silly (it's reasonable to
increase memory linearly for semantic things like labels and so on,
but not for syntactic things like groups).
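
for illustration, the broken shape would be something like this (the
environment name is invented):

  % hypothetical: the begin-code opens two groups but the end-code
  % closes only one, so each use leaks one grouping level
  \newenvironment{doxyitem}
    {\begingroup\begingroup\small}
    {\endgroup}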

armed with that information, you're in good shape for solving the
problem.
--
Robin Fairbairns, Cambridge -- rf10 at cam dot ac dot uk

Anthony Kramer
Jul 18, 2002, 11:26:26 AM
Despite my limited knowledge of TeX, it appears that the code generated by
doxygen is sane (judging from the doxygen source code, and the fact that it
used to work fine). There is just rather a lot of it. This document has
grown over the past few weeks. Until recently it did work fine, but it
should now be around 2000 pages. There is most likely a simple configuration
change required to raise the number of groups allowed. TeX uses about 40MB
of RAM before it gives up, so I could raise this limit considerably, as I
have 1GB of RAM.

Any suggestions?

Thanks in advance,
Anthony

"Robin Fairbairns" <r...@cl.cam.ac.uk> wrote in message
news:ah670j$sku$1...@pegasus.csx.cam.ac.uk...

Danilo Šegan
Jul 18, 2002, 8:37:15 PM
Anthony Kramer wrote:
> Despite my limited knowledge of TeX, it appears that the code generated by
> doxygen is sane (judging from the doxygen source code, and the fact that it
> used to work fine). There is just rather a lot of it. This document has
> grown over the past few weeks. Until recently it did work fine, but it
> should now be around 2000 pages. There is most likely a simple configuration
> change required to raise the number of groups allowed.

It seems that TeX uses an internal 8-bit variable to count group
nesting (I haven't checked the source, but this conclusion seems
reasonable based on the error you've got -- any guru may correct me).
That means there is no way to get any deeper group nesting (if the
above is true), so a few alternatives come to mind.
1. Use Omega, which should have replaced all 8-bit quantities with
16-bit ones: from registers to font sizes etc. -- probably group
nesting too.
2. Divide the file into smaller chunks and process them separately (I
don't know how well this works with LaTeX because of its "structured"
documents; see the first sketch after this list).
3. Try to reduce group nesting: do you really need to enclose that
group inside that group inside that group inside that group, just to
get the effect of one letter being bold, the second italic, the third
smallcaps, etc.? Quite a few situations are equivalent to this one and
can be solved by "spreading" the grouping out: put the first part in
one group, the next in another, et cetera (see the second sketch after
this list). Since you're using software which generates the files
automatically, diving into its source could be inevitable.
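
A sketch of option 2 (the chunk names are hypothetical): \includeonly
lets you process one chunk per run while keeping the page numbers and
cross-references of the whole document:

  \documentclass{article}
  \includeonly{chapter1}   % process only this chunk in this run
  \begin{document}
  \include{chapter1}
  \include{chapter2}
  \end{document}

And a sketch of option 3, spreading the groups out:

  % deeply nested: three grouping levels open at the innermost point
  {\bfseries a{\itshape b{\scshape c}}}
  % spread out: at most one grouping level open at any point
  {\bfseries a}{\itshape b}{\scshape c}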


> TeX uses about 40MB
> of RAM before it gives up, so I could raise this limit considerably, as I
> have 1GB of RAM.
>

It's not RAM that's causing the trouble, but rather the grouping-level
limit, as "guessed" and explained above.

> Any suggestions?
>

There they are, listed from 1 to 3 :)

> Thanks in advance,
> Anthony
>

No problem, I always enjoy sharing my ignorance :)

Regards,
Danilo

Robin Fairbairns
Jul 19, 2002, 3:09:31 AM
Danilo Šegan <mm0...@alas.matf.bg.ac.yu> writes:
>Anthony Kramer wrote:
>> Despite my limited knowledge of TeX, it appears that the code generated by
>> doxygen is sane (judging from the doxygen source code, and the fact that it
>> used to work fine). There is just rather a lot of it. This document has
>> grown over the past few weeks. Until recently it did work fine, but it
>> should now be around 2000 pages. There is most likely a simple configuration
>> change required to raise the number of groups allowed.
>
>It seems that TeX uses an internal 8-bit variable to count group
>nesting (I haven't checked the source, but this conclusion seems
>reasonable based on the error you've got -- any guru may correct me).

you may well be right. i've not examined that bit of code either.

>That means there is no way to get any deeper group nesting (if the
>above is true),

>[and more good common sense, snipped]

the op chooses not to pursue the line of investigation i suggested; i
think analysing his latex-generator is probably the only way out of
this problem. he doesn't even tell us what proportion of his document
is processed at the point the collapse happens, so we can't judge
whether increasing the permitted number of grouping levels will help.

in any case, the number of grouping levels isn't the only thing that's
going to run out: save stack will presumably be the next to go, but
code that runs away with grouping is probably profligate with other
usage.
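
for completeness: the save stack is one of the limits you *can* raise
in texmf.cnf (the value below is only an example); the number of
grouping levels, by contrast, seems to be compiled into the binary, as
guessed above:

  % in texmf.cnf (web2c):
  save_size = 30000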

Michael J Downes
Jul 19, 2002, 8:06:47 AM
"Anthony Kramer" <Anthony...@btinternet.com> writes:

> Despite my limited knowledge of TeX, it appears that the code generated by
> doxygen is sane (judging from the doxygen source code, and that fact that it
> used to work fine).

Despite my limited knowledge of doxygen, the fact that no one in the
world ever ran into this particular kind of TeX memory limit before makes it
appear that the handling of TeX groups (either in doxygen, or in the way
you are using it) *is* insane :-).

The error message that you are getting is analogous to running out of
stack by calling a (non-tail-)recursive function too many times. The
standard answer is to flatten the code into an iterative equivalent,
not just to keep increasing the stack.
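
In TeX terms the contrast looks like this (macro names invented for
illustration):

  % "recursive" shape: each item opens a grouping level and never
  % releases it, so the depth grows with the length of the document
  \def\itemgrows#1{\begingroup\itshape #1}
  % "iterative" shape: the level is released before the next item,
  % so the depth stays constant however long the document gets
  \def\itemflat#1{\begingroup\itshape #1\endgroup}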

> There is just rather a lot of it. This document has
> grown over the past few weeks. Until recently it did work fine, but it
> should now be around 2000 pages. There is most likely a simple configuration
> change required, to raise the number of groups allowed.

From the fact that "until recently it did work fine" you would like
to hypothesize that the doxygen use of TeX is in fact all right, and a
straightforward increase in memory allocation is called for; but it
seems more likely to me that there is a bug in doxygen which causes an
essential \endgroup or \egroup to be omitted somewhere along the line,
and although you were triggering the bug before too, the number of times
that you triggered it was less than 255.

If that is not the case then perhaps doxygen is simply building too deep
a tree. Does the structure of your documentation really call for a tree
256 levels deep? You have leaf nodes that are 256 levels below the root?
