
LaTeX Error ! TeX capacity exceeded, sorry [.....]


ega...@feathers.phys.unm.edu
Mar 30, 1998

I am writing a long document and tried to add an 8th appendix to the
document, resulting in the following error:

! TeX capacity exceeded, sorry [number of strings=3449].
\IfFileExists #1#2#3->\openin \@inputcheck #1
\ifeof \@inputcheck \ifx \inpu...
l.10 \@input{app6.aux}


Everything appears to work ok regardless, except that it doesn't include the
bibliography in the Table of Contents, though it is included in the document.

Any suggestions on how to fix this, or where my problem really lies?


ega...@feathers.phys.unm.edu
Mar 30, 1998

In article <1d6q6r4.17x...@p40.nas1.is2.u-net.net>,
Rebecca and Rowland <real-addr...@rhu.barb.foobar> wrote:
>Try asking again showing the section of document that causes the
>problem. I can't work out anything from the above (not that I'm an
>expert or anything)
>
>Rowland.
>


There is no section of the document that causes the problem; each section
will latex successfully on its own. The problem arises when I put it all
together (where the final appendix appears to be the critical mass causing the
error). I have determined that in fact it is a memory problem. The
following is from the .log file:

Here is how much of TeX's memory you used:
3448 strings out of 3449
39474 string characters out of 49610
112756 words of memory out of 262141
4916 multiletter control sequences out of 9500
12359 words of font info for 43 fonts, out of 72000 for 255
14 hyphenation exceptions out of 607
28i,17n,26p,240b,577s stack positions out of 300i,40n,60p,3000b,4000s

Anyway, what I need now is help in changing the strings buffer size to make it
large enough to accommodate what I'm doing. There is no LaTeX wizard here, so
any pointer to which file contains the buffer sizes, or to whatever needs to
be modified, would help.

Any advice or help would be appreciated.

Elinor
ega...@unm.edu


Rebecca and Rowland
Mar 31, 1998

<ega...@feathers.phys.unm.edu> wrote:

> I am writing a long document and tried to add an 8th appendix to the
> document resulting in the following error:
>
> ! TeX capacity exceeded, sorry [number of strings=3449].
> \IfFileExists #1#2#3->\openin \@inputcheck #1
> \ifeof \@inputcheck \ifx \inpu...
> l.10 \@input{app6.aux}
>
>
> Everything appears to work ok regardless, except that it doesn't include the
> bibliography in the Table of Contents, though it is included in the document.
>
> Any suggestions on how to fix this, or where my problem really lies?

Try asking again showing the section of document that causes the
problem. I can't work out anything from the above (not that I'm an
expert or anything)

Rowland.

--
Remove the animal for my email address: reb...@astrid.dog.u-net.com
Sorry - the spam got to me. PGP pub key A680B89D
UK biker? Join MAG and help keep bureaucracy at bay
http://dredd.meng.ucl.ac.uk/www/mag/mag.html

Rebecca and Rowland
Mar 31, 1998

<ega...@feathers.phys.unm.edu> wrote:

> In article <1d6q6r4.17x...@p40.nas1.is2.u-net.net>,
> Rebecca and Rowland <real-addr...@rhu.barb.foobar> wrote:

> ><ega...@feathers.phys.unm.edu> wrote:
> >
> >> I am writing a long document and tried to add an 8th appendix to the
> >> document resulting in the following error:
> >>
> >> ! TeX capacity exceeded, sorry [number of strings=3449].
> >> \IfFileExists #1#2#3->\openin \@inputcheck #1
> >> \ifeof \@inputcheck \ifx \inpu...
> >> l.10 \@input{app6.aux}
> >>
> >>
> >> Everything appears to work ok regardless, except that it doesn't
> >> include the bibliography in the Table of Contents, though it is
> >> included in the document.
> >>
> >> Any suggestions on how to fix this, or where my problem really lies?
> >
> >Try asking again showing the section of document that causes the
> >problem. I can't work out anything from the above (not that I'm an
> >expert or anything)
> >
> >Rowland.
> >
>
>

> There is no section of the document that causes the problem,

Yes there is - the error is caused by a segment of LaTeX code.

> each section
> will latex successfully on its own. The problem arises when I put it all
> together (where the final appendix appears to be the critical mass causing the
> error).

Righto - so can you post the section of code you're using to input all
these files to produce the final document?

> I have determined that in fact it is a memory problem.

No you haven't - you can't be sure of this until you are sure that there
is no mistake in the document. This has not been determined.

> The
> following is from the .log file:
>
> Here is how much of TeX's memory you used:
> 3448 strings out of 3449
> 39474 string characters out of 49610
> 112756 words of memory out of 262141
> 4916 multiletter control sequences out of 9500
> 12359 words of font info for 43 fonts, out of 72000 for 255
> 14 hyphenation exceptions out of 607
> 28i,17n,26p,240b,577s stack positions out of 300i,40n,60p,3000b,4000s
>
> Anyway, what I need now is help in changing the strings buffer size to make it
> large enough to accommodate what I'm doing. There is no LaTeX wizard here, so
> any help on which file contains the buffer sizes or whatever needs to be
> modified.

Well over 90% of problems which result in TeX running out of memory are
in fact caused by an error in the document. Increasing the size of the
buffer can't help in such cases.

The first thing to do is examine the code for errors, which is why I
asked you to post the code here. If there are no errors, then you
should try increasing the appropriate parameter.

So, as I say, post the code here so we can have a look at it.

David Carlisle
Mar 31, 1998


R & R wrote


> >> ! TeX capacity exceeded, sorry [number of strings=3449].

> Well over 90% of problems which result in TeX running out of memory are


> in fact caused by an error in the document. Increasing the size of the
> buffer can't help in such cases.

Whilst that is true, I would guess that this is one of the 10%.
3449 strings is not very much: that is only 3500 command names, file
names, *cross-references in the .aux file*, etc.

For comparison I get

bash$ latex x
This is TeX, Version 3.14159 (Web2C 7.2)
(/home/red5/davidc/local/share/texmf/tex/latex/tools/x.tex
LaTeX2e <1997/12/01> patch level 2

bash$ tail x.log


Here is how much of TeX's memory you used:

4 strings out of 10982
115 string characters out of 73418
42939 words of memory out of 263001
2952 multiletter control sequences out of 10000+0
3640 words of font info for 14 fonts, out of 200000 for 1000
14 hyphenation exceptions out of 1000
4i,0n,1p,75b,7s stack positions out of 300i,100n,500p,30000b,4000s

No pages of output.


The only current tex implementation that I know of that is that small
is the _small_ emtex (ie emtex for a sub-386 PC). If the original poster
is using emtex on a 386 or better, he should use tex386.exe rather than
tex.exe, and can then set the sizes of these arrays to something more
manageable.

If really using a small system, remake the latex format following the
advice in autoload.txt _and_ tex2.txt, which will keep latex as small as
possible, leaving more room for your cross-references.

Also, you can \input each of the sections rather than \include them;
this can save quite a bit, as latex no longer needs to save the
checkpoint information required to skip over non-included sections.
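In the master file that change is a one-line edit per chapter; a sketch (file names are made up for illustration):

```latex
% Master file sketch -- file names are made up for illustration.
% \include starts a new page and writes a separate .aux checkpoint
% for each file; \input simply reads the file inline.
\documentclass{report}
\begin{document}
\input{chap1}   % was: \include{chap1}
\input{chap2}   % was: \include{chap2}
\appendix
\input{app6}    % was: \include{app6}
\end{document}
```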

David

ega...@feathers.phys.unm.edu
Mar 31, 1998

>>Also you can \input each of the sections rather than \include them,
>>this can save quite a bit as latex no longer needs to save the
>>checkpoint information required to skip over non included sections.
>>

>Ah! Using \input instead of \include saved the day and I now get everything
>to come out right and with no errors.
>
>Merci Beaucoup!
>
>Elinor


Addendum:

Even using \input I come dangerously close to the strings buffer size of 3500,
so any info on how to increase that would be appreciated. I got one suggestion
that I need a more recent version of TeX and that buffer sizes were set at the
time of installation. Is this true? Where does one find a new version of LaTeX
for Solaris?

Elinor
ega...@unm.edu


Rebecca and Rowland
Apr 1, 1998

<ega...@feathers.phys.unm.edu> wrote:

You don't need to get a newer version of TeX - all you need to do is
build a new format file with the appropriate parameter set larger. The
documentation that came with LaTeX should tell you how to build a new
format file. What you need to do is modify the appropriate configuration
file and follow the instructions to build a new format.

Somewhere on your system there is a file (or files) used to set these values -
I don't know about Unix versions of TeX, but in case it's any use, the
config file I use for LaTeX under OzTeX on my Mac uses this set of
parameters:

% TeX parameters (the values in brackets show the possible ranges):

mem_max = 200000 (mem_top..100000000)
font_max = 256 (1..256)
font_mem_size = 80000 (8..100000000)
max_strings = 10000 (1300..16382)
string_vacancies = 50000 (0..pool_size-23500)
pool_size = 80000 (string_vacancies+23500..100000000)
buf_size = 3000 (120..32760)
stack_size = 600 (1..1600)
max_in_open = 20 (1..20)
param_size = 200 (1..8190)
nest_size = 150 (1..1000)
save_size = 2000 (1..4094)
trie_size = 21000 (was 16000 (cf 8000 for std LaTeX); 1..32760)
trie_op_size = 1000 (1..16382)
% %
% %
% % If you change any of the next four parameters then you'll need
% % to run INITEX and rebuild all your format files.
% % Some people will probably want to increase the hash_size value,
% % so here are some suitable hash_prime values:
% % hash_size = 3000 4000 5000 6000 10000 20000 31000
% % hash_prime = 2551 3407 4253 5101 8501 16993 26347
% %
mem_top = 200000 (mem_min+1100..mem_max)
hash_size = 10000 (325..31000)
hash_prime = 8501 (prime about 85% of hash_size)
hyph_size = 2551 (prime from 1..16382)


The trie_size is so large only because I've got UK and US English
hyphenation patterns installed.


Hope this helps

Rebecca and Rowland
Apr 1, 1998

David Carlisle <dav...@nag.co.uk> wrote:

> R & R wrote
> > >> ! TeX capacity exceeded, sorry [number of strings=3449].
>
> > Well over 90% of problems which result in TeX running out of memory are
> > in fact caused by an error in the document. Increasing the size of the
> > buffer can't help in such cases.
>
> Whilst that is true, I would guess that this is one of the 10%
> 3449 strings is not very much, that is only 3500 command names, file
> names, *cross-references in the .aux file*, etc.

Ah - I see. I didn't know cross-refs in the aux file counted. That
would explain it. Well, I didn't assume it was certainly an error in
the TeX code.

Rowland.

[snip]

David Carlisle
Apr 1, 1998


> You don't need to get a newer version of TeX - all you need to do is
> build a new format file with the appropriate parameter set larger.

Not wanting to start an OS war again:-) but you are showing a Mac bias:-)
web2c TeX has only had the possibility of increasing these sizes at
format time (as opposed to recompiling the web sources) since web2c7
at the beginning of last year, so quite possibly the original poster
does need to get a new web2c distribution.

David

David Carlisle
Apr 1, 1998

> Ah - I see. I didn't know cross-refs in the aux file counted.

They don't - while they are in the aux file - but when the aux file
is input at \begin{document} each of those \newlabel{foo}{{1}{2}}
commands is really \gdef\r@foo{{1}{2}} so takes up one csname per
\label.
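For concreteness, a line like the following ends up in the .aux file for each \label (label name and numbers made up):

```latex
% In document.aux, written by \label{eq:main} (illustrative values):
\newlabel{eq:main}{{2.1}{4}}
% Read back at \begin{document}, this effectively performs
%   \expandafter\gdef\csname r@eq:main\endcsname{{2.1}{4}}
% so every \label costs one control-sequence name, and hence one string.
```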

Timothy Murphy
Apr 1, 1998

ega...@feathers.phys.unm.edu () writes:

>There is no section of the document that causes the problem, each section
>will latex successfully on its own. The problem arises when I put it all
>together (where the final appendix appears to be the critical mass causing the
>error).

You could try printing the parts separately,
using \include{...}, perhaps with askinclude.sty.
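The standard mechanism behind this is \includeonly (askinclude.sty just asks for the list interactively at run time). A sketch with made-up file names:

```latex
\documentclass{report}
% Only the listed files are typeset; the skipped ones keep their page
% numbers and cross-references from the .aux files of earlier runs.
\includeonly{app6}
\begin{document}
\include{chap1}
\include{chap2}
\appendix
\include{app6}
\end{document}
```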


--
Timothy Murphy
e-mail: t...@maths.tcd.ie
tel: +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland

Timothy Murphy
Apr 1, 1998

real-addr...@rhu.barb.foobar (Rebecca and Rowland) writes:

>> Even using \input I come dangerously close to the strings buffer size of
>> 3500, so any info on how to increase that would be appreciated. I got one
>> suggestion that I need a more recent version of TeX and that buffer sizes
>> were set at the time of installation. Is this true? Where does one find a
>> new version of LaTeX for Solaris?

>You don't need to get a newer version of TeX - all you need to do is
>build a new format file with the appropriate parameter set larger.

Unfortunately that is not true of all TeX implementations --
only some allow dynamic array sizing of that kind.

The latest web2c has this (but needs compiling).
I don't think the current version of teTeX binaries has it,
but the next one, out shortly I think, should.
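For reference, in web2c 7 the array sizes are read from texmf.cnf when the format is built, so no recompiling is involved. A fragment (the variable names are web2c's; the values here are only illustrative, not recommendations):

```
% texmf.cnf fragment (web2c 7) -- illustrative values, not recommendations
main_memory = 263000   % words of main memory
max_strings = 25000    % maximum number of strings
pool_size   = 125000   % string characters
% After editing, rebuild the format, e.g.
%   initex latex.ltx
% and install the resulting latex.fmt where TeX looks for format files.
```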

Rebecca and Rowland
Apr 2, 1998

David Carlisle <dav...@nag.co.uk> wrote:

> > You don't need to get a newer version of TeX - all you need to do is
> > build a new format file with the appropriate parameter set larger.
>
> Not wanting to start an OS war again:-) but you are showing a Mac bias:-)

Well, it's all I know about, isn't it? I'm just trying to be helpful
(he said, meekly).

> web2c TeX has only had the possibility of increasing these sizes at
> format time (as opposed to recompiling the web sources) since web2c7
> at the beginning of last year, so quite possibly the original poster
> does need to get a new web2c distribution.

I see. Oh well - you live and learn, as they say.

Rowland.

Peter Schmitt
Apr 2, 1998

David Carlisle <dav...@nag.co.uk> writes:
>
>Not wanting to start an OS war again:-) but you are showing a Mac bias:-)
>web2c TeX has only had the possibility of increasing these sizes at
>format time (as opposed to recompiling the web sources) since web2c7
>at the beginning of last year, so quite possibly the original poster
>does need to get a new web2c distribution.
>
Even that remark shows a (unix) bias :-)
DOS implementations like emtex do not offer the possibility of
recompiling - they only offer options to initex.

Peter

Peter Schmitt
Apr 2, 1998

David Carlisle <dav...@nag.co.uk> writes:
>
>> >> ! TeX capacity exceeded, sorry [number of strings=3449].
>
>> Well over 90% of problems which result in TeX running out of memory are
>> in fact caused by an error in the document. Increasing the size of the
>> buffer can't help in such cases.
>
>Whilst that is true, I would guess that this is one of the 10%
>3449 strings is not very much, that is only 3500 command names, file
>names, *cross-references in the .aux file*, etc.
>
The number of cross references seems to cause problems rather often.
Perhaps LaTeX should offer an alternative -- less greedy --
implementation of cross references which does not use up a new
control sequence for every reference?
( and be less generous with TeX resources in general :-)

Peter

David Carlisle
Apr 2, 1998 (to Peter Schmitt)

> The number of cross references seems to cause problems rather often.
> Perhaps LaTeX should offer an alternative -- less greedy --
> implementation of cross references which does not use up a new
> control sequence for every reference?

It could, I'm sure you could implement it, but I'm not sure if it
would help.

Instead of storing ref/pageref information for
\label{aaa} and \label{bbb}

by

\def\r@aaa{{1}{2}}
\def\r@bbb{{3}{4}}

and so eat a csname per label it could instead store all the labels
in one property list

\def\labels@list{[aaa]{1}{2}[bbb]{3}{4}}

or some such and then define \ref to get the information out of the list
via parsing with a delimited argument.

But if people are running out of csnames because they have a lot of
references then this list would start to get ***long*** and it would
need to be repeatedly expanded and parsed so I suspect it would get
slow. Although this would save on csname usage it would increase the
token memory usage considerably as the names of the labels would be
stored as tokens so long label names would be particularly inefficient.
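A minimal sketch of the delimited-argument lookup described above (macro names hypothetical; no error handling):

```latex
\makeatletter
% All labels stored in one macro instead of one csname per label:
\def\labels@list{[aaa]{1}{2}[bbb]{3}{4}}
% \getref{bbb} expands to 3: a helper is defined on the fly whose first
% argument is delimited by "[bbb]", so everything before that label is
% thrown away and the next group is the \ref value.  The trailing
% [#1]{??}{??} provides a fallback when the label is missing.
\def\getref#1{%
  \def\getref@i##1[#1]##2##3##4\getref@end{##2}%
  \expandafter\getref@i\labels@list[#1]{??}{??}\getref@end}
\makeatother
```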

David

Robin Fairbairns
Apr 2, 1998

In article <17F2CD929...@AWIUNI11.EDVZ.UniVie.AC.AT>,
Peter Schmitt <A813...@AWIUNI11.EDVZ.UniVie.AC.AT> wrote:
>David Carlisle <dav...@nag.co.uk> writes:
>>
>>Not wanting to start an OS war again:-) but you are showing a Mac bias:-)
>>web2c TeX has only had the possibility of increasing these sizes at
>>format time (as opposed to recompiling the web sources) since web2c7
>>at the beginning of last year, so quite possibly the original poster
>>does need to get a new web2c distribution.
>
>Even that remark shows a ( unix ) bias :-)

rather little, in fact.

> DOS implementations like emtex do not offer the possibility of
> recompiling - they only offer options to initex.

current web2c provides exactly that. even in its djgpp incarnation
(for dos) it would be eccentric to recompile when all that was needed was a
changed parameter in the initial options.

emtex development seems to have stopped, so perhaps emtex users should
get hold of a new distribution too?[*]

there's really rather little between the distributions nowadays.

[*] not seriously, folks. but if you want to investigate people's
experiments with tex-for-the-future you do have to change. pdftex,
e-tex and omega aren't available within emtex and, as far as one can
tell, aren't going to be.
--
Robin (the beetle must go) Fairbairns r...@cl.cam.ac.uk
U of Cambridge Computer Lab, Pembroke St, Cambridge CB2 3QG, UK
Home page: http://www.cl.cam.ac.uk/users/rf/robin.html

Rebecca and Rowland
Apr 3, 1998

David Carlisle <dav...@nag.co.uk> wrote:
[snip]

> and so eat a csname per label it could instead store all the labels
> in one property list
>
> \def\labels@list{[aaa]{1}{2}[bbb]{3}{4}}
>
> or some such and then define \ref to get the information out of the list
> via parsing with a delimited argument.
>
> But if people are running out of csnames because they have a lot of
> references then this list would start to get ***long*** and it would
> need to be repeatedly expanded and parsed so I suspect it would get
> slow. Although this would save on csname usage it would increase the
> token memory usage considerably as the names of the labels would be
> stored as tokens so long label names would be particularly inefficient.

From what little I understand of these things, this is the sort of thing
that's done for option passing to class and package files.

In the case of my rmpage package (for setting page layout parameters and
available from CTAN) which has a *lot* of options, it's very slow
indeed.

Donald Arseneau
Apr 3, 1998

In article <1d6qnhf.1tq...@p5.nas1.is2.u-net.net>, real-addr...@rhu.barb.foobar (Rebecca and Rowland) writes...

><ega...@feathers.phys.unm.edu> wrote:
>> >> ! TeX capacity exceeded, sorry [number of strings=3449].
>> >> \IfFileExists #1#2#3->\openin \@inputcheck #1
>> >> \ifeof \@inputcheck \ifx \inpu...
>> >> l.10 \@input{app6.aux}
>> each section
>> will latex successfully on its own. The problem arises when I put it all
>> together
>> I have determined that in fact it is a memory problem.
>
>No you haven't - you can't be sure of this until you are sure that there
>is no mistake in the document. This has not been determined.

Gatesfeathers... er... HORSEFEATHERS!

3449 is just too few strings, especially with the profligate use LaTeX
and many packages put them to.

Increasing the memory parameters depends on the implementation, and may
be a simple run-time command switch, or it may mean a recompile of TeX.
Check the installation docs.

Donald Arseneau as...@triumf.ca

Rebecca and Rowland
Apr 4, 1998

Donald Arseneau <as...@erich.triumf.ca> wrote:

> In article <1d6qnhf.1tq...@p5.nas1.is2.u-net.net>,
> real-addr...@rhu.barb.foobar (Rebecca and Rowland) writes...

[snip]


> >No you haven't - you can't be sure of this until you are sure that there
> >is no mistake in the document. This has not been determined.
>
> Gatesfeathers... er... HORSEFEATHERS!
>
> 3449 is just too few strings, especially with the profligate use LaTeX
> and many poackages put them to.

Oh all right. Too many (La)TeX experts round here, that's the problem.

Rowland.

[snip]
