gmhwxi <gmh...@gmail.com> writes:
> I am not against alloca(1024) but I am against (1024*1024) :)
My general view is a mathematical one, I suppose. If the size of the
allocation has some natural upper bound, then unless it really is
very, very big, it is ‘small’. For example: an array allocated for
some matrix operation in LAPACK or GSL; one should not normally have
to use Fortran’s ALLOCATE just to do that, without good cause. The
rank of the matrix likely is known; and, even if it is not, the
algorithm is likely to put a sharp limit on the rank.
C99's variable-length arrays, then, are to me just the copying over to
C of a feature Fortran already had, and which helped make the language
much more pleasant to
use than it had been. Admittedly, in C this is less important, on
account of C having a long history of malloc, and not being used as
much for array algorithms.
A typical use of variable-size arrays in my C code would be something
like a routine to do basis conversion of a Bézier curve, or some other
polynomial operation. I do not know what the degree of the polynomials
will be; the most I have dealt with to date is about 9, but the future
may hold surprises. However, I do know that the degree cannot get very
large before there is no practical use. So the storage is ‘small’,
however much it happens to be.
If OTOH there is no natural limit -- as with a general text string --
then, even if one _expects_ it to be small, it is not ‘small’ in the
sense above. It needs a malloc.