
ksh: /usr/bin/rm: 0403-027 The parameter list is too long


mark

Mar 4, 2002, 3:50:20 PM
Is there a setting to increase the number of files that can be removed at one time?


Ton Voon

Mar 4, 2002, 5:08:32 PM
Mark,

I think the size can be changed in AIX 5L via smitty system. However,
you should consider using xargs. For example, instead of:

rm *
(ksh will expand * to all files in the current directory, thus blowing
the limit)

use:

ls | xargs rm
(where xargs splits the list of filenames read from stdin into chunks, so that
rm gets invoked several times, each time with an argument list short enough to fit)
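
If you want to see how the list will be split up before actually deleting anything, a harmless dry run (just a sketch, substituting echo for rm) is:

ls | xargs -n 50 echo rm -f

Each line of output shows one rm invocation with at most 50 filenames; drop the echo once the commands look right.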

Ton

In article <u87n9v...@corp.supernews.com>,

Brent Butchard

Mar 4, 2002, 5:11:39 PM
Not that I know of.

This would be because, I presume, you supplied something like rm *, which
produces too many arguments for the command to process in one command line
(remember that * is expanded by the shell before the command is executed).

If you want to remove all the files, drop to the parent directory and do a

rm -rf {mydirectory}
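
Note that this removes the directory itself along with its contents, so if the directory is still needed it has to be recreated afterwards (a sketch; the names are placeholders, and permissions/ownership may need restoring by hand):

cd /parent
rm -rf mydirectory
mkdir mydirectory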

or selectively delete a few at a time

rm -f [a-z]*
rm -f [0-9]*

Hope this is of help

Regards,

Brent Butchard

"mark" <ma...@amicusystems.com> wrote in message
news:u87n9v...@corp.supernews.com...

Adrian

Mar 5, 2002, 5:46:58 AM
What about removing the files in two steps?

cd /foo
find . -mtime -5 -exec rm {} \;
rm *

Read the man page for find; that should tell you what I was doing there. This
usually happens to me when there are too many files in the directory, so I
usually just view, or remove, the files by first listing them with the find command:
find . -name "*commonstring*" -ls
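
Once that listing shows only the files you expect, the same find can do the deleting (a sketch built from the two commands above):

find . -name "*commonstring*" -exec rm -f {} \;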

Anyway, that is probably something you didn't need to know.

Hope it helps,
Adrian


"Brent Butchard" <bbut...@ultradata.com.au> wrote in message
news:a60rg2$cgs$1...@perki.connect.com.au...

Hans-Joachim Ehlers

Mar 6, 2002, 5:26:47 PM
Hi Adrian,

"Adrian" <ahi...@SPAM.tpg.com.au> schrieb im Newsbeitrag
news:3c84...@dnews.tpgi.com.au...


> What about removing the files in two steps?
>
> cd /foo
> find . -mtime -5 -exec rm {} \;
> rm *
>
> Read the man page for find; that should tell you what I was doing there. This
> usually happens to me when there are too many files in the directory, so I
> usually just view, or remove, the files by first listing them with the find command:
> find . -name "*commonstring*" -ls
>

But find does traverse subdirectories, so you might get into trouble if you want
to delete files in only one directory.
My favorite command is "echo * | xargs -n10 rm".
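
If you do want to stick with find but keep it from descending into subdirectories, the portable idiom is -prune (a sketch; the *.tmp pattern is only an example, and I am assuming your find has no -maxdepth):

find . ! -name . -prune -name "*.tmp" -exec rm -f {} \;

The -prune stops find from recursing, so only entries directly in the current directory are matched.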

Hajo


Compuman

Mar 7, 2002, 9:19:08 AM
"Adrian" <ahi...@SPAM.tpg.com.au> wrote in message news:<3c84...@dnews.tpgi.com.au>...
Not to my knowledge. This problem is due to the shell expanding * to
all the files in the current directory. There is a maximum number of
bytes that can be contained in the argument list to any command (I am not
sure of the exact number!), but this is a feature of the shell you are
using (e.g. sh, ksh) and cannot be altered.
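
(If you are curious about the actual limit on your system, getconf should report it, assuming a POSIX getconf is installed:

getconf ARG_MAX

The value is the maximum number of bytes allowed for the argument list plus the environment passed to a new process.)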

Bela Gazdy

Mar 7, 2002, 9:38:04 AM
Compuman <suk...@zoom.co.uk> wrote:
C> "Adrian" <ahi...@SPAM.tpg.com.au> wrote in message news:<3c84...@dnews.tpgi.com.au>...
C> Not to my knowledge. This problem is due to the shell expanding * to
C> all the files in the current directory. There is a maximum number of
C> bytes that can be contained in the argument list to any command (I am not
C> sure of the exact number!), but this is a feature of the shell you are
C> using (e.g. sh, ksh) and cannot be altered.

Well, it's true, but xargs will get around it:
find . | xargs rm -f
or, if you have file names with 'spaces' in them:
find . | xargs -i rm -f {}
To see how it works:
find . | xargs -i echo "rm -f {}"
rmdir should work as well.
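
(If your xargs does not know -i, the POSIX spelling is -I; a sketch of the same thing:

find . | xargs -I {} rm -f -- {}

With -I, each line of input becomes one rm invocation, which is slower but copes with spaces in names.)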

--
Bela Gazdy, EUCLID/AIX Systems Support
Curmudgeon of Academic Hodge-Podge, ITD/ATG @ Emory
"...who shook his family tree, and a bunch of NUTS fell out."

Wladimir Mutel

Mar 10, 2002, 8:34:41 AM
In article <a6655k$bucjh$1...@ID-78836.news.dfncis.de> Hans-Joachim Ehlers wrote:

>> find . -name "*commonstring*" -ls
>>
>
> But find does traverse subdirectories, so you might get into trouble if you want
> to delete files in only one directory.
> My favorite command is "echo * | xargs -n10 rm".

'echo *' is not much better than 'rm *', since it gets the same
pattern expansion of '*'.

'ls | xargs rm' should be better.
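
If you need the pattern restriction as well, the glob can be replaced by filtering the listing (a sketch; the regular expression is just an example):

ls | grep '\.tmp$' | xargs rm -f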

Amarcoli

Mar 11, 2002, 11:59:21 AM
This works, no matter how many files you have in your directory.

for I in `ls -1`; do rm -f $I;done

NOTE: you can use metacharacters like * and ? (e.g. ls -1 file*.dat)

Bye.

"Wladimir Mutel" <m...@fluffy.isd.dp.ua> wrote in message
news:slrna8mo7...@fluffy.isd.dp.ua...

Hans-Joachim Ehlers

Mar 12, 2002, 1:02:06 PM
Wladimir Mutel <m...@fluffy.isd.dp.ua> wrote in message news:<slrna8mo7...@fluffy.isd.dp.ua>...

echo * is much better because you can say "echo *.tmp" or something
without getting the "parameter list too long" error, because "echo *"
does not produce a parameter list. It just echoes what the shell expands
for it.

Hajo

Gary R. Hook

Mar 13, 2002, 10:40:18 AM

Um, no, that's not correct. The shell will be limited to a certain
amount of workspace in expanding wildcards; it does this _before_
executing the command, and therefore the prior comment is correct.
If the shell runs out of room during the expansion, you get the error
message. The 'find' version is the one that will not be limited
to shell implementation details.

--
Gary R. Hook / AIX PartnerWorld for Developers / These opinions are MINE
________________________________________________________________________

Villy Kruse

Mar 14, 2002, 2:43:36 AM
On Wed, 13 Mar 2002 15:40:18 GMT,
Gary R. Hook <ho...@com.ibm.austin> wrote:


>Hans-Joachim Ehlers wrote:
>>
>> Wladimir Mutel <m...@fluffy.isd.dp.ua> wrote in message news:<slrna8mo7...@fluffy.isd.dp.ua>...
>> > In article <a6655k$bucjh$1...@ID-78836.news.dfncis.de> Hans-Joachim Ehlers wrote:
>> >
>> > >> find . -name "*commonstring*" -ls
>> > >>
>> > >
>> > > But find does traverse subdirectories, so you might get into trouble if you want
>> > > to delete files in only one directory.
>> > > My favorite command is "echo * | xargs -n10 rm".
>> >
>> > 'echo *' is not much better than 'rm *', since it gets the same
>> > pattern expansion of '*'.
>>
>> echo * is much better because you can say "echo *.tmp" or something
>> without getting the "parameter list too long" error, because "echo *"
>> does not produce a parameter list. It just echoes what the shell expands
>> for it.
>
>Um, no, that's not correct. The shell will be limited to a certain
>amount of workspace in expanding wildcards; it does this _before_
>executing the command, and therefore the prior comment is correct.
>If the shell runs out of room during the expansion, you get the error
>message. The 'find' version is the one that will not be limited
>to shell implementation details.
>

Does it make a difference if echo is a built-in, in which case the parameter
list won't be passed in the execve() system call, and is therefore not limited
by the kernel?

BTW, I wouldn't use echo in this way either.

Villy

Gary R. Hook

Mar 14, 2002, 11:27:06 AM
Villy Kruse wrote:
>
> Does it make a difference if echo is a built-in, in which case the parameter
> list won't be passed in the execve() system call, and is therefore not limited
> by the kernel?

Nope. The maximum command line length is what limits your workspace. Built-in
or not, there's a finite number of characters for wildcard expansion.

Damien Salvador

Mar 22, 2002, 2:37:59 PM
On Mon, 11 Mar 2002 16:59:21 -0000, Amarcoli
<amar...@clix.pt> wrote:

>This works, no matter how many files you have in your directory.
>
> for I in `ls -1`; do rm -f $I;done
>
>NOTE: you can use metacharacters like * and ? (e.g. ls -1 file*.dat)
>

I'm not sure about AIX, but under Linux (and I'm using the same shell, zsh, so
it could behave the same) the backquote substitution, when too long, generates
an error.
The solution was:

ls | while read myfile
do
rm -f $myfile
done;

George Baltz

Mar 22, 2002, 2:54:01 PM

Yet another suggestion for hardening against funky filenames
> rm -f $myfile
rm -f -- "$myfile"
> done;

The "--" says no more options, and the quotes take care of whitespace in names.

--
George Baltz N3GB
Computer Sciences Corp / @NOAA/NESDIS/IPD / Suitland, MD 20746
Rule of thumb: ANYthing offered by unsolicited email is a hoax, ripoff, scam or outright fraud.

Wladimir Mutel

Mar 26, 2002, 6:33:36 AM
In article <slrna9n20l.fr3....@zen.via.ecp.fr> Damien Salvador wrote:

>>This works, no matter how many files you have in your directory.

>> for I in `ls -1`; do rm -f $I;done

>>NOTE: you can use metacharacters like * and ? (e.g. ls -1 file*.dat)

> I'm not sure about AIX, but under Linux (and I'm using the same shell, zsh, so
> it could behave the same) the backquote substitution, when too long, generates
> an error.
> The solution was:

> ls | while read myfile
> do
> rm -f $myfile
> done;

Or maybe

ls | xargs rm -f --
