Large number of files in single directory
Chris
redhat-list at dotcomdesigners.com
Wed May 25 17:18:47 UTC 2005
There seems to be a limit on most flavors of Linux I've worked on regarding the
maximum number of files in a single directory before tools like tar, gzip, rm,
mv, and cp stop working properly. For example, I have some users with 2000+
files in a single directory (some with as many as 10,000 files), and trying to
tar those directories always fails with "Argument list too long."
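(As far as I can tell, the ceiling is not in the filesystem itself but in the
kernel's per-command argument-size limit: when the shell expands a wildcard
like "tar czf backup.tar.gz *" into thousands of filenames, the resulting
command line can exceed that limit. A rough way to check it:)

```shell
# The kernel's maximum combined size (in bytes) of command-line arguments
# plus environment for a single command; exceeding it yields
# "Argument list too long". The exact value varies by kernel/distro.
getconf ARG_MAX
```
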
Is there a way for tar and these other tools to "see" all these files and
process them normally? I recall once having to resort to something like
"find . -print | xargs rm -fr" to remove thousands of files from a single
directory. Is doing something similar with "tar" in place of "rm" the only way
to make this work, or does tar have some sort of command-line switch
(I couldn't find one) for handling extremely long argument lists?
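(Two approaches that should avoid the limit entirely; the directory name
"bigdir" and the archive names are just placeholders. GNU tar's -T/--files-from
option reads the file list from a file, or from stdin with "-", so the names
never appear on the command line; alternatively, xargs with tar's "r" append
mode keeps earlier batches when xargs has to split the list across several tar
invocations.)

```shell
# Option 1: feed the file list to GNU tar on stdin via -T -
find bigdir -type f -print | tar cf backup.tar -T -

# Option 2: batch with xargs; "r" appends, so multiple tar runs accumulate.
# -print0/-0 keeps filenames with spaces intact.
find bigdir -type f -print0 | xargs -0 tar rf backup2.tar
```

Note that append mode ("r") does not work on compressed archives, so with the
xargs variant you would compress afterwards, e.g. "gzip backup2.tar".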
Chris