Couple of questions for you all:
* What's the maximum number of files/folders that can exist in any given
folder (on NTFS, NT4)?
* Ignoring that maximum, is there a practical limit on the number of
files in a folder, above which performance starts to degrade?
Thanks in advance,
Paul.
One thing I have noticed is that if you create a folder with many thousands
of files and then delete them all, Windows doesn't shrink the data
structures it uses for the directory index, and this can create some very
strange performance side effects as you continue to add and delete files.
I guess what I am saying is that if you plan to have 10,000 static files
and not change the contents dramatically every day, you are probably okay.

But if you are going to have 10,000 files, delete them, create 10,000 new
files, delete them, and so on, the index structure that NTFS maintains for
the directory will keep growing, and maybe fragmenting, and that will
eventually affect performance. We found that simply deleting the directory
and recreating it reset things; there is a rough sketch of that below.

Knowing whether your application's file population is static or constantly
churning is important to making efficient use of NTFS.
Our experimentation was with NTFS under Windows NT 4; the newer NTFS in
Windows 2000 may have changed things quite a bit.
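
For what it's worth, here is a minimal sketch of that delete-and-recreate
reset, written in Python and assuming a path on an NTFS volume. The path,
the file counts, and the helper names (churn, time_creates) are all made
up for illustration; only the recreate trick itself comes from our testing.

import os
import shutil
import time

def churn(path, n=10000):
    # Create and then delete n files. The folder ends up empty, but
    # NTFS keeps the grown on-disk directory index.
    for i in range(n):
        with open(os.path.join(path, f"f{i:05d}.tmp"), "w"):
            pass
    for i in range(n):
        os.remove(os.path.join(path, f"f{i:05d}.tmp"))

def time_creates(path, n=1000):
    # Time how long creating n files takes, then clean them up.
    start = time.time()
    for i in range(n):
        with open(os.path.join(path, f"probe{i:05d}.tmp"), "w"):
            pass
    elapsed = time.time() - start
    for i in range(n):
        os.remove(os.path.join(path, f"probe{i:05d}.tmp"))
    return elapsed

work = r"C:\temp\ntfs_churn_test"   # assumed path on an NTFS volume
os.makedirs(work, exist_ok=True)

churn(work)                                 # grow the directory index
print(f"after churn:    {time_creates(work):.3f}s")

shutil.rmtree(work)                         # the reset: drop the folder...
os.mkdir(work)                              # ...and recreate it fresh
print(f"after recreate: {time_creates(work):.3f}s")

Any timing difference should be most visible on the original NT 4 NTFS; as
noted above, newer versions may behave differently.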
--
Will
NOTE: To reply, CHANGE the username to westes AT uscsw.com
"Paul Moss" <pa...@9000i.com> wrote in message
news:94k9uj$g4l$1...@taliesin2.netcom.net.uk...