nob...@nowhere.com
Feb 11, 2024, 12:05:03 PM
When you search for a file it does a depth-first search instead of a
breadth-first search.
So you've got a directory with 12 subdirectories in it, each of which
contains a few hundred or thousand sub-subdirectories containing tens or
hundreds of thousands of files. And one of those sub-subdirectories is
the one you want.
A breadth-first search does exactly what you want. But with a depth-first
search, you have to wait for it to trawl through every single one
of the millions of files and directories in SUBDIR1 before it will even
think about looking in SUBDIR2. And so on.
Did anyone at M$ ever actually *think through* a design decision? As
in, work out what is the best way to do something for the most likely
use cases?
I don't know which of these three possibilities is more likely, or
more depressing:
1. "Depth first is more efficient; it uses a smaller stack."
I can see that being a consideration in 1980. But 1980 was a long time
ago. Maybe keep your software up to date? You're charging enough for it
to make that feasible.
2. "It makes no difference which one you use; nobody has millions of
files in subdirectories anyway."
Nobody who is stupid enough to use Winblows for mission critical work,
anyway. At least I persuaded $ORK[-n] to abandon Winblows Servuar. :|||
3. "Duh, there are different kinds of search?"
The word "Bingo!" springs unbidden. I hope not. :|