
fastest way to do a recursive file count like explorer


Frank

Jul 14, 2008, 2:46:03 PM
Hi,

I have a need to do a recursive file count on a directory that may have a
number of subdirs up to 86,400. In the subdirs, I have over 2 million files.
When I do a "properties" to a directory in the explorer, the results appear
in about 2 minutes. When I do a (dir -r | where ($_.mode -notmatch
"d")).length, it ran for about 5 minutes before I did a ctrl-c. Is there
anyway in Powershell to mimic the speed of explorer? I need this to run in
script.

Thanks in advance,


Kiron

Jul 14, 2008, 3:22:30 PM
$dir = gi .                                    # DirectoryInfo for the current directory
$files = $dir.getFiles('*','allDirectories')   # let .NET recurse, no pipeline involved
$files.count
# free resources
rv dir, files
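
One thing to watch with getFiles('*','allDirectories'): if any subfolder is unreadable it will typically throw and you lose the whole count. A rough, untested sketch that walks the tree itself and skips folders it can't read (assumes try/catch, i.e. PowerShell v2 or later; $root is just a placeholder for the directory to count):

$root = gi .
$count = 0
$stack = New-Object System.Collections.Stack
$stack.Push($root)
while ($stack.Count -gt 0) {
    $d = $stack.Pop()
    try {
        # count files directly in this folder, then queue its subfolders
        $count += $d.GetFiles().Count
        foreach ($sub in $d.GetDirectories()) { $stack.Push($sub) }
    } catch [System.UnauthorizedAccessException] {
        # skip folders we are not allowed to read
    }
}
$count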
 
--
Kiron

Frank

Jul 14, 2008, 5:36:02 PM
Hi Kiron,

Although this works, it still appears to be slower. For the same
directory, the PS script above took about 5 min 30 sec, while the Explorer way
took 3 min 10 sec for 3.4 million files. Is there any PowerShell console
tuning that can be done?

Thanks!

Shay Levy [MVP]

Jul 14, 2008, 5:53:25 PM

(dir -n -r | where {!$_.PSIsContainer}).length


---
Shay Levy
Windows PowerShell MVP
blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic

F> Hi,
F>
F> I have a need to do a recursive file count on a directory that may
F> have a
F> number of subdirs up to 86,400. In the subdirs, I have over 2
F> million files.
F> When I do a "properties" to a directory in the explorer, the results
F> appear
F> in about 2 minutes. When I do a (dir -r | where ($_.mode -notmatch
F> "d")).length, it ran for about 5 minutes before I did a ctrl-c. Is
F> there anyway in Powershell to mimic the speed of explorer? I need
F> this to run in script.
F>
F> Thanks in advance,
F>


Shay Levy [MVP]

Jul 14, 2008, 6:00:39 PM

Actually, it works faster without -n.


---
Shay Levy
Windows PowerShell MVP
blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic

SL> (dir -n -r | where {!$_.PSIsContainer}).length
SL>
SL> ---
SL> Shay Levy
SL> Windows PowerShell MVP
SL> blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic


F>> Hi,
F>>
F>> I have a need to do a recursive file count on a directory that may
F>> have a
F>> number of subdirs up to 86,400. In the subdirs, I have over 2
F>> million files.
F>> When I do a "properties" to a directory in the explorer, the results
F>> appear
F>> in about 2 minutes. When I do a (dir -r | where ($_.mode -notmatch
F>> "d")).length, it ran for about 5 minutes before I did a ctrl-c. Is
F>> there anyway in Powershell to mimic the speed of explorer? I need
F>> this to run in script.

F>> Thanks in advance,
F>>


Robert Aldwinckle

Jul 14, 2008, 10:41:55 PM
"Frank" <Fr...@discussions.microsoft.com> wrote in message
news:0F705D9A-637C-47DD...@microsoft.com...


Then why not use cmd.exe to do the grunt work?

E.g. just adapt Shay Levy's solution from an earlier thread

news:95d808932e0178...@news.microsoft.com
Date: Sat, 31 May 2008 22:35:00 +0000 (UTC)
Subject: Re: Query shared folder free space

$dirs = & cmd /c dir /s
$dirs[-2]
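
The idea is that the second-to-last line of the dir /s output is the
"n File(s)" line from the "Total Files Listed:" summary. An untested sketch
that pulls just the number out ($path is a placeholder for the directory,
/a-d limits the listing to files only, and the match is written loosely
because the thousands separator depends on locale):

$lines = cmd /c dir /s /a-d $path
if ($lines[-2] -match '([\d,.]+) File\(s\)') {
    [long]($matches[1] -replace '[,.]')
}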


>
> Thanks in advance,


HTH

Robert Aldwinckle
---


Martin Zugec

Jul 15, 2008, 7:24:38 AM
Just out of curiosity, can you try the following:

PS C:\Links\HDDs\DevHDD\WorkingProjects\S4Maintenance>
[System.IO.Directory]::GetFiles("C:\Windows", "*.*", [System.IO.SearchOption]::AllDirectories) | Measure-Object

It should return the files as strings, so it should be faster.

Let me know if it helped (ideally with some benchmarks ;))

Martin

"Frank" <Fr...@discussions.microsoft.com> wrote in message
news:0F705D9A-637C-47DD...@microsoft.com...

Robert Aldwinckle

Jul 15, 2008, 2:51:34 PM
"Frank" <Fr...@discussions.microsoft.com> wrote in message
news:0F705D9A-637C-47DD...@microsoft.com...


In that same thread I mentioned in my previous reply,
FloweringWeeds gave the idea of using LogParser
to do the parsing. However, I noticed that LogParser
can actually do the counting (or summing) as well.
In case that tool offers some optimization,
e.g. something that avoids building the concatenated listing
and formatting of all files and their attributes while counting them,
you could try it too.

FWIW here is my attempt with that.

PS F:\Program Files\Log Parser 2.2>
.\LogParser.exe 'select count(ALL Name) from "\*" where Attributes NOT LIKE ''%D%''' -i:FS -e:1

If you don't care about differentiating files from directories
in your count you could omit the where clause and avoid
its pattern matching. I'm sure that a combined count
would be produced much quicker than one which included
such overhead.
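
For what it's worth, that simplified form (the same command with the where
clause dropped, untested here) would look like:

PS F:\Program Files\Log Parser 2.2>
.\LogParser.exe 'select count(ALL Name) from "\*"' -i:FS -e:1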


>
> Thanks in advance,
>
>


HTH

Robert
---


Frank

Jul 15, 2008, 11:38:35 PM
Hi Martin,

For 3,536,991 files, 70,260 folders, it took explorer around 20 minutes and
the script below around 27 minutes.

Joel (Jaykul) Bennett

Jul 16, 2008, 12:40:39 AM
On Jul 15, 11:38 pm, Frank <Fr...@discussions.microsoft.com> wrote:
> Hi Martin,
>
> For 3,536,991 files, 70,260 folders, it took explorer around 20 minutes and
> the script below around 27 minutes.
>

I hate to say it, but that doesn't sound unreasonable considering the
second one actually lists and pipes all the file names. When it comes
to speed, the pipeline isn't really your friend here. The best thing
you could do in .NET would be to change it to this:

[System.IO.Directory]::GetFiles("C:\Windows", "*.*", [System.IO.SearchOption]::AllDirectories).Count

On my system, a folder that takes 1:08 with that Measure-Object script
drops to 3 seconds using .Count.
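
Since Martin asked for benchmarks, Measure-Command is an easy way to time
the two variants side by side (untested sketch; C:\Windows is just the
example path from above):

(Measure-Command {
    [System.IO.Directory]::GetFiles("C:\Windows", "*.*",
        [System.IO.SearchOption]::AllDirectories) | Measure-Object
}).TotalSeconds

(Measure-Command {
    [System.IO.Directory]::GetFiles("C:\Windows", "*.*",
        [System.IO.SearchOption]::AllDirectories).Count
}).TotalSeconds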

Martin Zugec

Jul 16, 2008, 4:40:21 AM
Hi Frank,

And compared to the other solutions proposed before?

Martin

"Frank" <Fr...@discussions.microsoft.com> wrote in message

news:4F626794-DFE6-4320...@microsoft.com...
