PS G:\> gci -fi *.vcproj -fi *.dsp -re
Get-ChildItem : Cannot bind parameter because parameter 'Filter' is specified more than once. To provide multiple values to parameters that can accept multiple values, use the array syntax. For example, "-parameter value1,value2,value3".
At line:1 char:21
+ gci -fi *.vcproj -fi <<<< *.dsp -re
PS G:\> gci -fi *.vcproj,*.dsp -re
Get-ChildItem : Cannot convert 'System.Object[]' to the type 'System.String' required by parameter 'Filter'. Specified method is not supported.
At line:1 char:8
Hi Calin,

Use the -include parameter instead (each extension is delimited by a comma):
gci -include *.vcproj,*.dsp -re
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
Just a note.. there is a performance difference in using filters vs using -include and -exclude.

from get-help get-childitem -full
-filter <string>
Specifies a filter in the provider's format or language. The value of this
parameter qualifies the Path parameter. The syntax of the filter, including
the use of wildcards, depends on the provider. Filters are more efficient
than other parameters, because the provider applies them when retrieving
the objects, rather than having Windows PowerShell filter the objects after
they are retrieved.
Brandon Shell
---------------
Blog: http://www.bsonposh.com/
PSH Scripts Project: www.codeplex.com/psobject
Yes, but what are the options with regard to multiple file extensions?

Here are a few I can think of; the first one is faster then the other two:
dir -r -include *.vcproj,*.dsp
dir -r | where {$_.extension -match "vcproj|dsp"}
dir -r | where {$_.extension -eq ".vcproj" -OR $_.extension -eq ".dsp"}
dir -r | where {$_.extension -like "*vcproj" -OR $_.extension -like "*dsp"}
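The four alternatives above all express the same extension test in three different shapes: a wildcard match, a regex match, and an exact comparison. Here is a minimal sketch of those shapes in Python, purely for illustration - PowerShell's -like and -match have their own semantics, and the file names are invented for the example:

```python
import fnmatch
import os
import re

files = ["app.vcproj", "legacy.dsp", "notes.txt", "old.dsp.bak"]

# Wildcard match: the shape of -like / -include *.vcproj,*.dsp
by_wildcard = [f for f in files
               if fnmatch.fnmatch(f, "*.vcproj") or fnmatch.fnmatch(f, "*.dsp")]

# Regex match: the shape of $_.extension -match "vcproj|dsp". Anchoring
# with $ matters here: an unanchored search over the whole name would
# also let "old.dsp.bak" slip through.
by_regex = [f for f in files if re.search(r"\.(vcproj|dsp)$", f)]

# Exact extension comparison: the shape of $_.extension -eq ".vcproj" -OR ...
by_exact = [f for f in files if os.path.splitext(f)[1] in (".vcproj", ".dsp")]
```

All three produce the same two matches here; the practical difference in PowerShell is where the test runs (provider-side versus in the pipeline), which is what the rest of the thread measures.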
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
B> Just a note.. there is a performance difference in using filters vs
B> using -include and -exclude.
I was just making a generic point that most people overlook :)

IMO... -filter should take a RegEx and not a wildcard.
Brandon Shell
---------------
Blog: http://www.bsonposh.com/
PSH Scripts Project: www.codeplex.com/psobject
S> Yes, but what are the options with regard to multiple file
S> extensions?
typo, the 'faster then the other two'

should be:

'faster than the others'
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
In fact, I would prefer a scriptBlock where you can decide what operators
you want to use.
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
B> I was just making a generic point that most people overlook :)
Brandon Shell
---------------
Blog: http://www.bsonposh.com/
PSH Scripts Project: www.codeplex.com/psobject
S> typo, the 'faster then the other two'
S>
S> should be:
S>
S> 'faster then the others'
AMD x64 5200+, 4 GiB RAM, Vista x64. I ran gci -Recurse with and without filtering on a directory structure containing over 46,000 items, capturing the output into a variable so that scrolling would not affect the time to run:
No filtering whatsoever, averages about 6.2 seconds.
No filtering, but using -Include *: about 10.4 seconds.
No filtering, using -Filter *: about 6.2 seconds.
No filtering, but passing output through ?{$_.Name -like "*"}: about 15
seconds.
Filtering down to about 39,400 files using -Include *.txt, averaging about
9.8 seconds.
Filtering down the same way using -Filter *.txt: about 5.5 seconds
Filtering down to only 3300 items using -Include *.jpg: about 6.5 seconds.
Filtering down to only 3300 items using -Filter *.jpg: about 1.7 seconds
Filtering to no items using -Include *.badextension: 6.3 seconds.
Filtering down to no items using -Filter *.badextension: 1.3 seconds.
Filtering down to no items using ?{$_.Name -like "*.badextension"}: 15
seconds
Filtering down to no items using ?{$_.Name -like "*"}: about 15 seconds.
Here are a few conclusions and rules of thumb - anyone care to comment on
how sensible each one is?
(1) If all desired item names match a single wildcard that can be expressed
using -Filter, it is much more efficient to use -Filter. The smaller the
fraction of searched files that will likely match the expression, the
greater the speed of the operation, down to a bare minimum needed to perform
the querying operations even if there are no matches. However, no matter
what you're doing, using -Filter to reduce the set size is likely to
dramatically improve performance in proportion to how many items are
filtered out.
(2) Filtering using -Include will ALWAYS take more time than performing no
filtering whatsoever. However, the fewer the items that match, the closer
the time to run is to the raw unfiltered output.
(3) Post-retrieval filtering time doesn't seem to be affected at all by how
many items are rejected versus passed.
(4) A general rule of thumb for best performance:
If you will be filtering items down to a set that matches a single wildcard
pattern, use -Filter. If you need multiple matches, -Include is preferable
to using a post-enumeration Where-Object filter. General order of preference
for speed: -Filter, -Include/-Exclude, then Where-Object.
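The rules of thumb above come down to one idea: rejecting an item by name before building an object for it is cheap, while building the object first and discarding it afterwards is not. A toy Python sketch of that cost model follows - it is not a model of the actual FileSystem provider, and expensive_item is a hypothetical stand-in for FileInfo construction:

```python
import fnmatch
import os
import tempfile

# Build a small throwaway tree: 50 .txt files and 5 .jpg files.
root = tempfile.mkdtemp()
for i in range(50):
    open(os.path.join(root, f"doc{i}.txt"), "w").close()
for i in range(5):
    open(os.path.join(root, f"img{i}.jpg"), "w").close()

materialized = 0

def expensive_item(path):
    # Stand-in for building a full FileInfo-style object per item.
    global materialized
    materialized += 1
    return {"path": path, "size": os.stat(path).st_size}

# Late filtering (the -Include / Where-Object shape): materialize every
# item, then discard the non-matches.
late = [o for o in (expensive_item(os.path.join(root, n)) for n in os.listdir(root))
        if fnmatch.fnmatch(o["path"], "*.jpg")]
late_cost = materialized

# Early filtering (the -Filter shape): test the name first, and pay for
# an object only when it matches.
materialized = 0
early = [expensive_item(os.path.join(root, n)) for n in os.listdir(root)
         if fnmatch.fnmatch(n, "*.jpg")]
early_cost = materialized
```

Both strategies return the same five items, but the late variant pays the per-item cost 55 times and the early variant only 5 times - the same proportionality Alex's timings show.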
"Brandon Shell [MVP]" <a_bshe...@hotmail.com> wrote in message
news:29d4f64667838...@nn.bloomberg.com...
I don't consider -filter a bug; it's just not built to accept multiple wildcards, so there's nothing to fix in v2.
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
CI> On a few posts below, Alex tests the speed of different options.
CI> What you are suggesting is a workaround. I knew of the -include
CI> switch. Do you consider the -filter a bug and do you expect it to be
CI> fixed with PowerShell 2.0?
CI>
-Filter <System.String>
Specifies filter elements as required and supported by providers
Parameter required? false
Parameter position? 2
Parameter type System.String
Default value
Accept multiple values? false
Accepts pipeline input? false
Accepts wildcard characters? false
which shows the input type to be a string. The -exclude parameter accepts an
array of strings:
-Exclude <System.String[]>
Specifies those items upon which the Cmdlet is not to act and
include all others.
Parameter required? false
Parameter position? named
Parameter type System.String[]
Default value
Accept multiple values? true
Accepts pipeline input? false
Accepts wildcard characters? true
PS > dir -include *.txt -include *.rar
Get-ChildItem : Cannot bind parameter because parameter 'Include' is specified more than once. To provide multiple values to parameters that can accept multiple values, use the array syntax. For example, "-parameter value1,value2,value3".
At line:1 char:28
+ dir -include *.txt -include <<<< *.rar
And if you try to follow the "-parameter value1,value2,value3" notation, you'll get another error saying that the specified method is not supported:
PS > dir -filter *.txt,*.rar
Get-ChildItem : Cannot convert 'System.Object[]' to the type 'System.String' required by parameter 'Filter'. Specified method is not supported.
At line:1 char:12
+ dir -filter <<<< *.txt,*.rar
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
CI> The error message tells us to
CI> "
CI> To provide multiple values to parameters that can
CI> accept multiple values, use the array syntax.
CI> "
CI> Either this message should not be displayed for -filter or the
CI> -filter
CI> should allow multiple filters. It is important that PowerShell gets
CI> this
CI> right because this and other solutions would encourage more people
CI> to adopt.
CI> Given the speed benefits, this should be considered.
I don't have a definitive answer why -filter does not accept multiple values,
as I said earlier - I think it should.
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
CI> This is clear; -fi twice or -fi x,y does not work because the type
CI> of the
CI> parameter is System.String. Why not make it array of strings to
CI> allow
CI> multiple filters? If it has no performance benefit, then maybe it
CI> should not
CI> be defined. One of the things that I like about gci is that it feels
CI> more
CI> natural with other commands. On the cmd syntax I could specify
CI> different
CI> filters like this:
CI> for /r %f in (*.vcproj, *.dsp) do @ find "_CONSOLE" %f
CI> If you have a quick answer, I would appreciate it.
Just to clarify, the reason that Filter is single-valued for Get-ChildItem use on the filesystem is that the Filter parameter is opportunistic. The Filter parameter is essentially an expansion slot that a provider writer can plug into if the underlying data store can make special use of it. In the case of the FileSystem provider, the low-level underlying APIs had built-in support for MS-DOS style wildcard filtering, so it was exposed for people to use with Get-ChildItem. Although a different filter could be built into the FileSystem provider, it wouldn't be nearly as fast as the current one, which probably just passes the Filter string straight down to the FindFirstFile() API call.

The Get-WmiObject Filter parameter is conceptually identical in
implementation: Filter is written as WMI Where clause (just without the
initial WHERE).
Now, there still _is_ one annoyance here. For large collections of files and
folders, using -Include/-Exclude or a following Where-Object statement can
be extremely slow - for example, when I get above 10,000 or so files,
there's an appreciable wait for results, and of course pure
include/exclude/where filtering speed is dependent on the number of files
handled, not the number passed.
If that happens to be a problem, there's a way to speed things up
dramatically. It may look strange, but it's perfectly acceptable to do
within PowerShell and has high forward and backward compatibility. The
primary performance cost comes from transforming filesystem items into
objects, so a technique that reduces the count of items and then keeps them
as strings will work wonders for speed. So you can do this:
$f = cmd /c 'dir /s /b g:\ | findstr /r "\.vcproj$ \.dsp$"'
Using the cmd.exe directory listing and findstr may look like cheating, but
it works quite well, and one of the points of PowerShell is making it easy
to glue in odd bits like that when necessary; I do the same thing with FTYPE
and ASSOC in PowerShell. Furthermore, the strings coming back are very easy
to cast into file items if necessary.
"Calin Iaru" <Cali...@discussions.microsoft.com> wrote in message
news:7035CF9E-1177-4C56...@microsoft.com...
I find that using -include is faster (16,739 files, 2,442 folders), plus the result is rich .NET objects.
PS > measure-command {dir c:\windows -r -inc *.txt,*.log}
(...)
Seconds : 1
Milliseconds : 654
Ticks : 16543306
TotalDays : 1.91473449074074E-05
TotalHours : 0.000459536277777778
TotalMinutes : 0.0275721766666667
TotalSeconds : 1.6543306
TotalMilliseconds : 1654.3306
PS > measure-command {cmd /c dir /s /b c:\windows | findstr /r "\.txt$ \.log$"}
(...)
Seconds : 5
Milliseconds : 250
Ticks : 52502958
TotalDays : 6.07673125E-05
TotalHours : 0.0014584155
TotalMinutes : 0.08750493
TotalSeconds : 5.2502958
TotalMilliseconds : 5250.2958
---
Shay Levy
Windows PowerShell MVP
http://blogs.microsoft.co.il/blogs/ScriptFanatic
A> Just to clarify, the reason that Filter is single-valued for
A> Get-ChildItem use on the filesystem is that the Filter parameter is
A> opportunistic. The Filter parameter is essentially an expansion slot
A> that a provider writer can plug into if the underlying data store can
A> make special use of it. In the case of the FileSystem provider, the
A> low-level underlying APIs had built-in support for MS-DOS style
A> wildcard filtering, so it was exposed for people to use with
A> Get-ChildItem. Although a different filter could be built into the
A> FileSystem Provider, it wouldn't be nearly as fast and would be
A> significantly slower than the current one, which probably just passes
A> the Filter string straight down to the FindFirstFile() API call.
Did you perform multiple tests? With mine, I actually captured the data so rendering was not involved, and did multiple passes each way since initial caching seemed to affect performance. On the first pass over the documents with each method, they took substantially longer.

Also, are you using 32-bit Windows? I used the 64-bit cmd.exe and PowerShell
on Vista. I now also seem to recall that there could be a problem with the
64-bit assemblies not getting ngen'd correctly, which would explain the
substantially longer time for my code to run. I probably need to confirm
that the ngen process worked and then try the same tests with both the 32
and 64-bit versions of Cmd.exe and PowerShell to know for sure how things
work on my system.
"Shay Levy [MVP]" <n...@addre.ss> wrote in message
news:89228ed2383d38...@news.microsoft.com...
I found the cause of the slower processing. You unwrapped the pipeline; it's the difference between this:

$fc = 'cmd /c dir c:\windows /b /s | findstr /r "\.txt$ .\log$" '
and this:
$fc = cmd /c dir /s /b c:\windows | findstr /r "\.txt$ \.log$"
The external single-quotes ensure that the piping into findstr is part of
cmd.exe's job, rather than PowerShell's, in the interest of speeding up the
initial processing.
Your test also confirms something else I suspected but wasn't sure about.
Once you've created pipeline objects, processing them with non-PowerShell
tools probably doesn't give any significant performance benefits, simply
because of the cost of flattening them into textstream objects and then
reconstituting them.
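The same quoting distinction exists outside PowerShell: either the child shell owns the whole pipeline, or the host captures everything and filters afterwards. A small Python sketch of the two shapes (assumes a POSIX sh is available; the file names are made up):

```python
import subprocess

# Variant 1: the child shell owns the whole pipeline (like quoting the
# command so cmd.exe runs `dir | findstr` itself) - only matching lines
# ever cross into the host process.
out = subprocess.run(
    "printf 'a.txt\\nb.log\\nc.jpg\\n' | grep '\\.log$'",
    shell=True, capture_output=True, text=True,
).stdout.splitlines()

# Variant 2: the host captures everything and filters afterwards (like
# letting PowerShell do the piping) - every line is shipped across the
# process boundary and examined here.
raw = subprocess.run(
    "printf 'a.txt\\nb.log\\nc.jpg\\n'",
    shell=True, capture_output=True, text=True,
).stdout.splitlines()
filtered = [line for line in raw if line.endswith(".log")]
```

Both variants end up with the same matches; the difference is how much data crosses between processes before the filter runs, which is where Alex locates the speed gap.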
"Shay Levy [MVP]" <n...@addre.ss> wrote in message
news:89228ed2383d38...@news.microsoft.com...
>
> PS > measure-command {cmd /c dir /s /b c:\windows | findstr /r "\.txt$
> \.log$"}
---
Shay Levy
Windows PowerShell MVP
blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic
A> Did you perform multiple tests? With mine, I actually captured the
A> data so rendering was not involved, and did multiple passes each way
A> since initial caching seemed to affect performance. On the first pass
A> over the documents with each method, they took substantially longer.
I removed the outer quotes cause otherwise the command is just a string; PowerShell won't execute it (unless invoked):

PS > $fc = 'cmd /c dir c:\windows /b /s | findstr /r "\.txt$ .\log$" '
PS > $fc
cmd /c dir c:\windows /b /s | findstr /r "\.txt$ .\log$"
PS > measure-command {$fc = dir c:\windows -r -inc *.txt,*.log}
(...)
Seconds : 32
Milliseconds : 167
Ticks : 321679677
TotalDays : 0.000372314440972222
TotalHours : 0.00893554658333333
TotalMinutes : 0.536132795
TotalSeconds : 32.1679677
TotalMilliseconds : 32167.9677
PS > measure-command {$fc = iex 'cmd /c dir c:\windows /b /s | findstr /r
"\.txt$ .\log$" '}
(...)
Seconds : 25
Milliseconds : 29
Ticks : 250297940
TotalDays : 0.000289696689814815
TotalHours : 0.00695272055555556
TotalMinutes : 0.417163233333333
TotalSeconds : 25.029794
TotalMilliseconds : 25029.794
When invoked, the cmd expression is faster, but it leaves you with raw strings that you need to parse/concat and then run through get-childitem to turn into full .NET objects - and all of that can eventually eat the precious seconds we are trying to save. Which one is better or more elegant? That's for the readers to decide :)
---
Shay Levy
Windows PowerShell MVP
blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic
A> I found the cause of the slower processing. You unwrapped the
A> pipeline; it's the difference between this
A>
A> $fc = 'cmd /c dir c:\windows /b /s | findstr /r "\.txt$ .\log$" '
> can be extremely slow - for example, when I
> get above 10,000 or so files,
Mmm (local or remote)
file system data parsing?
LogParser.exe -h -i:fs
Input format: FS (FileSystem properties)
Returns properties of files and folders
Perhaps one can even
write the Log Parser query once,
in a plain text file
(that accepts params)
and store this text file
in a network location
that all
(office, sales, admins, shipping clerks)
can use!
Ah double one's data parsing fun,
automate Log Parser within PowerShell.
"Shay Levy [MVP]" <n...@addre.ss> wrote in message
news:uKMQAhc7...@TK2MSFTNGP05.phx.gbl...
>
> I removed the outer quotes cause otherwise the command is just a string,
> PowerShell won't execute it (unless invoked):
Using measure-command on it? I cheated and instead used get-date before and
after completion to do the measurement; not nearly so nice, but it let me at
least measure the darned thing ;).
> When invoked, the cmd expression is faster but it leaves you with raw
> strings that you need to parse/concat and then use get-childitem to turn
> them into full .net objects, and all of that can eventually take the
> precious seconds we are trying to save. Which one is better/elegant?
> That's for the readers to decide :)
I'm not touching the elegance issue. : )
As for better, there's only one situation where I think the cmd.exe choice
may be better. If you're trying to perform a file enumeration operation that
is simply too time-consuming with Get-ChildItem, and if it's a repeatable
problem, then using cmd.exe makes sense to me as a workaround - something
like putting inline assembler into a program in order to get around a
difficult problem, where the only alternative is not solving the problem at
all.
Of course, if this was a REALLY big demand, someone could write a nice C++
DLL to handle enumeration and name-based filtering, and then wrap it up in a
specialty PowerShell cmdlet to provide the best of both worlds. :)
---
Shay Levy
Windows PowerShell MVP
blog: http://blogs.microsoft.co.il/blogs/ScriptFanatic
Mmm so within PowerShell,
one is back again to using
Log Parser! :)
"Flowering Weeds" <flowering...@hotmail.com> wrote in message
news:e3z0pNo7...@TK2MSFTNGP04.phx.gbl...