In fact, what Has.Property(string name) by itself does is just check for
the existence of the property. But if you follow it by .AnyThing, the
operation applies to the value of the property.
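Outside NUnit, the distinction can be sketched with plain reflection. This is just an illustration of the two behaviors; the Person class and helper names are hypothetical, not NUnit's implementation:

```csharp
using System;

class Person
{
    public string Name { get; set; }  // may hold null
}

class Program
{
    // Existence check: does the type expose a public property with this name?
    static bool HasProperty(object o, string name) =>
        o.GetType().GetProperty(name) != null;

    // Value check: fetch the property's value and apply a further test to it.
    static bool PropertyValueMatches(object o, string name, Func<object, bool> test)
    {
        var prop = o.GetType().GetProperty(name);
        return prop != null && test(prop.GetValue(o, null));
    }

    static void Main()
    {
        var p = new Person { Name = null };

        // Has.Property("Name") by itself succeeds: the property exists...
        Console.WriteLine(HasProperty(p, "Name"));          // True

        // ...but a chained constraint runs against the (null) value.
        Console.WriteLine(PropertyValueMatches(p, "Name",
            v => v is string));                             // False
    }
}
```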
In a certain way, this makes sense, because most constraints can only
be applied to a value. On the other hand, it's intuitive to expect
your example to work, even with a null value.
So it's not a bug, because it works as designed, but maybe it's
a bug in the design.
What do others think?
Charlie
> --
> You received this message because you are subscribed to the Google Groups "NUnit-Discuss" group.
> To post to this group, send email to nunit-...@googlegroups.com.
> To unsubscribe from this group, send email to nunit-discus...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/nunit-discuss?hl=en.
>
>
What would happen if the property is of type Object and only contains a
string?
Best Regards, David
Yes, there's really no clean way to "fix" this except to change the syntax
so it doesn't sound like it will do something it doesn't. Something like
Has.Property("name").Value.TypeOf(SomeType)
would make it clearer, but I'm not sure people would like requiring it,
since it's an extra thing to type.
Charlie
I originally added With for the purpose of making expressions read
more naturally, but it's not a complete no-op. With binds more tightly
than either And or Or, so this statement...

Assert.That(array,
    Contains.Item.With.Property("x").LessThan(100).And.GreaterThan(50));

has a different meaning than this one...

Assert.That(array,
    Contains.Item.Property("x").LessThan(100).And.GreaterThan(50));

The first applies both LessThan and GreaterThan to the property value,
while the second applies LessThan to the property value and GreaterThan
to the array itself, which causes an error. Unfortunately, this isn't
well documented, so it's probably not used much. :-(
In any case, we could try to give With some other meaning in 3.0, but
it's hard to see what it might be. Your example seems to work with TypeOf,
because of the word "type", but what about the situation where we actually
_want_ to check the Type of the _value_?

I'm convinced we need to do something to distinguish whether further
constraints apply to the property itself or to the value, but I'm still
not sure how best to do it. One approach would be to use 'Value' (or
With.Value), but that still leaves the question of what it means when
you use TypeOf by itself. Could it apply to the value unless the value
is null, and apply to the type in that case? Or is that too confusing?
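David's earlier question about a property declared as Object but holding a string shows the ambiguity concretely. A minimal reflection sketch (the Holder class is hypothetical) makes the two answers visible:

```csharp
using System;

class Holder
{
    public object Item { get; set; }  // declared type: object
}

class Program
{
    static void Main()
    {
        var h = new Holder { Item = "hello" };
        var prop = h.GetType().GetProperty("Item");

        // The type of the property itself (its declared type)...
        Console.WriteLine(prop.PropertyType);                 // System.Object

        // ...versus the type of the value it currently holds.
        Console.WriteLine(prop.GetValue(h, null).GetType());  // System.String
    }
}
```

A TypeOf constraint could legitimately mean either of these, which is exactly the distinction under discussion.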
BTW, this problem also applies to Attributes.
Charlie
Yes, specifically, the fluent syntax suffers from the fact that it is
linear. I've
sometimes thought of introducing parentheses as you have done in this
example. I wonder if we could augment the current syntax in this way
without too much disruption for the simple cases.
Charlie
I'm thinking about the increase in interface surface and how it affects the
programmer using NUnit. For example, the user sees property All as
well as method EveryItem in the intellisense and wonders what the
difference is.
Of course, that's not a reason to add stuff, just a reason for adding
it carefully.
Charlie
Nice idea, but could you please take care to overload the correct
operators? I'd be very surprised if bitwise-and were used as a logical
operator.
Adding operators also starts to open up the question whether this won't
become as complex as using an Expression (uncompiled lambda
representation) without the familiar syntax. e.g.:
Assert.That(array, Contains.AnyItem(i => 50 < i.x && i.x < 100));
Best Regards, David
Actually, we have always (at least since we had Assert.That) had operator
overloads for & and | as applied to constraints.
Use of bitwise-and and -or is necessary because
1) we are combining two constraints to form a new constraint
2) you can't overload logical && and ||
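The mechanics can be sketched with a minimal stand-in constraint type. This is illustrative only, not NUnit's actual classes:

```csharp
using System;

// A minimal stand-in for a constraint (illustrative, not NUnit's API).
class Constraint
{
    readonly Func<object, bool> match;
    public Constraint(Func<object, bool> match) { this.match = match; }
    public bool Matches(object actual) => match(actual);

    // Overloading & combines two constraints into a new one.
    // C# does not allow overloading && or ||, which is why the
    // bitwise forms are used.
    public static Constraint operator &(Constraint left, Constraint right) =>
        new Constraint(o => left.Matches(o) && right.Matches(o));

    public static Constraint operator |(Constraint left, Constraint right) =>
        new Constraint(o => left.Matches(o) || right.Matches(o));
}

class Program
{
    static void Main()
    {
        var greaterThan50 = new Constraint(o => (int)o > 50);
        var lessThan100   = new Constraint(o => (int)o < 100);

        var both = greaterThan50 & lessThan100;
        Console.WriteLine(both.Matches(75));   // True
        Console.WriteLine(both.Matches(150));  // False
    }
}
```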
> Assert.That(array, Contains.AnyItem(i => 50 < i.x && i.x < 100));
We currently support lambdas using the PredicateConstraint and
the Matches syntactic element.
Assert.That(array, Has.Some.Matches(i => 50 < i.x && i.x < 100));
Charlie
>
> Best Regards, David
Nice write-up! Have you seen Charlie's note about using lambdas with the
PredicateConstraint in this thread? It'd be nice if you could compare
that too!
Best Regards, David
Assert.That(new []{new X{x=1}}, Has.Some.Matches<X>(i => 5 < i.x && i.x < 10));

Unlike the Expression example, which has the expression printed as
expected, here you get an error message with no detail of the lambda logic:

Test 'MyTest.Test' failed:
  Expected: some item value matching lambda expression
  But was:  < <X(1)> >
AccessorTest.cs(27,0): at SharpCut.UnitTests.AccessorTest`1.Test()
Of course, as we're talking of the future, we could overload
PredicateConstraint to take an Expression in the framework builds
for .NET 3.5 and higher.
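As a rough sketch of why that would help (the helper names here are hypothetical, not proposed NUnit API): an Expression keeps its source structure available, so the failure message can show the predicate itself.

```csharp
using System;
using System.Linq.Expressions;

class Program
{
    // Accepting an Expression instead of a plain delegate lets a
    // failure message describe the predicate's actual logic.
    static string Describe<T>(Expression<Func<T, bool>> criteria) =>
        "value matching " + criteria.Body.ToString();

    // Compile once and invoke; the expression tree remains available
    // for building messages.
    static bool Evaluate<T>(Expression<Func<T, bool>> criteria, T actual) =>
        criteria.Compile()(actual);

    static void Main()
    {
        Expression<Func<int, bool>> pred = i => 50 < i && i < 100;
        Console.WriteLine(Describe(pred));      // value matching ((50 < i) AndAlso (i < 100))
        Console.WriteLine(Evaluate(pred, 75));  // True
    }
}
```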
Charlie
While we could build in some knowledge of the most common libraries, it
would be especially cool if there were a way to add that knowledge as
some sort of extension.
Charlie
Yes, that's what I meant. Mono.Addins allows extensions without any code
involved. The extension is just information in the XML file. Menu items
are often added this way, for example.
At the time when NUnit is filtering a stack trace it only has the text of
the stack trace to work with. So the easiest implementation would be to
use full names of any classes to filter out. It's also possible
to do a more "correct" implementation that allows for the assembly
name as well, but I think I would wait to see if any ambiguity arose
from using the class names only.
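A text-only filter along those lines might look like this (FilterTrace and the class names are hypothetical, not NUnit's actual implementation):

```csharp
using System;
using System.Linq;

class Program
{
    // Sketch of text-based stack-trace filtering: since only the raw
    // text is available, drop any line mentioning a full class name
    // that should be excluded.
    static string FilterTrace(string trace, string[] excludedClasses)
    {
        var lines = trace.Split('\n')
            .Where(line => !excludedClasses.Any(cls => line.Contains(cls)));
        return string.Join("\n", lines);
    }

    static void Main()
    {
        string trace =
            "at NUnit.Framework.Assert.That(...)\n" +
            "at My.Tests.SomeFixture.SomeTest()";

        // Lines from the excluded class are removed; test lines remain.
        Console.WriteLine(FilterTrace(trace,
            new[] { "NUnit.Framework.Assert" }));
    }
}
```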
Charlie
Well, first off, the constraint is independent of the syntax, so there
could be other syntactic expressions that invoke it.

But it might make sense to have Matches plus a lambda invoke
ExpressionConstraint in the 3.5 and 4.0 builds and PredicateConstraint
in the 2.0 build. Alternatively, PredicateConstraint could evolve to
include a constructor for an Expression in its 3.5+ incarnation.

But that's all details... the real question is "Would it be useful?"

Charlie
Separate builds for .NET 2.0, 3.5 and 4.0 have always been part of the
design of NUnit 3.0, with the higher builds offering more features. I
guess this can be called incompatibility, but I don't think it's a
problem, unless you had a reason to build the tests against several
builds of the framework.

I can see that this is slightly different from other cases because code
that worked one way under 2.0 would work a bit differently under 3.5+.
But since the only difference we intend is to give a better error
message, my guess is that it's not going to cause a problem.
Charlie
On Sun, Aug 7, 2011 at 2:55 PM, Kenneth Xu <ken...@homexu.com> wrote:
> Hi Charlie,
>
> On Sun, Aug 7, 2011 at 5:17 PM, Charlie Poole <nuni...@gmail.com> wrote:
>>
>> Separate builds for .NET 2.0, 3.5 and 4.0 have always been part of the
>> design
>> of NUnit 3.0, with the higher builds offering more features. I guess this
>> can
>> be called incompatibility, but I don't think it's a problem,
>
>
> Hmm... more features doesn't have to be incompatible. For example, if we
> just add one overloaded method:
>
> Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}
>
> then it is fully binary compatible.
FWIW, when I have talked about full compatibility, I meant both
forward and backward. We only aim for backward compatibility.
Note too that we can't have binary compatibility between the
2.0/3.0 builds and the 4.0 build.
> But if a method's signature is changed,
> then it is no longer binary compatible.
>
>>
>> unless you had a reason to build the tests against several builds of the
>> framework.
>
>
> I think it is more a problem for people writing common utility/plugin for
> NUnit. If the utility/plugin only needs features for 2.0 build. I should
> have no need to create multiple builds if NUnit multiple builds are binary
> compatible.
>
>>
>> I can see that this is slightly different from other cases because code
>> that
>> worked one way under 2.0 would work a bit differently under 3.5+. But
>> since
>> the only difference we intend is to give a better error message, my guess
>> is that it's not going to cause a problem.
>
>
> It is a common understanding that if I build an assembly for .Net 2.0, I can
> run it under .Net 3.5+ without recompiling. I would expect the same for
> NUnit. I.e., if I build an assembly with NUnit for .Net 2.0, I should be
> able to run with NUnit for .Net 3.5+ without recompiling.
If you build a test against the NUnit framework for .NET 2.0, it will
always run
against that framework because it's a reference. The test engine and console
runner (as well as the coming gui runner) are completely separate from the
framework and make no reference to it - just as now, in fact.
If we continue to support addins in the framework then we will have to maintain
a common interface for addins, but that's separate from the interface used
by tests.
So I really don't see this as a problem, but even so, we could use a different
syntactic element to avoid confusion if it seems better.
Charlie
> Just my 2 cents :)
>
>
> Cheers,
> Kenneth
>
On Sun, Aug 7, 2011 at 4:35 PM, Kenneth Xu <ken...@homexu.com> wrote:
> Hi Charlie,
>
> Thanks for your patience being with me thus far.
>
> On Sun, Aug 7, 2011 at 6:51 PM, Charlie Poole <nuni...@gmail.com> wrote:
>>
>> > Hmm... more features doesn't have to be incompatible. For example, if we
>> > just add one overloaded method:
>> >
>> > Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}
>> >
>> > then it is fully binary compatible.
>>
>> FWIW, when I have talked about full compatibility, I meant both
>> forward and backward. We only aim for backward compatibility.
>
> My bad, shouldn't have used word "fully". I meant backward binary
> compatibility from build for higher .Net framework to lower .Net framework
> of same version of NUnit 3.x.
>>
>> Note too that we can't have binary compatibility between the
>> 2.0/3.0 builds and the 4.0 build.
>
> Just want to confirm, do you mean by design, there won't be such backward
> binary compatibility?
OK, up to now, I've only been considering test code, for which binary
compatibility doesn't make much sense - a given test assembly always
depends on a particular version of NUnit (currently) and will depend on
a particular version and build (future).
> It may be easier with an example. Let's say I'm writing some reusable test
> library (x.dll), one function is to test any class that implements a given
> interface. That x.dll will be in turn used in various test projects built
> for various .Net frameworks. Further, x.dll builds well with NUnit framework
> for 2.0. Do you mean that I'm now forced to create different builds of x.dll
> for each corresponding NUnit framework builds?
I think it depends on whether your library has a reference to a particular
nunit.framework build. If there's a reference, then I think you would have to
create a different library for each framework you support. OTOH, if the library
were an addin of some sort - something loaded by NUnit rather than the other
way around - that wouldn't be an issue.
Also, I could be wrong about this. It might be possible to give all the
different builds the same identity and trick your library into using the one
that's found in the directory containing the tests. When I get a chance
I'll run some experiments to test this.
Charlie
> Thanks,