Is this a bug?


const

unread,
Jul 27, 2011, 3:57:32 PM7/27/11
to NUnit-Discuss
The test at the end fails with the result:

NUnitBugs.Tests.TestMe:
Expected: property AltName <System.String>
But was: null

whereas if AltName had been set to a value, e.g. AltName="foo" then it
succeeds.

I had taken the TypeOf() method to apply to the type of the property,
as opposed to the type of the value it returns, which I think is what's
happening here: implicitly the property getter is being invoked, which
returns null, hence the disparity in types. When AltName is set to
'foo' as above, a string is returned, so the test passes.

What's the correct behaviour: should this be testing the type of the
property or its value? If the former, then this looks like a bug, but
if it's the latter, any ideas on how the former is achieved?

using NUnit.Framework;

namespace NUnitBugs
{
    public class NullStringProp
    {
        private string AltName { get; set; }
    }

    class Tests
    {
        [Test]
        public static void TestMe()
        {
            NullStringProp foo = new NullStringProp();

            Assert.That(foo, Has.Property("AltName").TypeOf<string>());
        }
    }
}

Thanks,

Pete

Charlie Poole

unread,
Jul 27, 2011, 9:37:27 PM7/27/11
to nunit-...@googlegroups.com
Short answer: Maybe.

In fact, what Has.Property(string name) does by itself is just check for
the existence of the property. But if you follow it with anything further,
the operation applies to the value of the property.

In a certain way, this makes sense, because most constraints can only
be applied to a value. On the other hand, it's intuitive to expect
your example to work, even with a null value.
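
As a sketch of the two behaviors described above (names here are hypothetical stand-ins for the original post's code, with the property made public so the constraint can see it; assumes NUnit 2.x is referenced):

```csharp
using NUnit.Framework;

public class Example
{
    // Hypothetical stand-in for the original NullStringProp.
    public string AltName { get; set; }
}

[TestFixture]
public class PropertyBehaviorSketch
{
    [Test]
    public void ExistenceCheckOnly()
    {
        var obj = new Example();   // AltName is null

        // Bare Has.Property only asserts that a property named "AltName"
        // exists, so this passes even though the value is null.
        Assert.That(obj, Has.Property("AltName"));
    }

    [Test]
    public void TrailingConstraintSeesTheValue()
    {
        var obj = new Example { AltName = "foo" };

        // A trailing constraint applies to the property's *value*.
        // With AltName = "foo" this passes; with AltName left null it
        // fails exactly as in Pete's report.
        Assert.That(obj, Has.Property("AltName").TypeOf<string>());
    }
}
```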

So it's not a bug, because it works as designed, but maybe it's
a bug in the design.

What do others think?

Charlie

> --
> You received this message because you are subscribed to the Google Groups "NUnit-Discuss" group.
> To post to this group, send email to nunit-...@googlegroups.com.
> To unsubscribe from this group, send email to nunit-discus...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/nunit-discuss?hl=en.
>
>

David Schmitt

unread,
Jul 28, 2011, 11:47:30 AM7/28/11
to nunit-...@googlegroups.com
I've never seen the Has.Property.TypeOf before, but I'd say failing on
null is the easy way out.

What would happen if the property is of type Object and only contains a
string?


Best Regards, David

Charlie Poole

unread,
Jul 28, 2011, 1:35:44 PM7/28/11
to nunit-...@googlegroups.com
Hi David,

Yes, there's really no clean way to "fix" this except to change the syntax
so that it doesn't sound like it will do something it doesn't. Something like

Has.Property("name").Value.TypeOf(SomeType)

would make it clearer, but I'm not sure people would like requiring it,
as it's an extra thing to type.

Charlie

const

unread,
Aug 2, 2011, 3:38:06 AM8/2/11
to NUnit-Discuss
I noticed that the constraint-based syntax contains the currently
unimplemented 'With' clause. Would something along the following
lines be possible:

Assert.That(foo, Has.Property("AltName").With.TypeOf<string>());

If not, I think it would be ideal to have what follows Has.Property()
validate the property, as opposed to implicitly retrieving the
property's value, i.e. so that Has.Property("AltName").Value is not
implicit but needs to be called explicitly, e.g.

Assert.That(foo,
    Has.Property("AltName").With.Value.Equal("foobar"));

Thanks,

Pete

Charlie Poole

unread,
Aug 2, 2011, 10:49:58 AM8/2/11
to nunit-...@googlegroups.com
Hi Pete,

I originally added With to make expressions read more naturally,
but it's not a complete no-op. With has a higher precedence than
either And or Or, so this statement...

Assert.That(array,
    Contains.Item.With.Property("x").LessThan(100).And.GreaterThan(50));

has a different meaning than this one...

Assert.That(array,
    Contains.Item.Property("x").LessThan(100).And.GreaterThan(50));

The first applies both LessThan and GreaterThan to the property value,
while the second applies LessThan to the property and GreaterThan to
the array itself, which causes an error. Unfortunately, this isn't
well documented so it's probably not used much. :-(

In any case, we could try to give With some other meaning in 3.0, but
it's hard to see what it might be. Your example seems to work with
TypeOf, because of the word "type", but what about the situation where
we actually _want_ to check the Type of the _value_?

I'm convinced we need to do something to distinguish whether further
constraints apply to the property itself or to the value, but I'm
still not sure how best to do it. One approach would be to use 'Value'
(or With.Value), but that still leaves the question of what it means
when you use TypeOf by itself. Could it apply to the value unless the
value is null, and apply to the type in that case? Or is that too
confusing?

BTW, this problem also applies to Attributes.

Charlie

Kenneth Xu

unread,
Aug 2, 2011, 3:55:04 PM8/2/11
to nunit-...@googlegroups.com
Hi Charlie,

Since we're at it, I'd like to share my thoughts on this. While a fluent API is nice in many use cases, handling complex and/or logic is never quite straightforward. I feel this is because human language was never good at it.

I'm wondering if operator overloading can help make the syntax easier to understand, e.g.

Assert.That(array, Contains.AnyItem(Has.Property("x").Value(Is.LessThan(100))) & Is.GreaterThan(50));
Assert.That(array, Contains.EveryItem(Has.Property("x").Value(Is.LessThan(100) & Is.GreaterThan(50))));

Cheers,
Kenneth

Charlie Poole

unread,
Aug 2, 2011, 6:12:22 PM8/2/11
to nunit-...@googlegroups.com
Hi Kenneth,

Yes, specifically, the fluent syntax suffers from the fact that it is
linear. I've
sometimes thought of introducing parentheses as you have done in this
example. I wonder if we could augment the current syntax in this way
without too much disruption for the simple cases.

Charlie

Kenneth Xu

unread,
Aug 3, 2011, 12:03:23 AM8/3/11
to nunit-...@googlegroups.com
Hi Charlie,

For the examples I gave, EveryItem(), AnyItem() and Value() are all new methods, so there should be no disruption for simple cases.

Cheers,
Kenneth

Charlie Poole

unread,
Aug 3, 2011, 12:29:59 AM8/3/11
to nunit-...@googlegroups.com
Hi Kenneth,

I'm thinking about the increase in the interface surface and how it affects
the programmer using NUnit. For example, the user sees the property All as
well as the method EveryItem in intellisense and wonders what the
difference is.

Of course, that's not a reason to add stuff, just a reason for adding
it carefully.

Charlie

David Schmitt

unread,
Aug 3, 2011, 5:51:49 AM8/3/11
to nunit-...@googlegroups.com
On 02.08.2011 21:55, Kenneth Xu wrote:
> Hi Charlie,
>
> Since we are at that, I'd like to share my thoughts on this. While
> fluent API is nice in many use cases, handing a complex and/or logic is
> never quite straightforward. I felt this is due to the fact that human
> language was never good at it.
>
> I'm wondering if operator overloading can help to make the syntax more
> easier to understand. e.g.
>
> Assert.That(array,
> Contains.AnyItem(Has.Property("x").Value(Is.LessThan(100))) &
> Is.GreaterThan(50));
> Assert.That(array,
> Contains.EveryItem(Has.Property("x").Value(Is.LessThan(100) &
> Is.GreaterThan(50))));

Nice idea, but could you please take care to overload the correct
operators? I'd be very surprised if bitwise-and were used as a logical
operator.

Adding operators also starts to open up the question of whether this won't
become as complex as using an Expression (an uncompiled lambda
representation) without the familiar syntax, e.g.:

Assert.That(array, Contains.AnyItem(i => 50 < i.x && i.x < 100));


Best Regards, David

Greg Young

unread,
Aug 3, 2011, 8:50:06 AM8/3/11
to nunit-...@googlegroups.com
Would that be adding operator overloads or just an expression?

--
Doubt is not an agreeable condition, but certainty is an absurd one.

Charlie Poole

unread,
Aug 3, 2011, 9:25:59 AM8/3/11
to nunit-...@googlegroups.com
Hi David,

Actually, we have always (at least since we had Assert.That) had operator
overloads for & and | as applied to constraints.

Use of bitwise-and and -or is necessary because
1) we are combining two constraints to form a new constraint
2) you can't overload logical && and ||
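
As a minimal sketch of what those overloads give you (standard NUnit constraint combination, assuming NUnit is referenced):

```csharp
using NUnit.Framework;

[TestFixture]
public class OperatorOverloadSketch
{
    [Test]
    public void CombinedConstraints()
    {
        // '&' combines two constraints into a single new constraint,
        // equivalent to the fluent Is.GreaterThan(50).And.LessThan(100).
        Assert.That(75, Is.GreaterThan(50) & Is.LessThan(100));

        // '|' is the overloaded counterpart for logical or.
        Assert.That(42, Is.LessThan(50) | Is.GreaterThan(100));
    }
}
```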

>  Assert.That(array, Contains.AnyItem(i => 50 < i.x && i.x < 100));

We currently support lambdas using the PredicateConstraint and
the Matches syntactic element.

Assert.That(array, Has.Some.Matches(i => 50 < i.x && i.x < 100));

Charlie

>
> Best Regards, David

Kenneth Xu

unread,
Aug 3, 2011, 4:02:59 PM8/3/11
to nunit-...@googlegroups.com
Hi David,

Very interesting topic. I did some research and posted some examples at the link below, as I think it is too long to include here.

http://kennethxu.blogspot.com/2011/08/use-linq-expression-for-unit-test.html

Cheers,
Kenneth

Kenneth Xu

unread,
Aug 3, 2011, 4:09:53 PM8/3/11
to nunit-...@googlegroups.com
Hi Charlie,

I see your point. The only difference is that the new additions are all methods instead of properties, and I admit that it could easily raise questions from users about when to use what.

Maybe just let 2.0 users stick with what it is; 3.5+ users can use Expression easily anyway.

Thanks,
Kenneth

David Schmitt

unread,
Aug 4, 2011, 5:10:48 AM8/4/11
to nunit-...@googlegroups.com
On 03.08.2011 22:02, Kenneth Xu wrote:
> Hi David,
>
> Very interesting topic. I did some research and have some examples
> posted to the link below as I think it is too long to have it in here.
>
> http://kennethxu.blogspot.com/2011/08/use-linq-expression-for-unit-test.html

Nice write-up! Have you seen Charlie's note about using lambdas with the
PredicateConstraint in this thread? It'd be nice if you could compare
that too!


Best Regards, David

Kenneth Xu

unread,
Aug 6, 2011, 1:23:05 AM8/6/11
to nunit-...@googlegroups.com
Hi David,

Yes, I have seen Charlie's note about PredicateConstraint. The major difference is that it uses a delegate instead of an Expression, so when a test fails the message loses the detail. For example, when the assert below fails,

Assert.That(new []{new X{x=1}}, Has.Some.Matches<X>(i => 5 < i.x && i.x < 10));

unlike the Expression example, which has the expression printed as expected, you get an error message with no detail of the lambda logic:

Test 'MyTest.Test' failed:
  Expected: some item value matching lambda expression
  But was:  < <X(1)> >
    AccessorTest.cs(27,0): at SharpCut.UnitTests.AccessorTest`1.Test()

While I can use Expression to replace all the constraints and still get very meaningful information when a test fails, PredicateConstraint leaves me blind when looking at the test result.
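
A minimal sketch (not from the original post) of why the Expression form keeps failure messages readable: an expression tree retains a printable form of the lambda, while a compiled delegate does not.

```csharp
using System;
using System.Linq.Expressions;

class ExpressionDemo
{
    static void Main()
    {
        Expression<Func<int, bool>> expr = i => 5 < i && i < 10;

        // The expression tree can be rendered back to readable text,
        // so a constraint built on it can echo the criteria on failure,
        // e.g. "i => ((5 < i) AndAlso (i < 10))".
        Console.WriteLine(expr.ToString());

        // Compiling yields an opaque delegate: it can be invoked,
        // but the original lambda text is gone.
        Func<int, bool> compiled = expr.Compile();
        Console.WriteLine(compiled(7));   // True
    }
}
```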

Hope that explains why I didn't consider it as comparable.

Cheers,
Kenneth


Charlie Poole

unread,
Aug 6, 2011, 9:30:04 AM8/6/11
to nunit-...@googlegroups.com
Hi Kenneth,

Of course, as we're talking of the future, we could overload
PredicateConstraint to
take an Expression in the framework builds for .NET 3.5 and higher.

Charlie

Kenneth Xu

unread,
Aug 7, 2011, 12:45:42 AM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

It would be great to have that built in, although I can do it easily myself today, as demonstrated.

Another direction is to provide better support for/integration with general assertion libraries instead of building one, e.g. treat PAssertFailedException similarly to AssertionException and strip stack frames from those assertion DLLs.

Cheers,
Kenneth

Charlie Poole

unread,
Aug 7, 2011, 1:21:56 AM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

While we could build in some knowledge of the most common libraries, it
would be especially cool if there were a way to add that knowledge as
some sort of extension.

Charlie

Kenneth Xu

unread,
Aug 7, 2011, 12:23:53 PM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

I think the extensibility can simply be some kind of configuration. For each general assertion lib, I guess we only need the name of the assembly and the assembly-qualified class name to provide good integration. This also removes the direct dependency on a specific version of the assertion lib.

Cheers,
Kenneth

Charlie Poole

unread,
Aug 7, 2011, 3:01:26 PM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

Yes, that's what I meant. Mono.Addins allows extensions without any code
involved. The extension is just information in the XML file. Menu items
are often added this way, for example.

At the time when NUnit is filtering a stack trace it only has the text of
the stack trace to work with. So the easiest implementation would be to
use full names of any classes to filter out. It's also possible
to do a more "correct" implementation that allows for the assembly
name as well, but I think I would wait to see if any ambiguity arose
from using the class names only.
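
A hypothetical sketch of the text-based filtering described above (this is not NUnit's actual code; the helper name and shape are assumptions):

```csharp
using System;
using System.Linq;

public static class TraceFilterDemo
{
    // Hypothetical helper: drop stack-trace lines that mention any of
    // the given full class names, keeping everything else verbatim.
    public static string FilterTrace(string trace, params string[] classesToHide)
    {
        var kept = trace.Split('\n')
                        .Where(line => !classesToHide.Any(line.Contains));
        return string.Join("\n", kept);
    }

    public static void Main()
    {
        string trace = "at PAssert.That(...)\nat MyTests.SomeTest()";

        // Frames from the assertion library are stripped; only the
        // user's own frame survives.
        Console.WriteLine(FilterTrace(trace, "PAssert"));
    }
}
```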

Charlie

Charlie Poole

unread,
Aug 7, 2011, 3:21:31 PM8/7/11
to nunit-...@googlegroups.com
So should we include ExpressionConstraint in 3.0 do you think?

Charlie

Kenneth Xu

unread,
Aug 7, 2011, 3:56:03 PM8/7/11
to nunit-...@googlegroups.com
Let's assume the answer is yes. How are we going to use this without ambiguity? We cannot overload Matches, as it will certainly cause ambiguity when you write Matches(x => x == 10);

Charlie Poole

unread,
Aug 7, 2011, 4:16:55 PM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

Well, first off, the constraint is independent of the syntax, so there
could be other syntactic expressions that invoke it.

But it might make sense to have Matches plus a lambda invoke
ExpressionConstraint in the 3.5 and 4.0 builds and PredicateConstraint
in the 2.0 build. Alternatively, PredicateConstraint could evolve to
include a constructor for an Expression in its 3.5+ incarnation.

But that's all details... the real question is "Would it be useful?"

Charlie

Kenneth Xu

unread,
Aug 7, 2011, 4:53:02 PM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

On Sun, Aug 7, 2011 at 4:16 PM, Charlie Poole <nuni...@gmail.com> wrote:

> But it might make sense to have Matches plus a lambda invoke
> ExpressionConstraint in the 3.5 and 4.0 builds

That means Matches will take parameter type Expression<Predicate<T>> in 3.5+

> and PredicateConstraint in the 2.0 build.

And parameter type Predicate<T> in the 2.0 build.

Now the 3.5 build is not binary compatible with the 2.0 build, i.e. an assembly (some common utility) that uses Matches and is compiled against the 2.0 build will not run with the 3.5 build.

> Alternatively, PredicateConstraint could evolve to include a constructor
> for an Expression in its 3.5+ incarnation.

That has the same problems: overloading causes ambiguity, and replacing causes an incompatibility issue.

> But that's all details... the real question is "Would it be useful?"

IMHO, it would be useful if it is easy to use and there are no side effects.

Cheers,
Kenneth
 

Charlie Poole

unread,
Aug 7, 2011, 5:17:15 PM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

Separate builds for .NET 2.0, 3.5 and 4.0 have always been part of the design
of NUnit 3.0, with the higher builds offering more features. I guess this can
be called incompatibility, but I don't think it's a problem, unless
you had a reason
to build the tests against several builds of the framework.

I can see that this is slightly different from other cases because code that
worked one way under 2.0 would work a bit differently under 3.5+. But since
the only difference we intend is to give a better error message, my guess
is that it's not going to cause a problem.

Charlie

Kenneth Xu

unread,
Aug 7, 2011, 5:55:27 PM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

On Sun, Aug 7, 2011 at 5:17 PM, Charlie Poole <nuni...@gmail.com> wrote:

> Separate builds for .NET 2.0, 3.5 and 4.0 have always been part of the design
> of NUnit 3.0, with the higher builds offering more features. I guess this can
> be called incompatibility, but I don't think it's a problem,

Hmm... more features doesn't have to mean incompatibility. For example, if we just add one overloaded method:

Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}

then it is fully binary compatible. But if a method's signature is changed, then it is no longer binary compatible.

> unless you had a reason to build the tests against several builds of the framework.

I think it is more a problem for people writing a common utility/plugin for NUnit. If the utility/plugin only needs features of the 2.0 build, I should have no need to create multiple builds, provided the NUnit builds are binary compatible.

> I can see that this is slightly different from other cases because code that
> worked one way under 2.0 would work a bit differently under 3.5+. But since
> the only difference we intend is to give a better error message, my guess
> is that it's not going to cause a problem.

It is a common understanding that if I build an assembly for .NET 2.0, I can run it under .NET 3.5+ without recompiling. I would expect the same for NUnit, i.e. if I build an assembly with NUnit for .NET 2.0, I should be able to run with NUnit for .NET 3.5+ without recompiling.

Just my 2 cents :)

Cheers,
Kenneth

Charlie Poole

unread,
Aug 7, 2011, 6:51:38 PM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

On Sun, Aug 7, 2011 at 2:55 PM, Kenneth Xu <ken...@homexu.com> wrote:
> Hi Charlie,
>
> On Sun, Aug 7, 2011 at 5:17 PM, Charlie Poole <nuni...@gmail.com> wrote:
>>
>> Separate builds for .NET 2.0, 3.5 and 4.0 have always been part of the
>> design
>> of NUnit 3.0, with the higher builds offering more features. I guess this
>> can
>> be called incompatibility, but I don't think it's a problem,
>
>
> Hmm... more features doesn't have to be incompatible. For example, if we
> just add one overloaded method:
>
> Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}
>
> then it is fully binary compatible.

FWIW, when I have talked about full compatibility, I meant both
forward and backward. We only aim for backward compatibility.

Note too that we can't have binary compatibility between the
2.0/3.0 builds and the 4.0 build.

> But if a method's signature is changed,
> then it is no longer binary compatible.
>
>>
>> unless you had a reason to build the tests against several builds of the
>> framework.
>
>
> I think it is more a problem for people writing common utility/plugin for
> NUnit. If the utility/plugin only needs features for 2.0 build. I should
> have no need to create multiple builds if NUnit multiple builds are binary
> compatible.
>
>>
>> I can see that this is slightly different from other cases because code
>> that
>> worked one way under 2.0 would work a bit differently under 3.5+. But
>> since
>> the only difference we intend is to give a better error message, my guess
>> is that it's not going to cause a problem.
>
>
> It is a common understanding that if I build an assembly for .Net 2.0, I can
> run it under .Net 3.5+ without recompiling. I would expect the same for
> NUnit. I.e., if I build an assembly with NUnit for .Net 2.0, I should be
> able to run with NUnit for .Net 3.5+ without recompiling.

If you build a test against the NUnit framework for .NET 2.0, it will
always run
against that framework because it's a reference. The test engine and console
runner (as well as the coming gui runner) are completely separate from the
framework and make no reference to it - just as now, in fact.

If we continue to support addins in the framework then we will have to maintain
a common interface for addins, but that's separate from the interface used
by tests.

So I really don't see this as a problem, but even so, we could use a different
syntactic element to avoid confusion if it seems better.

Charlie

> Just my 2 cents :)
>
>
> Cheers,
> Kenneth
>

Kenneth Xu

unread,
Aug 7, 2011, 7:35:32 PM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

Thanks for your patience being with me thus far.

On Sun, Aug 7, 2011 at 6:51 PM, Charlie Poole <nuni...@gmail.com> wrote:

>> Hmm... more features doesn't have to mean incompatibility. For example,
>> if we just add one overloaded method:
>>
>> Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}
>>
>> then it is fully binary compatible.
>
> FWIW, when I have talked about full compatibility, I meant both
> forward and backward. We only aim for backward compatibility.

My bad, I shouldn't have used the word "fully". I meant backward binary compatibility from the build for a higher .NET framework to that for a lower .NET framework within the same version of NUnit 3.x.

> Note too that we can't have binary compatibility between the
> 2.0/3.0 builds and the 4.0 build.

Just to confirm: do you mean that, by design, there won't be such backward binary compatibility?

It may be easier with an example. Let's say I'm writing a reusable test library (x.dll), one function of which is to test any class that implements a given interface. That x.dll will in turn be used in various test projects built for various .NET frameworks. Further, x.dll builds fine against the NUnit framework for 2.0. Do you mean that I'm now forced to create a different build of x.dll for each corresponding NUnit framework build?

Thanks,
Kenneth

Charlie Poole

unread,
Aug 7, 2011, 8:16:50 PM8/7/11
to nunit-...@googlegroups.com
Hi Kenneth,

On Sun, Aug 7, 2011 at 4:35 PM, Kenneth Xu <ken...@homexu.com> wrote:
> Hi Charlie,
>
> Thanks for your patience being with me thus far.
>
> On Sun, Aug 7, 2011 at 6:51 PM, Charlie Poole <nuni...@gmail.com> wrote:
>>
>> > Hmm... more features doesn't have to be incompatible. For example, if we
>> > just add one overloaded method:
>> >
>> > Assert.That<T>(T value, Expression<Predicate<T>> criteria) {...}
>> >
>> > then it is fully binary compatible.
>>
>> FWIW, when I have talked about full compatibility, I meant both
>> forward and backward. We only aim for backward compatibility.
>
> My bad, shouldn't have used word "fully". I meant backward binary
> compatibility from build for higher .Net framework to lower .Net framework
> of same version of NUnit 3.x.
>>
>> Note too that we can't have binary compatibility between the
>> 2.0/3.0 builds and the 4.0 build.
>
> Just want to confirm, do you mean by design, there won't be such backward
> binary compatibility?

OK, up to now I've only been considering test code, for which binary
compatibility doesn't make much sense - a given test assembly always
depends on a particular version of NUnit (currently) and will depend
on a particular version and build (future).

> It may be easier with an example. Let's say I'm writing some reusable test
> library (x.dll), one function is to test any class that implements a given
> interface. That x.dll will be in turn used in various test projects built
> for various .Net frameworks. Further, x.dll builds well with NUnit framework
> for 2.0. Do you mean that I'm now forced to create different builds of x.dll
> for each corresponding NUnit framework builds?

I think it depends on whether your library has a reference to a particular
nunit.framework build. If there's a reference, then I think you would have to
create a different library for each framework you support. OTOH, if the library
were an addin of some sort - something loaded by NUnit rather than the other
way around - that wouldn't be an issue.

Also, I could be wrong about this. It might be possible to give all the
different builds the same identity and trick your library into using the one
that's found in the directory containing the tests. When I get a chance
I'll run some experiments to test this.

Charlie

> Thanks,

Kenneth Xu

unread,
Aug 7, 2011, 8:47:44 PM8/7/11
to nunit-...@googlegroups.com
Hi Charlie,

On Sun, Aug 7, 2011 at 8:16 PM, Charlie Poole <nuni...@gmail.com> wrote:

> I think it depends on whether your library has a reference to a particular
> nunit.framework build. If there's a reference, then I think you would have to
> create a different library for each framework you support.

If you mean the .NET framework, then I don't have to. If you mean the NUnit framework, then it depends on how the NUnit framework is built.

> OTOH, if the library were an addin of some sort - something loaded by NUnit
> rather than the other way around - that wouldn't be an issue.
>
> Also, I could be wrong about this. It might be possible to give all the
> different builds the same identity and trick your library into using the one
> that's found in the directory containing the tests. When I get a chance
> I'll run some experiments to test this.

Yes, that can be done, and that's exactly how the NLog binaries are built. The NLog build for .NET 3.5 is binary backward compatible with the build for .NET 2.0. That's the only way to let assemblies built for various .NET frameworks using NLog work together.

Cheers,
Kenneth

Charlie Poole

unread,
Aug 7, 2011, 11:37:52 PM8/7/11
to nunit-...@googlegroups.com
I'll take a look at how NLog does it.

Charlie
