Unit Test - Ideas / Thoughts


James Buckingham

Mar 17, 2011, 10:41:01 AM
to valida...@googlegroups.com
Hi group,

Hope you don't mind but I'm going to throw a stone in the pond, relating to Unit Testing Validation scripts, and see what kind of ripples I get back :-)

----

Our current setup

At the moment we've got a pile of unit tests for our validation. It's a 1-to-1 setup: one TestSuite for one VT script against one object.

The approach I've taken to date has been to identify each context within the VT script and then write my own tests to simulate the object going through them.

I've written my own MXUnit Custom Assertion, appropriately named assertFailedValidation, that checks the following:-

That it actually failed
The expected number of errors
The error messages

This is all fine, but I'm starting to feel that I'm doing the same thing over and over again for each one, and the smallest change to these VT scripts means I've got to recode the tests as well.

So I'm at a stage now of thinking - there has got to be a better way!

----

Food for Thought

I guess the first question is - how are other people doing their tests? Do you take the same approach or something a bit more funky?

A different approach I'm thinking about just now is to create some kind of "Base" test suite. Off the top of my head this could work like so:-

1) Give the base test(s) the name of the validation script I want to run and an object to run it against
2) The base looks at the file and works out all the contexts within it ( so identifying paths ). The base then begins testing by looping over those contexts ( and the "all" context as well, I guess )
3) For each property it identifies the rules that need to be run, and within this test suite I set up unit tests for those rules.

So the idea would be that I can take AN object, A VT script and apply standard Unit Tests for each rule against that object.

I'm not quite sure though how I would handle exceptions. For example, I've got certain parts where I need to decorate the object and set up some custom query stuff, so I would need to somehow exclude certain rules from the standard tests.

----

Anyway, I think that's enough to mention just now :-)

Cheers,
James

Bob Silverberg

Mar 17, 2011, 11:38:54 AM
to valida...@googlegroups.com
Great questions James. I recall that Marc Esher was doing something
using dataproviders where he would set up data for a number of objects
that should pass and a number of objects that should fail, and then
write tests to assert that each object did in fact pass/fail (using
the dataproviders). I'm not sure if he was checking for actual
failure messages, or just using sample data to check for pass/fail. I
never saw the actual tests, only heard from Marc about it. So...

Marc, would it be possible for you to describe a bit better than I
just did what you were doing and how it worked, and perhaps provide
some code samples? I, James, and I'm sure others would greatly
appreciate it.

Cheers,
Bob


--
Bob Silverberg
www.silverwareconsulting.com

Marc Esher

Mar 17, 2011, 1:16:42 PM
to valida...@googlegroups.com, Bob Silverberg
Dataproviders are gold for this. I wrote a few assertions (attached)
which I use for more easily testing validation success/failure as
well.

My approach is generally to set up a query, or an array of structs,
which contains the input data, the expected validation result
(pass/fail), the expected failure field/fields if i expect a failure,
and even an expected failure message if I expect a failure. Something
like:

private function createValidationFailuresDataProvider(){
    variables.validationFailuresDataProvider = [
        { firstname="", lastname="", username="", birthday="marc",
            expectfailure=true,
            failurefields=["firstname", "lastname", "username", "birthday"],
            failurefieldtypes=["required","required","required","date"] }
        ,{ firstname=repeatString("a", 1000), lastname=repeatString("a", 1000),
            username=repeatString("a", 1000), birthday=5,
            expectfailure=true,
            failurefields=["firstname", "lastname", "username", "birthday"],
            failurefieldtypes=["required","required","required","date"] }
    ];
}

I add to that failure data provider as I find or think of more
conditions to test against.

and then the test will be something like:

createValidationFailuresDataProvider();

/**
* @mxunit:dataprovider validationFailuresDataProvider
*/
function validateUser_should_validate_all_expected_conditions( condition ){
    var user = createUser( condition.firstname, condition.lastname,
        condition.username, condition.birthday );
    var result = validateThis.validate( user, "User" );
    var property = "";
    for( property in condition.failurefields ){
        assertValidationFailureExists(....);
    }
}

You could even get better (I'm lame, so I didn't) and write an
assertion function that took all the failure fields, failure field
types, and failure messages and did the looping for you, so it'd look
like:

var result = ....;
assertAllValidationFailuresExist( result.getFailures(),
    condition.failurefields, condition.failurefieldtypes,
    condition.failureMessages );


I do the same for non-failures, as well. So I'll create an array of
structs of known *good* data, call it something like
validationPassesDataProvider, and write a test that ensures those
conditions do not fail.

What I love about dataprovider-driven tests is that when you think of
new conditions, or perhaps business logic changes, you simply change
or add to your dataprovider and the test just does what it does. No
need to write extra assertions.
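To show the shape of a dataprovider-driven test outside of CFML/MXUnit, here's a minimal sketch in Python. The `validate_user` function and its rule set are invented stand-ins for whatever the real framework does; only the shape of the test matters: the cases live in data, and one test body loops over them.

```python
def validate_user(user):
    """Toy validator: returns a dict of field -> failure type."""
    failures = {}
    for field in ("firstname", "lastname", "username"):
        if not user.get(field):
            failures[field] = "required"
    if not str(user.get("birthday", "")).isdigit():
        failures["birthday"] = "date"
    return failures

# The "dataproviders": each case states the input AND the expected outcome.
failure_cases = [
    {"input": {"firstname": "", "lastname": "", "username": "",
               "birthday": "marc"},
     "expect_failure": True,
     "failure_fields": {"firstname": "required", "lastname": "required",
                        "username": "required", "birthday": "date"}},
]
pass_cases = [
    {"input": {"firstname": "Jo", "lastname": "Bloggs",
               "username": "jbloggs", "birthday": "19800101"},
     "expect_failure": False},
]

def run_provider(cases):
    # One test body, many cases: new conditions mean new data, not new code.
    for case in cases:
        failures = validate_user(case["input"])
        if case["expect_failure"]:
            assert failures == case["failure_fields"], failures
        else:
            assert failures == {}, failures

run_provider(failure_cases)
run_provider(pass_cases)
```

Adding a new business rule then means adding a row to `failure_cases` or `pass_cases` and, if needed, extending the validator until the whole provider passes again.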

Marc

ValidationAssertions.cfc

John Whish

Mar 17, 2011, 1:26:10 PM
to valida...@googlegroups.com
Nice Marc, thanks for posting.

Have you got a Matrix-style brain upgrade I could use? There's so
much good stuff to learn.

- John

Marc Esher

Mar 17, 2011, 1:44:03 PM
to valida...@googlegroups.com, John Whish
On Thu, Mar 17, 2011 at 1:26 PM, John Whish <john....@googlemail.com> wrote:
> Nice Marc, thanks for posting.
>
> Have you got a Matrix style brain upgrade I could use - there's so
> much good stuff to learn.

I wish! I tend to repeat this mantra to myself whenever I get in a
"there's too much to learn" funk: "Step by Step, up toward the
mountaintop!"

James Buckingham

Mar 17, 2011, 6:33:31 PM
to valida...@googlegroups.com, John Whish
Thanks a lot for the input Marc, although I can't seem to see the attachment :-|. I don't know if it's something to do with the new version of Google Groups. Either that or I'm going blind ( more likely! ).

Can definitely see the benefit in using a DataProvider - quite like that approach actually :-) - I'll go and have a play with some simple tests and build up from there.

---

I'm just wondering though if there is a way of taking things one step further and actually getting the tests to read the VT XML, identify the rules being set against a property, and then set that property up with typical scenarios.

Things like conditions and params, though - I'm not sure how those would be managed. It could get messy.

I'm kind of "typing aloud" on that one but maybe this is an opportunity to create some common tests for users to run against their work? A potential "extra" for VT.

Cheers,
James

Bob Silverberg

Mar 17, 2011, 6:43:16 PM
to valida...@googlegroups.com
On Thu, Mar 17, 2011 at 6:33 PM, James Buckingham <clar...@gmail.com> wrote:
> Thanks a lot for the input Marc, although I can't seem to see the attachment
> :-|. Don't know if it's something about the new version of Google Groups.
> Either that or I'm going blind ( more likely! ).
> Can definitely see the benefit in using a DataProvider - quite like that
> approach actually :-) - I'll go and have a play with some simple tests and
> build up from there.

Thanks indeed Marc. I'll admit that I haven't peeked inside the
attachment yet, but I _can_ see it in gmail. This is something that
would be great to add to the VT wiki. I'll put that on my todo list.

> ---
> I'm just wondering though if there is a way of taking things one step
> further and actually getting the tests to read the VT XML, identify the
> rules being set against a property and then setting that property up with
> typical scenarios.
> Things like conditions and params though I'm not sure how that would be
> managed. Could get messy.
> I'm kind of "typing aloud" on that one but maybe this is an opportunity to
> create some common tests for users to run against their work? A potential
> "extra" for VT.

I'm not sure that makes sense, James. If you did that all you'd really
be testing is the framework, and it already has its own suite of
tests. I'm not claiming that it's infallible, but when you use a fw
like VT you generally assume that it works as advertised.

I think what you need to test is _your_ configuration, not the fw.
That necessarily means that you have to create the tests to indicate
what should be valid and what should not. If you wrote something to
inspect the VT xml and then generate tests you wouldn't really be
testing your configuration at all.

Does that make sense?

Cheers,
Bob

> Cheers,
> James
>


--
Bob Silverberg
www.silverwareconsulting.com

James Buckingham

Mar 17, 2011, 7:03:48 PM
to valida...@googlegroups.com

> I'm not sure that makes sense, James. If you did that all you'd really
> be testing is the framework, and it already has its own suite of
> tests. I'm not claiming that it's infallible, but when you use a fw
> like VT you generally assume that it works as advertised.

Absolutely. I was questioning myself about exactly the same thing going home on the bus tonight :-)

> I think what you need to test is _your_ configuration, not the fw.
> That necessarily means that you have to create the tests to indicate
> what should be valid and what should not. If you wrote something to
> inspect the VT xml and then generate tests you wouldn't really be
> testing your configuration at all.
>
> Does that make sense?


Yeap it does ;-), I have to disagree that I'm testing the framework though. The UTs in VT, I'm guessing, are testing the rules in isolation? So they're not testing them with my objects and my combination of rules?

I'm suggesting that I take AN object and my project's VT script and run its properties through a set of common tests. In terms of simple rules like "required" it's fairly black and white: test for it being there / not being there.

For things like max / min I'm assuming that my tests would need to read the parameters off the XML and then setup the object properties for those tests.

I don't know if I'm trying to crack a nut with a sledgehammer here. I'm just seeing, even with Marc's similar but more streamlined solution, that developers using VT are potentially going to be building the same kinds of unit tests as us.

Why reinvent the wheel :-)

Cheers,
James

Marc Esher

Mar 17, 2011, 11:59:20 PM
to valida...@googlegroups.com, James Buckingham
First, you guys are fantastic. This is a quite solid conversation
amongst really smart folk and I'm proud to be a part of it. I wish the
rest of the CF world would peep in on this stuff from time to time.

I have a *very strong opinion* on this one, as a guy who's written too
damn many unit tests. Normally I'm a "strong opinions, weakly held"
dude.

Here's an exception.

I believe it is absolutely wrong to automate this kind of testing by
parsing the guts of the underlying framework in order to prove that
something works correctly.

When you're testing validations on an object, you are stating to the
world -- and by 'world' I mean "the world in which your Object lives"
-- that "These conditions are what I, the test writer, declare as the
conditions under which validation MUST fail." This is a very
deliberate statement. In TDD, this would happen before you've written
a single line of production code. When using VT, this happens even
before you've written a rules file for this object.

In the real world, this means that you're writing a test, perhaps with
a dataprovider which describes a bunch of failed validations, which
can only possibly pass if your object validates for all those cases.
If it fails for any of those tests, then your validation is
incomplete.

The only possible way to do this with integrity is to write the
*expectation* manually. Or, perhaps more accurately, the thing that
describes your *expectations* MUST be independent of the underlying
mechanism for validating. If you say "these 20 conditions are failure
conditions, for these reasons", then you can only do that manually (or
with a tiny bit of code that generates those conditions). But those
conditions are framework-independent.

blah blah blah. What this means in real life is that if you write a
test which parses the validation XML and then creates validation
conditions based on that, you are only testing the validation rules
you've described. You are absolutely not testing what you, the
programmer of the unit test, have indicated is the "validation story"
for your object. Once you write your automation thing which parses VT
and proves that your validations work, it will *always* be correct.

This is a very bad thing.

Imagine:

you have a user. you write a "User.xml" file with a single rule, a
"required" on "firstName". and you run your test, which parses your
User.xml and creates failure cases for that file. Well, that automated
test thing only cares about "firstName", since that's the only rule in
the file. so your automated test thing creates conditions that are
guaranteed to fail for "first name required", and "first name less
than 20 chars", or whatever. but it can't possibly flag "lastName is
required", because that's not in your User.xml file yet.

What you end up with is a test which generates "fuzz" data -- username
empty, username less than minChars, username GT maxChars -- but that's
it! It can't possibly generate meaningful tests for the other
properties because those aren't described -- YET -- in the User.xml
file.

So your test, then, depends on the state of your user.xml file. Let's
say that your system expects that a user's last name is required. And
your VT user.xml file doesn't have that rule.

Your test will pass! How can it not?

BUT if you have an array of structs which describes your failure
conditions, your tests will only pass when your validation code -- in
this case your User.xml file -- completely declares your validations.

This means that the job of your tests is to prove that your validation
routine -- whatever that may be -- correctly validates your object.
But your tests cannot possibly do that job if they rely on the
underlying framework to provide them with the information they need to
perform those assertions.

Put another way: your tests prove your validation story. VT implements
your validation story. Your tests should *only ever pass* when VT
fully implements that validation story, and not before. And in this
sample case, where your validation story is "firstname is required,
lastname is required, username is required and can't be a duplicate
and can't contain the firstname or lastname and can't contain the word
'unicorns'", then a user.xml file whose only rule is "username is
required" should not pass that test!

But if you're using the user.xml to drive your test -- i.e. your
implementation drives your *proof* -- then that test will pass if you
automate it by parsing the implementation. And that is clearly not
correct.

Here's an image I keep in my head when I'm writing validation tests.
And I consider validation testing extremely important. I picture my
tests as cruel taskmasters, with a whip in hand, cracking my system's
ass whenever it doesn't work. And it's gonna crack ass until I get
that user.xml file in a state that it *serves its master*. The tests
are the master, because that's the only place in the system that knows
with certainty what exactly constitutes validity.

It's a crude analogy, but it suffices: Tests prove what's correct.
Implementation wants to be correct, but is only deemed so when tests
put the whip back and say "You, implementation, have done your job. Ye
shall be spared tonight".

Don't skimp here. Don't shortcut. If it helps, create a spreadsheet
with the fields you expect to be valid. Add rows for all manner of
conditions you expect to fail. Give that to your business folks and
have them fill it in. Then use that spreadsheet as your dataprovider
-- MXUnit has a "file" dataprovider born for cases like this.

When all rows in that spreadsheet pass validation, based on your
implementation, then you've done your job.
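For what it's worth, the spreadsheet-as-dataprovider idea can be sketched like this (Python, with the "spreadsheet" inlined as a CSV string so the example is self-contained; the validator is a toy stand-in, and with MXUnit you'd point the file dataprovider at the real spreadsheet instead):

```python
import csv
import io

# Inlined stand-in for the spreadsheet the business folks filled in.
# Each row gives the input fields plus the expected outcome.
SPREADSHEET = """firstname,lastname,expect_valid
Jo,Bloggs,yes
,Bloggs,no
Jo,,no
"""

def validate(row):
    """Toy validator: firstname and lastname are both required."""
    return bool(row["firstname"]) and bool(row["lastname"])

results = []
for row in csv.DictReader(io.StringIO(SPREADSHEET)):
    expected = (row["expect_valid"] == "yes")
    actual = validate(row)
    results.append(actual == expected)
    # The test is the taskmaster: it fails until the implementation
    # agrees with every row the business signed off on.
    assert actual == expected, f"row {row} disagrees with implementation"
```

The point of keeping the expectations in a file is exactly Marc's: the rows declare the validation story independently of whatever framework ends up implementing it.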

Best,

marc

Matt Quackenbush

Mar 18, 2011, 12:15:04 AM
to valida...@googlegroups.com
Marc, I have read countless posts over the years.  I have read who knows how many books.  I have read all kinds of things (many written by you!) about testing.  That, sir, is one of the greatest posts I have ever read.  *THAT* post, _really_ drove the point home and made it crystal clear in my head.  THANK YOU for taking the time to add an exception to your "weakly held" rule and writing it!  :-)

James Buckingham

Mar 18, 2011, 6:05:26 AM
to valida...@googlegroups.com
Cheers dude for the HUGE input :-), definitely a ripple coming back there that's worth some serious thinking.

My initial thoughts are - the brain of Marc is a wonderfully weird but somewhat scary place.

Did someone just hear the sound of a cracking whip there?!? 

....probably my imagination :-)

Cheers,
James

Jason Seminara

Apr 11, 2011, 5:51:33 PM
to ValidateThis
I agree! Amazingly concise post. I wrote some similar tests that
verify my custom VT validation types, but didn't know about
dataproviders until now. My tests are primarily concerned with testing
the custom validators, but I don't see why they can't be used in this
same way. This addition will probably drastically change my
implementation. I'll try to post my code for review when I clean it up
enough to be presentable :)

Thanks so much for the post.
-Jason

Marc Esher

Apr 11, 2011, 10:45:21 PM
to valida...@googlegroups.com

Glad to contribute, gents.

One thing: a case where automation would help here is a CFBuilder
extension... in my experience probably 60-70% of validation rules
could be derived, or at least well-started, by introspecting the
database and looking at the columns. I could see a CFB extension where
you right click on a table in the RDS View, select "Create
ValidateThis file", and have it spit out a VT rules file that would
create, at a minimum, the "required" rules, maxlength rules, and
perhaps some rules around numeric data.
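A rough sketch of the kind of derivation Marc describes (Python; the column metadata tuples and the rule dicts are invented for illustration -- a real extension would introspect the database over RDS and emit VT's actual rules XML):

```python
# Sketch: derive starter validation rules from database column metadata.
# Column metadata as (name, sql_type, nullable, max_length) tuples,
# standing in for what INFORMATION_SCHEMA / RDS would report.
columns = [
    ("firstname", "varchar", False, 50),
    ("lastname",  "varchar", False, 50),
    ("age",       "int",     True,  None),
]

def derive_rules(cols):
    """Map column constraints onto starter validation rules."""
    rules = []
    for name, sql_type, nullable, max_length in cols:
        if not nullable:
            rules.append({"property": name, "type": "required"})
        if max_length is not None:
            rules.append({"property": name, "type": "maxlength",
                          "max": max_length})
        if sql_type == "int":
            rules.append({"property": name, "type": "integer"})
    return rules

for rule in derive_rules(columns):
    print(rule)
```

As Marc says, this would only be a well-started draft: the generated rules still have to be reviewed against the real validation story, not treated as its proof.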

Best,

Marc

> -Jason
