I love the idea of writing textual feature descriptions, but these
will most likely be at a higher abstraction level than most unit
tests. They will probably be written by a business analyst or product
owner. They just want to describe what the feature should do without
caring about the details. These will almost always be integration
tests, won't they? They will probably need to hit the db to test for
things like "A Product is created" or something like that.
Unit tests, on the other hand, are very detailed and deeper than a
product owner or user story acceptance test needs to go. They also
cannot cross module boundaries and therefore shouldn't hit the
database, for instance.
What I'd like to know is what the workflow looks like for you guys:
when do you write unit tests, and when do you simply satisfy the
acceptance tests in your feature files?
I see it going one of three ways:
- Either you guys write very specific acceptance tests and these
textual tests ARE your unit tests (not real product owner friendly,
but could maybe work if a developer expanded upon what the owner
wrote)
- All your acceptance tests are basically integration tests, and
that's as granular as your testing gets (seems like you'd lose
something important from TDD here)
- You write granular unit tests that help you accomplish or move
towards your acceptance tests, which primarily act as a guide (sounds
the best, but I'd love to see an example because I've seen zero :))
Any help or examples you can provide would be great. For a newcomer
to TDD and BDD, it's difficult to see where they all fit together in
the land of unit and integration tests.
Thanks,
Tim
On Feb 26, 8:36 pm, timhardy <hardy....@gmail.com> wrote:
> I love the idea of writing textual feature descriptions, but these
> will most likely be at a higher abstraction level than most unit
> tests. They will probably be written by a business analyst or product
yes, SpecFlow is meant to be used at the 'analysis' level (although
IMHO the technical personnel should be part of the analysis, so it
will probably be forged out together), not at the unit-test level.
> owner. They just want to describe what the feature should do without
> caring about the details. These will almost always be integration
> tests, won't they? They will probably need to hit the db to test for
> things like "A Product is created" or something like that.
It depends on what you call integration tests. I'd say anything that
tests multiple components working together (even if no 'external'
system like a DB is involved) is an integration test, so in this
sense, the SpecFlow tests will definitely be integration tests in
almost all cases.
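To make that concrete, here's a hypothetical feature file for the "A Product is created" example from the original question (the product names and fields are invented for illustration, not from the thread) - note that nothing in the text itself says whether a real DB is behind it:

```gherkin
Feature: Product management
  As a product owner
  I want to register new products
  So that they can be sold in the shop

Scenario: A Product is created
  Given no product named "Wireless Mouse" exists
  When I create a product named "Wireless Mouse"
  Then a product named "Wireless Mouse" exists in the catalog
```

The same scenario can be bound to the full stack or to the middle layer only; that choice lives entirely in the step bindings, not in the feature file.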
As for whether you'll really touch the DB or just mock it, or drive
the real UI front end through a browser versus only testing the MVC
controllers - I'd say it's up to how you implement the bindings, and
of course it should depend on your application. On larger projects
you may have multiple strategies for different kinds of features: use
the DB for a reporting feature that's mostly implemented in custom
SQL queries; automate the browser to test your new fancy UI with
AJAX/Comet; and maybe keep away from both for the features that are
about business calculations implemented fully in the middle layer.
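As a sketch of that last strategy, here is what a binding that drives the middle layer directly could look like, with the DB replaced by an in-memory fake. Only the SpecFlow attributes ([Binding], [Given], [When], [Then]) and the NUnit assertion are real API; ProductService and InMemoryProductRepository are made-up names for illustration:

```csharp
using System.Collections.Generic;
using NUnit.Framework;
using TechTalk.SpecFlow;

// Hypothetical in-memory stand-in for the persistence layer,
// so the scenario never touches a real database.
public class InMemoryProductRepository
{
    private readonly HashSet<string> _names = new HashSet<string>();
    public void Add(string name) { _names.Add(name); }
    public bool Exists(string name) { return _names.Contains(name); }
}

// Hypothetical middle-layer service under test.
public class ProductService
{
    private readonly InMemoryProductRepository _repository;
    public ProductService(InMemoryProductRepository repository)
    {
        _repository = repository;
    }
    public void CreateProduct(string name) { _repository.Add(name); }
}

[Binding]
public class ProductCreationSteps
{
    private readonly InMemoryProductRepository _repository =
        new InMemoryProductRepository();
    private ProductService _service;

    [Given(@"no product named ""(.*)"" exists")]
    public void GivenNoProductExists(string name)
    {
        // Fresh service over an empty repository
        _service = new ProductService(_repository);
    }

    [When(@"I create a product named ""(.*)""")]
    public void WhenICreateAProduct(string name)
    {
        _service.CreateProduct(name);
    }

    [Then(@"a product named ""(.*)"" exists in the catalog")]
    public void ThenAProductExists(string name)
    {
        Assert.IsTrue(_repository.Exists(name));
    }
}
```

Swapping InMemoryProductRepository for a real repository (or driving the steps through a browser automation tool instead) changes the strategy without touching the feature file.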
> Unit tests, on the other hand, are very detailed and deeper than a
> product owner or user story acceptance test needs to go. They also
> cannot cross module boundaries and therefore shouldn't hit the
> database, for instance.
a unit test, at least as I see it, should not cross _any_ boundaries,
only the interface of the particular unit; otherwise it's more than a
unit you're testing.
> What I'd like to know is what the workflow looks like for you guys:
> when do you write unit tests, and when do you simply satisfy the
> acceptance tests in your feature files?
>
> I see it going one of three ways:
> - Either you guys write very specific acceptance tests and these
> textual tests ARE your unit tests (not real product owner friendly,
> but could maybe work if a developer expanded upon what the owner
> wrote)
> - All your acceptance tests are basically integration tests, and
> that's as granular as your testing gets (seems like you'd lose
> something important from TDD here)
> - You write granular unit tests that help you accomplish or move
> towards your acceptance tests, which primarily act as a guide (sounds
> the best, but I'd love to see an example because I've seen zero :))
Well, I'd say it's a 4th approach I try to follow, basically the
opposite of your last one: since the features (the 'what') we are
going to implement are - in the ideal case :) - defined before
thinking about the code (the 'how') itself can start, the features
can be written in SpecFlow first.
Then writing the code can start (and writing the appropriate unit
tests should be inseparable from that; personally I recommend doing
it really the test-first way), with the aim of filling in the 'gaps'
of the features (the bindings to make the SF tests be 'really' red,
then their implementation to make them green). Originally we imagined
this to be almost always outside-in (so starting from the uppermost
layer the binding connects to and going down through the other layers
until all the tests are fulfilled), although in practice it's not
always like that (whether it's because of lack of discipline, lack of
experience, or just because oversimplified theories rarely work in
reality, I don't have an opinion yet).
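For the unit-test side of that workflow, a plain test-first NUnit example could look like the following. DiscountCalculator is an invented middle-layer class, standing in for whatever code you drive out while making a feature's bindings go green; it's a sketch of the idea, not code from any real project:

```csharp
using NUnit.Framework;

// Hypothetical middle-layer class, driven out test-first
// while filling in the 'gaps' behind a SpecFlow feature.
public class DiscountCalculator
{
    public decimal Apply(decimal unitPrice, int quantity)
    {
        // Business rule: 10% off for bulk orders of ten or more
        decimal total = unitPrice * quantity;
        return quantity >= 10 ? total * 0.9m : total;
    }
}

[TestFixture]
public class DiscountCalculatorTests
{
    [Test]
    public void Bulk_orders_get_ten_percent_off()
    {
        var calculator = new DiscountCalculator();
        Assert.AreEqual(90m, calculator.Apply(10m, 10));
    }

    [Test]
    public void Small_orders_pay_full_price()
    {
        var calculator = new DiscountCalculator();
        Assert.AreEqual(30m, calculator.Apply(10m, 3));
    }
}
```

Tests like these stay entirely within one unit's interface - no DB, no browser - while the SpecFlow scenario above them verifies the components wired together.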
> Any help or examples you can provide would be great. For a newcomer
> to TDD and BDD, it's difficult to see where they all fit together in
> the land of unit and integration tests.
>
> Thanks,
> Tim
br,
sztupi