Basically, the application we are testing does not fully dispose of itself, so we all have to share one single instance of it for the entire test run. Yes, our devs should do the work to make everything dispose properly, but that's not going to happen any time soon. Sharing the same instance is really putting a crimp on our test flexibility: it forces us to coordinate between utterly separate test suites and prevents us from writing tests that bring up our app in different configurations.
What would be really great is if we could configure TestNG to run different test suites and/or different test cases in totally different processes (not threads), so that when a test suite (or test case) ends, our application would be fully gone and the next test case could bring up a fresh instance. I.e. I need a test case and our app to run in their own JVM instance that gets destroyed when that test case ends, with the next test case then executed in a new JVM.
And of course I don't want to run each test case separately (through Ant), because we all like the one big report at the end.
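For reference, here is a sketch of how that per-JVM isolation plus a single final report can look under plain Ant with the <junit> task. The question is about TestNG, so this only illustrates the pattern, not a TestNG solution: forkmode="perTest" forks a fresh JVM per test class, and the property names and directories below are placeholders for your own build.

```xml
<!-- Fork a fresh JVM per test class; all XML results land in one directory. -->
<junit fork="true" forkmode="perTest" printsummary="on" haltonfailure="false">
  <classpath refid="test.classpath"/>
  <formatter type="xml"/>
  <batchtest todir="${test.data.dir}">
    <fileset dir="${test.classes.dir}" includes="**/*Test.class"/>
  </batchtest>
</junit>

<!-- One merged report at the end of the whole run. -->
<junitreport todir="${test.reports.dir}">
  <fileset dir="${test.data.dir}" includes="TEST-*.xml"/>
  <report format="frames" todir="${test.reports.dir}/html"/>
</junitreport>
```

Each forked JVM dies when its test class finishes, so the application under test is fully gone between classes, while <junitreport> still produces the one big report at the end.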
---------------------------------------------------------------------
Posted via Jive Forums
http://forums.opensymphony.com/thread.jspa?threadID=64328&messageID=123667#123667
Yeah, the key piece is being able to do aggregate reporting:
basically pointing TestNG at an existing directory of reports and
telling it to just add stuff, not overwrite.
It also allows for teams to run their own tests and aggregate into a
final report.
I work on the problem of deploying distributed systems, and one of my
part-time activities is to do large-scale distribution of tests,
running them against databases and applications you've deployed:
http://smartfrog.org/presentations/distributed_testing_with_smartfrog_slides.pdf
So yes, this kind of thing is doable. And as Hani says, the problem is
the reports.
My current prototype streams out XHTML files, flushing them as it goes,
so you can see the per-test results. And I have the stubs of the
(remote) aggregator to create the index/summary stuff, but haven't
implemented it. So right now my reports not only look worse than the
junitreport content, they aren't as easy to navigate.
The GridUnit people, http://gridunit.sourceforge.net/, have done some
nice stuff with batch runs of JUnit tests; they aggregate results so you
can see which tests work on all machines, which work on some, and which
are still ongoing. It's probably the best real-time display of test runs
I've seen, and one that scales up to thousands of machines.
Now, putting all that aside, one little thing I have to choreograph
the functional testing of SmartFrog (and its test suite) is a new Ant
task: <functionaltest>.
This task has the following workflow:
- starts the <application> sequence in one thread; if you want to fork an
app, start it with <java fork="true">
- runs a probe, an Ant condition, until the startup timeout says it's too late
- runs the <test> sequence, which can be <junit>, <testng>, or anything else
- after the tests finish, runs the <teardown> sequence (it's a kind of finally)
- does some brute-force emergency termination if the teardown didn't
stop the application
You can use this to start any program, spin until any Ant condition
implies it is running, then run the unit tests, and finally clean up.
This is me using it to test our program:
<functionaltest testTimeout="600" shutdownTime="20">
  <application>
    <start-logging-daemon spawn="false"/>
  </application>
  <probe>
    <socket port="${smartfrog.daemon.port}" server="localhost"/>
  </probe>
  <test>
    <junit-system-test-run/>
    <fail if="test.failed">Junit failed</fail>
  </test>
  <teardown>
    <sf-stopdaemon failonerror="false"/>
    <sf-junitreport data="${test.data.dir}"
                    reports="${test.reports.dir}"/>
  </teardown>
</functionaltest>
This task may meet your needs. If you are interested, contact me
directly and I will show you how to use it (it's in the LGPL-licensed
smartfrog.org Ant tasks JAR, and is somewhat under-documented as it was
primarily for internal use).
As for merging reports under Ant: you can run any number of test runs
that generate XML result files into the same directory, as long as the
files have different names. <junitreport> will merge them.
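To make that concrete, a minimal sketch of the merge step (the property and directory names are placeholders): several independent runs write their TEST-*.xml files, and one <junitreport> picks them all up into a single report:

```xml
<junitreport todir="${test.reports.dir}">
  <!-- Results from separate suites/runs; the XML filenames must not collide. -->
  <fileset dir="${suiteA.results.dir}" includes="TEST-*.xml"/>
  <fileset dir="${suiteB.results.dir}" includes="TEST-*.xml"/>
  <report format="frames" todir="${test.reports.dir}/html"/>
</junitreport>
```

This is also how separate teams can run their tests independently and still aggregate into one final report, as long as each run's result files keep distinct names.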
-steve