Can TestNG run tests in separate Processes (not threads)?

Curtis Siemens

Feb 13, 2007, 2:17:48 PM
to testng...@googlegroups.com
Hi, I'm an SDET at my company, and there are a few SDETs like me creating different test suites; we are running into a problem with TestNG.

Basically, the application we are testing does not dispose of everything properly, so we all have to share a single instance of it for the entire test run. Yes, our devs should do the work to make everything dispose properly, but that's not going to happen any time soon. Sharing the same instance is really putting a crimp on our test flexibility: it forces us to coordinate between otherwise separate test suites and keeps us from writing tests that bring up our app in different configurations.

What would be really great is if we could configure TestNG to run different test suites and/or test cases in totally different processes (not threads), so that when a test suite (or test case) ends, our application would be fully gone and the next test case could bring up a new instance of it. In other words, I need each test case and our app to run in their own JVM instance that is destroyed when the test case ends, with the next test case then executed in a new JVM.
And of course I don't want to run each test case separately (through Ant), because we all like the one big report at the end.
---------------------------------------------------------------------
Posted via Jive Forums
http://forums.opensymphony.com/thread.jspa?threadID=64328&messageID=123667#123667

Cédric Beust ♔

Feb 13, 2007, 2:24:04 PM
to testng...@googlegroups.com
Hi Curtis,

I wrote a prototype to do this a while ago in a project called Distributed TestNG, which I documented here.

That was over a year ago and I haven't worked on it since, but the proof of concept showed that it was fairly easy to distribute TestNG processes across several machines, send test requests to clients, and get test results back (including the consolidation of the reports that you mention).

The good news is that TestNG definitely supports this, but the bad news is that you will probably have to roll your own implementation.

I was also told recently that a Gigaspaces-backed implementation of Distributed TestNG is currently in the works, but I don't have any more details about it at this point...

Finally, just to make sure you are aware: you can invoke TestNG from Ant in a different JVM (fork="yes"). Not sure whether it would help you, but I thought I'd mention it...
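For illustration, here is a rough, untested sketch using Ant's generic <java> task rather than the <testng> task (the test.classpath reference and the suite file name are placeholders):

<java classname="org.testng.TestNG" fork="true" failonerror="true"
      classpathref="test.classpath">
  <!-- run the suite in its own JVM; anything the app leaks dies with the JVM -->
  <arg value="testng.xml"/>
</java>

Each invocation like this gets a fresh JVM, so the application is fully gone when the run ends.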

--
Cédric

Hani Suleiman

Feb 13, 2007, 2:26:41 PM
to testng...@googlegroups.com
Yeah, the key piece is being able to do aggregate reporting,
basically pointing tng at an existing dir of reports and telling it
to just add stuff, not overwrite.

Cédric Beust ♔

Feb 13, 2007, 2:33:18 PM
to testng...@googlegroups.com
On 2/13/07, Hani Suleiman <ha...@formicary.net> wrote:

> Yeah, the key piece is being able to do aggregate reporting,
> basically pointing tng at an existing dir of reports and telling it
> to just add stuff, not overwrite.

The aggregation is a bit more complicated than that (well, simpler actually).

In Distributed mode, the main TestNG process receives various ITestResult reports and the HTML generation must happen once it has received them all. 

But I'm curious, what scenario do you have in mind for this "append mode" that you speak of?

--
Cédric

Hani Suleiman

Feb 13, 2007, 2:40:43 PM
to testng...@googlegroups.com
Once that's done, then doing stuff like forking a Java process to
run each test in its own JVM becomes an Ant task implementation.

It also allows teams to run their own tests and aggregate them into a
final report.
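Something along those lines could be wrapped up today with a macrodef (untested sketch; test.classpath, report.dir and the suite files are placeholders, and each suite writes to its own output directory for now, since TestNG would otherwise overwrite the previous report):

<macrodef name="forked-suite">
  <attribute name="name"/>
  <attribute name="suitefile"/>
  <sequential>
    <!-- each suite runs in its own JVM; -d points the reports at a per-suite directory -->
    <java classname="org.testng.TestNG" fork="true" failonerror="false"
          classpathref="test.classpath">
      <arg value="-d"/>
      <arg value="${report.dir}/@{name}"/>
      <arg value="@{suitefile}"/>
    </java>
  </sequential>
</macrodef>

<forked-suite name="team-a" suitefile="team-a-suite.xml"/>
<forked-suite name="team-b" suitefile="team-b-suite.xml"/>

The append/aggregate mode would then only need to merge those per-suite directories into the final report.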

Curtis Siemens

Feb 13, 2007, 2:55:11 PM
to testng...@googlegroups.com
Yes, this is the kind of solution that would work.

---------------------------------------------------------------------
Posted via Jive Forums
http://forums.opensymphony.com/thread.jspa?threadID=64328&messageID=123680#123680

Steve Loughran

Feb 13, 2007, 4:10:57 PM
to testng...@googlegroups.com
On 13/02/07, Hani Suleiman <ha...@formicary.net> wrote:
>
> Yeah, the key piece is being able to do aggregate reporting,
> basically pointing tng at an existing dir of reports and telling it
> to just add stuff, not overwrite.
>


I work on the problem of deploying distributed systems, and one of my
part-time activities is large-scale distribution of tests,
running them against databases and applications you've deployed:
http://smartfrog.org/presentations/distributed_testing_with_smartfrog_slides.pdf

So yes, this kind of thing is doable. And as Hani says, the problem is
the reports.

My current prototype streams out XHTML files, flushing them as it goes,
so you can see the per-test results. I have the stubs of the
(remote) aggregator to create the index/summary pages, but I haven't
implemented it yet. So right now my reports not only look worse than the
junitreport content, they aren't as easy to navigate.

The GridUnit people, http://gridunit.sourceforge.net/, have done some
nice stuff with batch runs of JUnit tests; they aggregate results so you
can see which tests work on all machines, which work on some, and which are
still ongoing. It's probably the best real-time display of test runs
I've seen, and one that scales up to thousands of machines.

Now, putting all that aside, one little thing I have to choreograph
the functional testing of SmartFrog (and its test suite) is a new Ant
task, <functionaltest>.

This task has the following workflow:
- starts an application <sequence> in one thread; if you want to fork an
app, start it with <java fork="true">
- runs a probe, a condition, until the startup timeout says it's too late
- runs the <test> sequence, which can be <junit>, <testng>, or anything else
- after the tests finish, runs the <teardown> sequence (it's a kind of finally)
- does some brute-force emergency termination if the teardown didn't
stop the application

You can use this to start any program, spin until any Ant condition
implies it is running, run the unit tests, and finally clean up.

This is me using it to test our program:

<functionaltest testTimeout="600" shutdownTime="20">
  <application>
    <start-logging-daemon spawn="false"/>
  </application>
  <probe>
    <socket port="${smartfrog.daemon.port}" server="localhost"/>
  </probe>
  <test>
    <junit-system-test-run/>
    <fail if="test.failed">Junit failed</fail>
  </test>
  <teardown>
    <sf-stopdaemon failonerror="false"/>
    <sf-junitreport data="${test.data.dir}"
                    reports="${test.reports.dir}"/>
  </teardown>
</functionaltest>


This task may meet your needs. If you are interested, contact me
directly and I will show you how to use it (it's in the LGPL-licensed
smartfrog.org Ant tasks JAR, and is somewhat underdocumented as it was
primarily for internal use).

As for merging reports under Ant: you can run any number of test runs
that generate XML result files into the same directory, as long as the
files have different names; <junitreport> will merge them.
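For example, reusing the property names from the snippet above (a sketch; it assumes the test runs emit JUnit-format TEST-*.xml files into ${test.data.dir}):

<junitreport todir="${test.reports.dir}">
  <fileset dir="${test.data.dir}" includes="TEST-*.xml"/>
  <!-- one merged, framed HTML report over everything found in the fileset -->
  <report format="frames" todir="${test.reports.dir}/html"/>
</junitreport>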

-steve

Curtis Siemens

Mar 7, 2007, 8:15:26 PM
to testng...@googlegroups.com
This looked like a fair amount of work (we're under extreme time pressure these days), so instead I managed to get our production code to dispose properly of all beans/objects. Now we can run multiple test suites and test cases that each create and destroy ALL of the production objects they use, so I no longer have any conflicts or a need for this solution. Thanks for the advice.

---------------------------------------------------------------------
Posted via Jive Forums
http://forums.opensymphony.com/thread.jspa?threadID=64328&messageID=130424#130424
