Meaning of testresults.xml and <testsforfix>

Francisco Servant

Jul 11, 2010, 3:20:10 AM
to iBugs
Hello,

I am using iBugs for a dynamic analysis project.

My problem is that after taking a look at the data provided by iBugs,
I still cannot figure out exactly which tests reproduce a certain bug.

My current guesses are that, for a certain revision:

a) All the test cases that existed in the pre-fix version are listed
in the file "pre-fix/testresults.xml".

b) The exact same list of test cases was executed with the post-fix
version, producing the file "post-fix/testresults.xml".

c) "post-fix/testresults.xml" does not include any of the test cases
that were committed together with the fix.

d) The test cases that were committed with a fix are reported in the
xml tags "<testsforfix>" in repository.xml.

e) The complete list of test cases corresponding to the post-fix
version is: all those in "post-fix/testresults.xml" + the ones
reported in "<testsforfix>".

f) The results of executing the tests in "<testsforfix>" are not
reported by iBugs for either the pre-fix or post-fix version. However,
iBugs allows us to create a script to execute them ourselves.

g) We can assume that the test cases that reproduce a certain bug are
those reported in its "<testsforfix>" tags.

h) However, there is a chance that the fix committed for a bug also
makes some old test cases pass. This is what explains possible
differences between the results reported in "pre-fix/testresults.xml"
and "post-fix/testresults.xml".
Those test cases with different results in the pre-fix and post-fix
revisions might also reproduce a bug.
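As a sketch of the comparison described in (h): tests that fail pre-fix but pass post-fix are candidates for reproducing the bug. The element and attribute names below (`test`, `name`, `result`) are my assumptions about the testresults.xml layout, not confirmed by the dataset:

```python
import xml.etree.ElementTree as ET

def load_results(xml_text):
    """Map each test name to its recorded result ('pass'/'fail').
    Assumes <test name="..." result="..."/> entries; adjust to the
    actual testresults.xml schema."""
    root = ET.fromstring(xml_text)
    return {t.get("name"): t.get("result") for t in root.iter("test")}

def candidate_bug_tests(pre_xml, post_xml):
    """Tests reported failing pre-fix but passing post-fix are
    candidates for reproducing the bug (guess h)."""
    pre, post = load_results(pre_xml), load_results(post_xml)
    return sorted(name for name in pre
                  if pre[name] == "fail" and post.get(name) == "pass")
```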

Could you confirm whether these points are true?

I apologize for the long list of questions :)
Thank you very much!
Francisco

Valentin Dallmeier

Jul 11, 2010, 9:35:37 AM
to ib...@googlegroups.com
Hi again!

> My current guesses are that, for a certain revision:
>
> a) All the test cases that existed in the pre-fix version are listed
> in the file "pre-fix/testresults.xml".

Almost true. pre-fix/testresults.xml contains all tests executable on
my system. Some tests depend on the setup of the system they are run
on. My guess is you should be happy with those tests found in
pre-fix/testresults.xml.

>
> b) The exact same list of test cases was executed with the post-fix
> version, producing the file "post-fix/testresults.xml".

Almost true (see above).

>
> c) "post-fix/testresults.xml" does not include any of the test cases
> that were committed together with the fix.

Wrong.

>
> d) The test cases that were committed with a fix are reported in the
> xml tags "<testsforfix>" in repository.xml.

True.

>
> e) The complete list of test cases corresponding to the post-fix
> version is: all those in "post-fix/testresults.xml" + the ones
> reported in "<testsforfix>".

Wrong (see above).

>
> f) The results of executing the tests in "<testsforfix>" are not
> reported by iBugs for either the pre-fix or post-fix version. However,
> iBugs allows us to create a script to execute them ourselves.

This is a difficult one. When running tests you will notice that not
all of those reported as failing in any testresults.xml actually fail,
and some reported as passing fail. This is due to the setup of the machine
(AspectJ has a hell of a build process. We tried to remove as many
dependencies as possible, but some are still there). My advice: Use
testresults.xml as a list of available tests, then use the ant scripts
to generate each test and see if it fails on your system.
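That advice (treat testresults.xml as a catalogue of available tests and re-verify each result locally) could be sketched like this; again, the `<test name="..." result="..."/>` layout is assumed, not taken from the iBugs documentation:

```python
import xml.etree.ElementTree as ET

def split_by_reported_result(xml_text):
    """Partition the tests in a testresults.xml by their recorded
    result, so each can be regenerated (e.g. via the ant scripts)
    and re-run locally, since results may differ per machine.
    Assumes <test name="..." result="pass|fail"/> entries."""
    passed, failed = [], []
    for t in ET.fromstring(xml_text).iter("test"):
        (failed if t.get("result") == "fail" else passed).append(t.get("name"))
    return passed, failed
```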

> g) We can assume that the test cases that reproduce a certain bug are
> those reported in its "<testsforfix>" tags.

Yes/no. Usually, none of the pre-existing tests find a new bug
(otherwise the developers would have fixed it before). Also, not all
of the associated tests (testsforfix) actually fail. Again, run the
test and see if it fails on your system.

>
> h) However, there is a chance that the fix committed for a bug also
> makes some old test cases pass. This is what explains possible
> differences between the results reported in "pre-fix/testresults.xml"
> and "post-fix/testresults.xml".
> Those test cases with different results in the pre-fix and post-fix
> revisions might also reproduce a bug.

Wrong (see above).

>
> Could you confirm whether these points are true?

I hope that some of my answers are actually helpful.

>
> I apologize for the long list of questions :)

You are very welcome. Again, thank you for your interest in iBUGS and
I hope that it helps you. In any case, please let me know if you have
more questions.

Regards,

Valentin

Francisco Servant

Jul 12, 2010, 4:27:35 PM
to iBugs
Thank you for your answers.

However, there are some things that I still don't quite understand.
Please see my comments below:

On Jul 11, 6:35 am, Valentin Dallmeier <valentin.dallme...@gmail.com>
wrote:
> Hi again!
>
> > My current guesses are that, for a certain revision:
>
> > a) All the test cases that existed in the pre-fix version are listed
> > in the file "pre-fix/testresults.xml".
>
> Almost true. pre-fix/testresults.xml contains all tests executable on
> my system. Some tests depend on the setup of the system they are run
> on. My guess is you should be happy with those tests found in
> pre-fix/testresults.xml.

If these are the tests that were executable on your system, how did
you produce the file "testresults.xml"?
Where can I find a complete list of the tests, whether they were
executable on your system or not?

>
>
>
> > b) The exact same list of test cases was executed with the post-fix
> > version, producing the file "post-fix/testresults.xml".
>
> Almost true (see above).

So, is it possible that for some bug reports the tests in
"<testsforfix>" are also included in "post-fix/testresults.xml"?

Additionally, is it possible that for some bug reports the tests in
"<testsforfix>" are also included in "pre-fix/testresults.xml"?

>
>
>
> > c) "post-fix/testresults.xml" does not include any of the test cases
> > that were committed together with the fix.
>
> Wrong.

I picked 3 example bug reports (29769, 158412, 104218), and in all of
them, "pre-fix/testresults.xml" and "post-fix/testresults.xml"
contained the same list of tests, only with different results.
Also, in all these 3 examples, "post-fix/testresults.xml" did not
contain the tests in <testsforfix>.

Does this mean that in these examples the tests in <testsforfix> were
not executable in your system? Is that the reason why they do not
appear in "post-fix/testresults.xml"?
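A quick consistency check of the kind described above could be sketched as follows. The `<test name="..."/>` layout is assumed; `tests_for_fix` would come from the bug's `<testsforfix>` entries in repository.xml:

```python
import xml.etree.ElementTree as ET

def compare_revisions(pre_xml, post_xml, tests_for_fix):
    """Check (1) whether the pre-fix and post-fix results list the
    same tests and (2) which <testsforfix> tests appear in the
    post-fix results. Assumes <test name="..."/> entries."""
    pre = {t.get("name") for t in ET.fromstring(pre_xml).iter("test")}
    post = {t.get("name") for t in ET.fromstring(post_xml).iter("test")}
    return pre == post, sorted(set(tests_for_fix) & post)
```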

>
>
>
> > d) The test cases that were committed with a fix are reported in the
> > xml tags "<testsforfix>" in repository.xml.
>
> True.
>
>
>
> > e) The complete list of test cases corresponding to the post-fix
> > version is: all those in "post-fix/testresults.xml" + the ones
> > reported in "<testsforfix>".
>
> Wrong (see above).
>
>
>
> > f) The results of executing the tests in "<testsforfix>" are not
> > reported by iBugs for either the pre-fix or post-fix version. However,
> > iBugs allows us to create a script to execute them ourselves.
>
> This is a difficult one. When running tests you will notice that not
> all of those reported as failing in any testresults.xml actually fail,
> and some reported as passing fail. This is due to the setup of the machine
> (AspectJ has a hell of a build process. We tried to remove as many
> dependencies as possible, but some are still there). My advice: Use
> testresults.xml as a list of available tests, then use the ant scripts
> to generate each test and see if it fails on your system.
>
> > g) We can assume that the test cases that reproduce a certain bug are
> > those reported in its "<testsforfix>" tags.
>
> Yes/no. Usually, none of the pre-existing tests find a new bug
> (otherwise the developers would have fixed it before). Also, not all
> of the associated tests (testsforfix) actually fail. Again, run the
> test and see if it fails on your system.

Do you mean that only the test cases in "<testsforfix>" are able to
reproduce the problem?
In such case, they "should" fail in the pre-fix version and pass in
the post-fix version.

However, how can I execute the tests in "<testsforfix>" with the
pre-fix version?
What will happen if I ask iBugs to create the script to run one of
these tests in the pre-fix version? Will it find the test or will I
have to copy them from the post-fix version first?

>
>
>
> > h) However, there is a chance that the fix committed for a bug also
> > makes some old test cases pass. This is what explains possible
> > differences between the results reported in "pre-fix/testresults.xml"
> > and "post-fix/testresults.xml".
> > Those test cases with different results in the pre-fix and post-fix
> > revisions might also reproduce a bug.
>
> Wrong (see above).
>
>
>
> > Could you confirm whether these points are true?
>
> I hope that some of my answers are actually helpful.
>
>
>
> > I apologize for the long list of questions :)
>
> You are very welcome. Again, thank you for your interest in iBUGS and
> I hope that it helps you. In any case, please let me know if you have
> more questions.
>
> Regards,
>
> Valentin

Sorry if my questions are a little confusing. Please let me know if
something is not clear.
Thank you again!
Francisco

Valentin Dallmeier

Jul 15, 2010, 3:02:23 AM
to ib...@googlegroups.com
Hi!

On Mon, Jul 12, 2010 at 10:27 PM, Francisco Servant
<fser...@ics.uci.edu> wrote:
> Thank you for your answers.
>
> However, there are some things that I still don't quite understand.
> Please, see my comments below:
>
> On Jul 11, 6:35 am, Valentin Dallmeier <valentin.dallme...@gmail.com>
> wrote:
>> Hi again!
>>
>> > My current guesses are that, for a certain revision:
>>
>> > a) All the test cases that existed in the pre-fix version are listed
>> > in the file "pre-fix/testresults.xml".
>>
>> Almost true. pre-fix/testresults.xml contains all tests executable on
>> my system. Some tests depend on the setup of the system they are run
>> on. My guess is you should be happy with those tests found in
>> pre-fix/testresults.xml.
>
> If these are the tests that were executable on your system, how did
> you produce the file "testresults.xml"?
> Where can I find a complete list of the tests, whether they were
> executable on your system or not?

Unfortunately, there is no such list. AspectJ uses harness tests
specified in XML files and unit tests in Java files. You'd have to
search all those files to build a complete list. For a subset of the
tests, AspectJ offers Ant targets to execute them.
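A best-effort scan of that kind might look like the sketch below. The `<ajc-test title="...">` element and the `public void testXxx` method pattern are assumptions about how AspectJ declares harness and JUnit tests; verify them against the actual source tree:

```python
import os
import re
import xml.etree.ElementTree as ET

def collect_test_names(root_dir):
    """Best-effort scan for test names: harness tests declared in
    *.xml files (assumed <ajc-test title="..."> entries) and JUnit 3
    style methods (public void testXxx) in *.java files."""
    names = []
    for dirpath, _, files in os.walk(root_dir):
        for fn in files:
            path = os.path.join(dirpath, fn)
            if fn.endswith(".xml"):
                try:
                    for t in ET.parse(path).getroot().iter("ajc-test"):
                        names.append(t.get("title"))
                except ET.ParseError:
                    pass  # not every XML file is a test suite
            elif fn.endswith(".java"):
                with open(path, encoding="utf-8", errors="replace") as f:
                    names += re.findall(r"public\s+void\s+(test\w+)", f.read())
    return sorted(n for n in names if n)
```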

>
>>
>>
>>
>> > b) The exact same list of test cases was executed with the post-fix
>> > version, producing the file "post-fix/testresults.xml".
>>
>> Almost true (see above).
>
> So, is it possible that for some bug reports the tests in
> "<testsforfix>" are also included in "post-fix/testresults.xml"?

Yes.

>
> Additionally, is it possible that for some bug reports the tests in
> "<testsforfix>" are also included in "pre-fix/testresults.xml"?

For most bugs, this should not be the case. In some situations, the
test might have been committed before the fix, in which case the test
is already included in pre-fix/testresults.xml.

>> > g) We can assume that the test cases that reproduce a certain bug are
>> > those reported in its "<testsforfix>" tags.
>>
>> Yes/no. Usually, none of the pre-existing tests find a new bug
>> (otherwise the developers would have fixed it before). Also, not all
>> of the associated tests (testsforfix) actually fail. Again, run the
>> test and see if it fails on your system.
>
> Do you mean that only the test cases in "<testsforfix>" are able to
> reproduce the problem?
> In such case, they "should" fail in the pre-fix version and pass in
> the post-fix version.
>
> However, how can I execute the tests in "<testsforfix>" with the
> pre-fix version?
> What will happen if I ask iBugs to create the script to run one of
> these tests in the pre-fix version? Will it find the test or will I
> have to copy them from the post-fix version first?

You will have to copy them from the post-fix version first. Take a
look at which files were changed/added by the fix. New tests usually
alter files in the test module.
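That copy step could be sketched as follows, assuming you have a list of repository-relative paths the fix touched; filtering on a "tests" path prefix is an assumption about the module layout, not something the dataset guarantees:

```python
import os
import shutil

def copy_fix_tests(post_dir, pre_dir, changed_files, test_prefix="tests"):
    """Copy test files the fix added or changed from the post-fix
    checkout into the pre-fix checkout, so the new tests can be run
    against the pre-fix version. 'changed_files' holds
    repository-relative paths."""
    copied = []
    for rel in changed_files:
        if not rel.startswith(test_prefix):
            continue  # only files under the assumed test module
        src, dst = os.path.join(post_dir, rel), os.path.join(pre_dir, rel)
        if os.path.isfile(src):
            os.makedirs(os.path.dirname(dst) or ".", exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(rel)
    return copied
```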

Regards,

Valentin
