JUnit is a testing framework for Java, mostly focused on unit testing.
It is also used for writing integration and acceptance tests, together with other libraries such as Selenium.
JUnit has been widely adopted by the Java community and, as a result, its XML test result reports have become a de facto standard for test result reporting.
JUnit Basic Concepts
In JUnit, you have Tests and (Test) Suites. A Suite is a way of aggregating a group of Tests together, along with their results. This applies not just to the original Java JUnit but also to other implementations that generate the JUnit XML report.
In Java, Tests are created within a Test Case class and implemented as properly annotated class methods.
The Test Case classes may be grouped in Test Suites.
JUnit provides many more concepts (Test Runners, Test Fixtures, Categories, etc.), although these are not relevant in this context.
Importing JUnit XML reports
Xray supports JUnit XML imports; you can read more about them here.
Below is a simplified example of a JUnit XML report containing a Test Suite with one Test Case.
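A minimal report of this kind might look as follows (a sketch; the suite name, class name, and test name are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Simplified JUnit XML report: one Test Suite containing one Test Case -->
<testsuite name="CalculatorTests" tests="1" failures="0" errors="0" skipped="0">
  <testcase classname="com.example.CalculatorTests" name="testAddition" time="0.01"/>
</testsuite>
```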
The simplified tag hierarchy of these reports can be represented in the following diagram:
JUnit’s Test Cases are identified by the pair of “classname” and “name” attributes.
Test Cases are imported to Xray’s Generic Test issues; the “classname” and “name” attributes are concatenated and mapped to the Generic Test Definition field of the Generic Test.
If a Test already exists with the same Generic Test Definition, then it is not created again.
The summary of each Test issue will be based on the "name" attribute of the "testcase" element.
Test Cases are imported to a new (or user-specified) Test Execution in the context of some project, along with their respective execution results.
JUnit’s Test Suites are not mapped to any special entity. However, the execution details screen will show the Test Suite related to a specific test result.
The status of the Test Run will be set based on the Test Case result:
- with failures (i.e., the <testcase> element contains an inner <failure> element; the element's text content will be shown in the message)
- with errors (i.e., the <testcase> element contains an inner <error> element; the element's text content will be shown in the message)
- skipped (i.e., the <testcase> element contains an inner <skipped> element; the element's text content will be shown in the message)
- without failures or errors, and not skipped
Note: Test Cases with the status FAIL may have an error/failure message which can be seen in the Test Run screen, under the Results section.
If the same Test Case has been executed on multiple Test Suites, then the result for each Test Suite will be shown.
When a Test Case is executed in multiple Test Suites, the overall status of the Test Run will be calculated by combining the results obtained in each Test Suite.
Overall status of the Test Run
- PASS, if all the mapped results of the Test Case were PASS
- FAIL, if any of the mapped results of the Test Case was FAIL
Notes and Limitations
- attachments (e.g., screenshots and other files) are not supported/imported, as they are not embedded in the XML report; it is possible to add references to their local paths in the <system-out/> element, but these cannot be imported since they are external to the report
Xray extended JUnit format
Test issue id/key
Two scenarios are supported for specifying an existing Test to which the JUnit test case should be imported:
- A Test issue id is passed as a test_id property on the testcase element.
- A Test issue key is passed as a test_key property on the testcase element.
If both properties exist in a testcase element, test_id will be used. If the given Test issue id or key does not exist, an error will be thrown.
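Assuming these properties are provided inside the testcase element's properties block, this might look as follows (a sketch; the issue key is hypothetical, and a testcase would normally carry only one of the two properties):

```xml
<testcase classname="com.example.CalculatorTests" name="testAddition">
  <properties>
    <!-- import the result to an existing Test, referenced by issue key -->
    <property name="test_key" value="CALC-123"/>
  </properties>
</testcase>
```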
Three scenarios are supported to link a test with requirements:
- A requirement key is passed as a requirement attribute on the testcase element.
- A requirement key is passed on a requirement property element inside the testcase element.
- Multiple requirement keys are passed on a requirements property element inside the testcase element, separated by "," (comma).
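The three variants might look as follows (a sketch; the requirement keys are hypothetical, and each testcase would normally use only one variant):

```xml
<!-- variant 1: requirement key as an attribute -->
<testcase classname="com.example.CalculatorTests" name="testAddition" requirement="CALC-10"/>

<!-- variants 2 and 3: requirement key(s) as property elements -->
<testcase classname="com.example.CalculatorTests" name="testSubtraction">
  <properties>
    <property name="requirement" value="CALC-10"/>
    <property name="requirements" value="CALC-10,CALC-11"/>
  </properties>
</testcase>
```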
Test summary
A test_summary property element can be included inside the testcase element to explicitly set the issue summary. This summary will be used both to create and to update the test.
If importing to a new test and the summary is not explicitly defined, it will default to the name attribute of the testcase element.
Test description
A test_description property element can be included inside the testcase element to set the issue description. This description will be used both to create and to update the test.
Tags
A tags property element can be included inside the testcase element to add labels to the issue. Multiple labels must be separated by "," (comma).
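The test_summary, test_description, and tags properties might be combined like this (a sketch; the values are hypothetical, and the description is shown as element content on the assumption that multi-line text goes there):

```xml
<testcase classname="com.example.CalculatorTests" name="testAddition">
  <properties>
    <property name="test_summary" value="Addition of two positive numbers"/>
    <property name="test_description">Verifies that adding two positive
numbers returns the expected sum.</property>
    <property name="tags" value="smoke,calculator"/>
  </properties>
</testcase>
```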
Test run comment
A testrun_comment property element can be included inside the testcase element to set the overall comment of the test run.
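For example (a sketch; the comment text is hypothetical and is shown as element content):

```xml
<testcase classname="com.example.CalculatorTests" name="testAddition">
  <properties>
    <property name="testrun_comment">Executed on the nightly build.</property>
  </properties>
</testcase>
```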
Test run evidence
A testrun_evidence property element can be included inside the testcase element to add files as global evidence on the test run. Each evidence must be an item element inside the property, with the filename in the name attribute and the Base64-encoded file content as the element's content.
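Following that structure, an evidence entry might look like this (a sketch; the filename and the truncated Base64 content are placeholders):

```xml
<testcase classname="com.example.CalculatorTests" name="testAddition">
  <properties>
    <property name="testrun_evidence">
      <!-- one item per evidence file; content is Base64 encoded -->
      <item name="screenshot.png">iVBORw0KGgoAAAANSUhEUg...</item>
    </property>
  </properties>
</testcase>
```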
Test run custom fields
Two scenarios are supported to set test run custom fields:
- Each custom field in a testrun_customfield property element inside the testcase element. The name of the custom field must appear after the "testrun_customfield:" prefix in the name attribute, and the value should be in the value attribute.
- Multiple custom fields in a testrun_customfields property element inside the testcase element. Each custom field should be an item element inside the property, with the custom field name in the name attribute and the value in the element content.
In both scenarios, multiple-select custom fields should have their values separated by ";" (semicolon).
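Both scenarios might be expressed as follows (a sketch; the field names and values are hypothetical, and placing the single-field value in the value attribute is an assumption):

```xml
<testcase classname="com.example.CalculatorTests" name="testAddition">
  <properties>
    <!-- scenario 1: one custom field per property, field name after the prefix -->
    <property name="testrun_customfield:Browser" value="Chrome"/>
    <!-- scenario 2: several custom fields as item elements;
         multiple-select values separated by ";" -->
    <property name="testrun_customfields">
      <item name="Browser">Chrome</item>
      <item name="Environments">staging;production</item>
    </property>
  </properties>
</testcase>
```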