Purpose
This report enables you to extract the details of a Test Plan, such as the Tests that are part of it, Defects, Requirements, and Test Executions, so that you can generate a document report focusing on what matters most to your team, or even share it with someone who doesn't have access to Jira.
Possible usage scenarios:
- see all the requirements covered by the Test Plan
- see all the defects linked to this Test Plan
- see an overall status summary of the Test Plan
- see a summary of the Tests that are part of the Test Plan
- check a specific detail of a Test Execution (such as evidence, attachments, assignee, etc.)
Output Example(s)
The following screenshots show an example of the sections you should expect in this report.
How to use
This report can be generated from the Issue details screen.
Learn more
Source data
This report is applicable to:
- 1 Test Plan issue
Output format
The standard output format is .DOCX, so you can open it in Microsoft Word, Google Docs, and other tools compatible with this format.
Report assumptions
The template makes a set of assumptions that you must make sure your Jira/Xray environment complies with:
- Issue types with these names: "Test", "Test Plan", "Test Execution"
If any of these assumptions is not met, you need to update the template accordingly.
Usage examples
Export all details obtained in the context of a given Test Plan
- open the Test Plan issue and export it using this template
Understanding the report
The report shows detailed information about the Test Plan provided.
Layout
The report is composed of several sections. Two major sections are available: Introduction and Test Executions details.
By default, and to avoid overloading or duplicating information, only the "Introduction" section is rendered; you can change this behavior in the template (more info ahead).
"Introduction" section
This section is divided into 6 sub-sections that give an overview of the Test Plan we have just exported:
- Document Overview
- Test Plan Details
- Requirements covered by the Tests in this Test Plan
- Overall Execution Status
- Defects
- Tests Summary
Each of these sections is explained below.
Document Overview
Brief description of what you will find in this report and how it was generated.
Test Plan Details
In this section we extract the Test Plan key into the header and show the Begin and End Date (formatted as demonstrated below), along with the Summary and Description of the Test Plan.
Field | Description | Sample Code |
---|---|---|
Begin Date | Timestamp of the Begin Date field present in the Test Plan (with proper format) | ${dateformat("dd-MM-yyyy HH:mm:ss"):Begin Date} |
End Date | Timestamp of the End Date field present in the Test Plan (with proper format) | ${dateformat("dd-MM-yyyy HH:mm:ss"):End Date} |
Summary | Summary of the Test Plan | ${Summary} |
Description | Description of the Test Plan (as this field accepts wiki markup we will use "wiki:" in the code to be interpreted by the document) | ${wiki:Description} |
The output will contain this information; notice that, as the Description field supports wiki markup, we use the "wiki:" keyword so that it is correctly interpreted.
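As an illustration, these fields could be laid out in the template body like this (the labels and layout are just an example; the field codes are the ones from the table above):

```
Begin Date: ${dateformat("dd-MM-yyyy HH:mm:ss"):Begin Date}
End Date: ${dateformat("dd-MM-yyyy HH:mm:ss"):End Date}
Summary: ${Summary}
Description: ${wiki:Description}
```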
Requirements covered by the Tests in this Test Plan
This section gives an overview of all the requirements covered by Tests in this Test Plan; we extract the Key, Summary, and Workflow Status of each, removing all repeated entries.
Field | Description | Sample Code |
---|---|---|
Key | Key of the requirement (in this case we are adding it as a link) | @{title=${Tests[n].Links[k].Key}|href=${BaseURL}/browse/${Tests[n].Links[k].Key}} |
Summary | Summary of the requirement | ${Tests[n].Links[k].Summary} |
Workflow Status | Workflow Status of the requirement | ${Tests[n].Links[k].Status} |
The requirements are listed in a table with the information explained above.
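To list the requirements, the template iterates over the Tests and, for each Test, over its issue links. A simplified sketch of that nested cycle, assuming a `Tests[n].LinksCount` counter is available and omitting the de-duplication logic the actual template uses:

```
#{for n=TestsCount}
  #{for k=Tests[n].LinksCount}
    @{title=${Tests[n].Links[k].Key}|href=${BaseURL}/browse/${Tests[n].Links[k].Key}} | ${Tests[n].Links[k].Summary} | ${Tests[n].Links[k].Status}
  #{end}
#{end}
```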
Overall Execution Status
As the name suggests, this section gives an overview of the executions of the Tests in this Test Plan: how many Tests the Test Plan contains and what the statuses of their executions are.
To obtain this information we are using:
Field | Description | Sample Code |
---|---|---|
TestsCount | The total number of Tests in this Test Plan | ${TestsCount} |
#Tests | To extract the count of the overall execution status per each <status> (TO DO, EXECUTING, PASSED, FAILED, ABORTED) | ${Overall Execution Status.<status>.Count} |
Percentage | To extract the percentage of the overall execution status per each <status> (TO DO, EXECUTING, PASSED, FAILED, ABORTED) | ${Overall Execution Status.<status>.Percentage} |
This will produce the following output:
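For reference, a summary block built only from the fields above might look like this in the template (a minimal sketch; the actual template lays this out as a table):

```
Total Tests: ${TestsCount}
TO DO: ${Overall Execution Status.TO DO.Count} (${Overall Execution Status.TO DO.Percentage}%)
EXECUTING: ${Overall Execution Status.EXECUTING.Count} (${Overall Execution Status.EXECUTING.Percentage}%)
PASSED: ${Overall Execution Status.PASSED.Count} (${Overall Execution Status.PASSED.Percentage}%)
FAILED: ${Overall Execution Status.FAILED.Count} (${Overall Execution Status.FAILED.Percentage}%)
ABORTED: ${Overall Execution Status.ABORTED.Count} (${Overall Execution Status.ABORTED.Percentage}%)
```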
Defects
In this section we list all the defects associated with this Test Plan: defects associated with Test Runs, defects in Test Steps, and defects found during iterations. We do not print duplicates.
Field | Description | Sample Code |
---|---|---|
Key | Key of the Defect | |
Summary | Summary of the Defect | |
Priority | Priority of the Defect | |
The Defects appear in the document as a table with information regarding the defects found during the executions of the Test Plan.
Tests Summary
In this section we have a table with information regarding the Tests included in this Test Plan. You can find the following information about each Test:
Field | Description | Sample Code |
---|---|---|
Key | Key of the Test in a link form | @{title=${Tests[n].Key}|href=${BaseURL}/browse/${Tests[n].Key}} |
Summary | The Summary of the Test | ${Tests[n].Summary} |
Issue Assignee | Full name of the assignee | ${fullname:Tests[n].AssigneeId} |
Requirements | List of requirements covered by this Test (Check the template to see the extra cycle we need to list this information) | ${Tests[n].Links[k].Key} |
Test Executions | List of ids of the Test Executions for each Test (Check the template to see the extra cycle we need to list this information) | ${Tests[n].TestExecutions[j].Key} |
Latest Status | Latest Status of the execution | ${Tests[n].TestStatus} |
This information is presented in a table as we can see below:
Some particularities about the code used in this section are worth highlighting:
- Ability to put the Tests with a particular status on top of the table (more info ahead).
- Usage of ${fullname:Tests[n].AssigneeId}, which fetches the full name of the assignee instead of the key associated with it.
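Putting it together, one row of the Tests Summary table could be rendered inside the Tests cycle roughly like this (a sketch that omits the status-ordering logic driven by statusesToShowFirst and the extra cycles needed for the Requirements and Test Executions columns):

```
#{for n=TestsCount}
  @{title=${Tests[n].Key}|href=${BaseURL}/browse/${Tests[n].Key}} | ${Tests[n].Summary} | ${fullname:Tests[n].AssigneeId} | ${Tests[n].TestStatus}
#{end}
```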
Test Executions
This section gathers all the information related to each Test Execution of each Test in the Test Plan, with all the available details.
It is composed of several sub-sections, each of which is filled with the relevant information when available, or with a message stating that no information is available.
Test Executions Summary
This section has a table with information regarding each Test Execution in this Test Plan (these sections are repeated for each Test Execution). The information is presented as a table with the following fields:
Field | Description | Sample Code |
---|---|---|
Execution status | Execution Status of the Test Run | ${TestExecutions[n].TestRuns[a].Execution Status} |
Assignee | Full Name of the Assignee of the Test Run | ${fullname:TestExecutions[n].TestRuns[a].AssigneeId} |
Executed By | Full Name of the entity that has executed this Test Run | ${fullname:TestExecutions[n].TestRuns[a].Executed By} |
Started On | Timestamp of the Started Date from the TestRun | ${dateformat('dd-MM-yyyy HH:mm:ss'):TestExecutions[n].TestRuns[a].Started On} |
Finished On | Timestamp of the Finished Date from the TestRun | ${dateformat('dd-MM-yyyy HH:mm:ss'):TestExecutions[n].TestRuns[a].Finished On} |
All of these fields have code to handle empty values. The resulting table looks like the one below.
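Behind this table there are nested cycles over the Test Executions and their Test Runs. A simplified sketch, assuming `TestExecutionsCount` and `TestExecutions[n].TestRunsCount` counters are available and omitting the empty-field handling of the actual template:

```
#{for n=TestExecutionsCount}
  #{for a=TestExecutions[n].TestRunsCount}
    ${TestExecutions[n].TestRuns[a].Execution Status}
    ${fullname:TestExecutions[n].TestRuns[a].AssigneeId}
    ${dateformat('dd-MM-yyyy HH:mm:ss'):TestExecutions[n].TestRuns[a].Started On}
    ${dateformat('dd-MM-yyyy HH:mm:ss'):TestExecutions[n].TestRuns[a].Finished On}
  #{end}
#{end}
```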
Execution Defects
If any defects were found and associated globally with a Test Run, they will appear here in the form of a table with the following fields:
Key | Description | Sample Code |
---|---|---|
Key | Jira Key of the Defect in the form of a link | @{title=${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Key}|href=${BaseURL}/browse/${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Key}} |
Summary | Summary of the Defect | ${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Summary} |
Priority | Priority associated with the defect | ${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Priority} |
The table will be similar to the one below.
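A simplified sketch of the cycle that fills this table, assuming an `ExecutionDefectsCount` counter is available for the Test Run:

```
#{for d=TestExecutions[n].TestRuns[a].ExecutionDefectsCount}
  @{title=${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Key}|href=${BaseURL}/browse/${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Key}} | ${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Summary} | ${TestExecutions[n].TestRuns[a].ExecutionDefects[d].Priority}
#{end}
```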
Execution Evidences
If any Evidence was attached to the Test Run, we show it in a table with the file name and, if the Evidence is an image, a screenshot; otherwise, just a link.
To obtain that information we have used the following code:
Key | Description | Sample Code |
---|---|---|
File Name | The File Name of the Evidence attached to the Execution | @{title=${TestExecutions[n].TestRuns[a].ExecutionEvidences[d].Name}|href=${TestExecutions[n].TestRuns[a].ExecutionEvidences[d].FileURL}} |
Evidence | The Evidence attached to the Execution | !{${TestExecutions[n].TestRuns[a].ExecutionEvidences[d].Evidence|maxwidth=100}} |
If the Evidence is an image, the table will look like this:
Comment
The comment associated with the Test Run (${TestExecutions[n].TestRuns[a].Comment}).
Test Description
The description of the Test Run (${wiki:TestExecutions[n].TestRuns[a].Description}).
Test Issue Attachments
This section only appears if there are attachments associated with the Test.
Key | Description | Sample Code |
---|---|---|
File Name | File Name of the Attachment | @{title=${TestExecutions[n].Tests[t].Attachments[b].Name}|href=${TestExecutions[n].Tests[t].Attachments[b].FileURL}} |
Author | The Author of the attachment | ${fullname:TestExecutions[n].Tests[t].Attachments[b].Author} |
File Size | File size of the attachment, in bytes | ${TestExecutions[n].Tests[t].Attachments[b].Size} |
This appears in the document in table form:
Preconditions
This section only appears if a Precondition is associated with the Test Run.
Key | Description | Sample Code |
---|---|---|
Precondition | The definition of the Precondition associated with the Test Run | ${wiki:TestExecutions[n].TestRuns[a].PreConditions[l].PreCondition.Definition} |
A sub-section will appear with the precondition definitions.
Parameters
This section lists the existing parameters of the Test Run (we iterate through the parameters of the Test Run with: #{for m=TestExecutions[n].TestRuns[a].ParametersCount}).
Key | Description | Sample Code |
---|---|---|
Name | Key of the parameter | ${TestExecutions[n].TestRuns[a].Parameters[m].Key} |
Value | Value of the parameter | ${TestExecutions[n].TestRuns[a].Parameters[m].Value} |
It will list the Key and the Value of each parameter in a table.
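Combining the cycle mentioned above with these fields, a minimal sketch of this sub-section (assuming the cycle is closed with `#{end}`):

```
#{for m=TestExecutions[n].TestRuns[a].ParametersCount}
  ${TestExecutions[n].TestRuns[a].Parameters[m].Key} = ${TestExecutions[n].TestRuns[a].Parameters[m].Value}
#{end}
```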
Iterations
This section uses a sentence to state how many iterations exist; we go into more detail about them in the next sections.
Key | Description | Sample Code |
---|---|---|
Iterations | The iterations count of the Test Run | ${TestExecutions[n].TestRuns[a].IterationsCount} |
A sentence is added to the document with this information.
Iteration Overall Execution Status
To obtain the overall execution status of the iteration we use two variables:
Key | Description | Sample Code |
---|---|---|
List of Statuses | Show the List of Statuses | ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus} |
TO DO / EXECUTING / PASSED / FAILED / ABORTED | Overall Execution Status per status | ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus.TO DO} ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus.EXECUTING} ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus.PASSED} ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus.FAILED} ${TestExecutions[n].TestRuns[a].IterationsOverallExecutionStatus.ABORTED} |
The above code will produce the below table.
Test Run details
In this section we show the details of each iteration of the Test Run: its Name, Status, and Parameters.
We extract that information using the following fields:
Key | Description | Sample Code |
---|---|---|
Iteration Name | Name of the iteration | ${TestExecutions[n].TestRuns[a].Iterations[m].Name} |
Status | Status of the iteration | ${TestExecutions[n].TestRuns[a].Iterations[m].Status} |
Total Parameters | Total number of parameters | ${TestExecutions[n].TestRuns[a].Iterations[m].ParametersCount} |
Parameters | Lists all parameters in the form of Key=Value | ${TestExecutions[n].TestRuns[a].Iterations[m].Parameters} |
This section will have the below appearance:
Iteration precondition definition
If a precondition is present we will use the following fields to extract that information:
Key | Description | Sample Code |
---|---|---|
Key | Iteration precondition key | ${TestExecutions[n].TestRuns[a].Iterations[m].PreConditions[l].Key} |
Definition | Iteration precondition definition | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].PreConditions[l].PreCondition.Definition} |
This will produce an entry like the one below:
Iteration parameters details
For a given iteration, we list the parameters used; that information is extracted with the following fields:
Key | Description | Sample Code |
---|---|---|
Name | Parameter Key | ${TestExecutions[n].TestRuns[a].Iterations[m].Parameters[l].Key} |
Value | Parameter Value | ${TestExecutions[n].TestRuns[a].Iterations[m].Parameters[l].Value} |
It generates a table of the following form:
Iteration Test Step Details
In this section we list the details of an iteration, step by step; the code we use for that purpose is shown in the table below.
Key | Description | Sample Code |
---|---|---|
Step | The Step Number | ${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].StepNumber} |
Action | Action defined in the Test Step | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Action} |
Data | Data defined in the Test Step | ${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Data} |
Expected Result | Expected Result defined in the Test Step | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].ExpectedResult} |
Attachments | Attachments present in each Test Step (showing the FileURL and a screenshot in case of the Attachment being an image) | @{title=${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Attachments[sa].Name}|href=${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Attachments[sa].FileURL}} !{${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Attachments[sa].Attachment|maxwidth=100}} |
Comment | Comment | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Comment} |
Defects | Defects associated to this Iteration | @{title=${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Defects[dc].Key}|href=${BaseURL}/browse/${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Defects[dc].Key}} |
Evidence | FileURL and screenshot (if it is an image) of the Evidence | @{title=${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Evidences[e].Name}|href=${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Evidences[e].FileURL}} !{${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Evidences[e].Evidence|maxwidth=100}} |
Status | Test Step Status | ${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Status} |
The above information is gathered in a table like the one below:
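A simplified sketch of the cycle over the iteration's steps, assuming a `TestStepsCount` counter is available and showing only a subset of the columns above:

```
#{for r=TestExecutions[n].TestRuns[a].Iterations[m].TestStepsCount}
  ${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].StepNumber} | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Action} | ${wiki:TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].ExpectedResult} | ${TestExecutions[n].TestRuns[a].Iterations[m].TestSteps[r].Status}
#{end}
```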
Test Details
This section shows the Test details; for that, we consider the different possible Test types we can have in Xray: Generic, Manual, and Cucumber. For each type we fetch different information.
It may seem similar to the Iteration Test Step Details section, but here we show the Test details themselves (not instantiated in each Iteration as in the previous section).
Type | Key | Description | Sample Code |
---|---|---|---|
Generic | Test Type | Test Type field | ${TestExecutions[n].TestRuns[a].TestType} |
Generic | Specification | Definition of the Generic Test | ${TestExecutions[n].TestRuns[a].Generic Test Definition} |
Cucumber | Test Type | Test Type field | ${TestExecutions[n].TestRuns[a].TestType} |
Cucumber | Gherkin Specification | Gherkin specification of the Test | ${TestExecutions[n].TestRuns[a].Cucumber Scenario} |
Manual | Step | Step Number | ${TestExecutions[n].TestRuns[a].TestSteps[r].StepNumber} |
Manual | Action | Action of the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Action} |
Manual | Data | Data of the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Data} |
Manual | Expected Result | Expected Result of the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].ExpectedResult} |
Manual | Attachment | Attachment of the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Attachments[sa].FileURL} !{${TestExecutions[n].TestRuns[a].TestSteps[r].Attachments[sa].Attachment|maxwidth=100}} |
Manual | Comment | Comment of the Test Step | ${wiki:TestExecutions[n].TestRuns[a].TestSteps[r].Comment} |
Manual | Defects | Defects associated with the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Defects[dc].Key} |
Manual | Evidence | Evidence attached to the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Evidences[e].FileURL} !{${TestExecutions[n].TestRuns[a].TestSteps[r].Evidences[e].Evidence|maxwidth=100}} |
Manual | Status | Status of the Test Step | ${TestExecutions[n].TestRuns[a].TestSteps[r].Status} |
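Rendering a different block per Test type implies wrapping each block in a conditional on the Test Type field. A rough sketch, assuming Xporter-style `#{if (%{...})}`/`#{end}` conditionals are available (check the template itself for the exact construct it uses); the Manual block would contain the steps cycle instead of a single field:

```
#{if (%{'${TestExecutions[n].TestRuns[a].TestType}'.equals('Generic')})}
  ${TestExecutions[n].TestRuns[a].Generic Test Definition}
#{end}
#{if (%{'${TestExecutions[n].TestRuns[a].TestType}'.equals('Cucumber')})}
  ${TestExecutions[n].TestRuns[a].Cucumber Scenario}
#{end}
```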
Requirements linked with this test
For each Test we list the linked Requirements:
Key | Description | Sample Code |
---|---|---|
Requirement Key | Key of the Requirement | ${TestExecutions[n].Tests[t].Links[j].Key} |
Requirement Summary | Summary of the Requirement | ${TestExecutions[n].Tests[t].Links[j].Summary} |
Workflow Status | Workflow status of the Requirement | ${TestExecutions[n].Tests[t].Links[j].Status} |
This section presents a table with that information, like the one below:
Appendix A: Approval
This section is added for cases where you need a signature validating the document.
Customizing the report
Sections that can be hidden or shown
The report has some variables/flags that can be used to show or hide sections whose logic is already implemented in the template.
These variables are defined at the top of each sheet in the report template; the variables are scoped to the current sheet only.
On the template, use one of these values for flag type of variables:
- 0: to hide a section
- 1: to show a section
The format for other types of variables is detailed ahead.
Example of setting a variable to, in this case, render information in the "Test Executions" section:
${set(showTestRunDetails, 1)}
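Conceptually, the template then wraps the gated section in a conditional on that variable; a sketch, assuming the flag can be read back as `${showTestRunDetails}` and that Xporter-style conditionals are available (check the template for the exact construct):

```
${set(showTestRunDetails, 1)}
#{if (%{${showTestRunDetails} == 1})}
  ... Test Executions details section ...
#{end}
```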
Variable/flag | Purpose | Default | Example(s) |
---|---|---|---|
showTestRunDetails | render the details section | 0 | ${set(showTestRunDetails, 0)} |
showTestRunEvidences | render this section (make sure showTestRunDetails is set to 1) | 0 | ${set(showTestRunEvidences, 0)} |
showTestRunAttachments | render this section (make sure showTestRunDetails is set to 1) | 0 | ${set(showTestRunAttachments, 0)} |
showTestRunIterations | render this section (make sure showTestRunDetails is set to 1) | 0 | ${set(showTestRunIterations, 0)} |
statusesToShowFirst | render first, in the Tests Summary section, the Tests whose reported status is in this comma-delimited list; use an empty string '' to include all statuses | '' (i.e., all statuses) | ${set(statusesToShowFirst, 'FAILED')} ${set(statusesToShowFirst, 'FAIL,EXECUTING')} ${set(statusesToShowFirst, '')} |
Adding or removing information to/from the report
As this report is a document with different sections, if some sections are not relevant to you, you should be able to simply delete them. Make sure the deleted section does not create temporary variables that are used in subsequent sections, and that all conditional blocks remain properly closed.
When adding extra information, we are usually thinking of adding:
- fields of the Test Plan itself
- fields of the Tests associated with the Test Plan
- fields of the Test Executions associated with the Test Plan
Eventually, also:
- fields of the Test Runs(s)
- fields of the covered issue(s) associated with the Test that is associated with the Test Plan
The latter may be harder to implement, so we won't consider them here.
Exercise: add a field from the related Test issue
Let's say we have a "Severity" field on the Test issue that is connected to the Test Plan, and that we want to show it on the report.
We can copy the column "Summary" from the "Tests Summary" section and adapt it:
- insert a new column in the table, next to the "Summary" column
- copy the values from the existing "Summary" column into it
- change ${Tests[n].Summary} to ${Tests[n].Severity}
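The new column would then contain (assuming "Severity" is the exact field name on the Test issue; if the field holds wiki markup, prefix it with "wiki:" as shown earlier):

```
${Tests[n].Severity}
```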
Performance
Performance can be impacted by the information that is rendered and by how that information is collected/processed.
Depending on the scenario, the number of Tests and Test Runs can be considerably high, especially with CI/CD. As this report sums up quite a lot of information, please use it wisely.
Data-driven tests may also add an overhead, as iterations need to be individually processed, for collecting all the reported/linked defects for example.
Some tips
- Use the variables/flags to adjust which sections will be processed/shown in the report; more info in "Customizing the report"
- limit the number of input issues; in Xporter there's a global setting for this purpose
Known limitations
- Test Plan comments are not formatted
- Gherkin Scenario Outlines are not considered as data-driven (i.e., only one Test Run will appear)