
Xray provides useful and detailed information about Requirements Coverage in every project, via the Requirement Coverage Charts tab panel and also via the Requirement Status custom field.


Analysis Scopes

The status of a coverable issue and of its linked Tests can be analyzed from multiple perspectives/scopes.

As an example, the same Test or Story can be analyzed on version 1.0 and also on version 2.0; each analysis takes into account the executions made for the respective version.

Therefore, a story may be OK on version 1.0 but NOK on version 2.0 due to regressions.

In fact, Tests and coverable issues can be analyzed:

  • Latest
  • Version
  • Test Plan

Besides these scopes, the analysis can also be refined by the following options (the sketch after this list makes these options concrete):

  • Test Environment
  • Final statuses taking precedence over non-final ones
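To make these scope options concrete, below is a minimal, hypothetical Python sketch of a Test Run and of a scope filter. The names used here (TestRun, runs_in_scope, fix_version, test_plans, etc.) are illustrative assumptions for this page only; they do not reflect Xray's internal data model or any API.

```python
# Hypothetical model of the analysis scopes described above (not Xray's implementation).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestRun:
    executed_at: str                   # e.g. "2023-05-04T10:00" (when the run finished)
    status: str                        # e.g. "PASS", "FAIL", "EXECUTING", "TODO"
    is_final: bool                     # True if the status is configured as a final status
    fix_version: Optional[str] = None  # Fix Version of the parent Test Execution
    environment: Optional[str] = None  # Test Environment of the parent Test Execution
    test_plans: List[str] = field(default_factory=list)  # Test Plans linked to that Test Execution

def runs_in_scope(runs, version=None, test_plan=None, environment=None, final_only=False):
    """Keep only the Test Runs that match the chosen analysis scope."""
    selected = list(runs)
    if version is not None:      # "Version" scope: match the Test Execution's Fix Version
        selected = [r for r in selected if r.fix_version == version]
    if test_plan is not None:    # "Test Plan" scope: only executions linked to that plan
        selected = [r for r in selected if test_plan in r.test_plans]
    if environment is not None:  # a specific Test Environment
        selected = [r for r in selected if r.environment == environment]
    if final_only:               # "final statuses have precedence over non-final statuses"
        selected = [r for r in selected if r.is_final]
    return selected
```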

Latest

If you don't care about versions, or are not using versions at all, and just want to see the calculated statuses based on the latest runs, analysis by "Latest" can be used for that purpose.

This may be useful to have a quick idea about the latest results or calculated status for the Test or coverable issue.

Please note that analyzing by "Latest" also considers the latest results obtained on the different environments (i.e., when "All Environments" is selected as the Test Environment).
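As a rough illustration of this behaviour, the sketch below (hypothetical names, simplified statuses) takes the most recent run per Test Environment and aggregates them. The "worst status wins" ordering is an assumption made for the example, consistent with the troubleshooting note at the end of this page, and not a statement of Xray's exact precedence rules.

```python
# Illustrative only: "Latest" with "All Environments" looks at the most recent run
# per environment and aggregates them. The precedence order below is an assumption.
from collections import namedtuple

STATUS_RANK = {"FAIL": 0, "TODO": 1, "EXECUTING": 2, "PASS": 3}  # lower = "worse"
Run = namedtuple("Run", "executed_at status environment")

def latest_per_environment(runs):
    latest = {}
    for run in runs:
        env = run.environment  # None groups the runs made without any environment
        if env not in latest or run.executed_at > latest[env].executed_at:
            latest[env] = run
    return list(latest.values())

def aggregate_status(runs):
    """Worst status among the latest runs wins; no runs at all means TODO."""
    if not runs:
        return "TODO"
    return min((r.status for r in runs), key=lambda s: STATUS_RANK.get(s, 0))

runs = [Run("2023-05-01", "PASS", "Android"), Run("2023-05-03", "FAIL", "iOS")]
print(aggregate_status(latest_per_environment(runs)))  # -> "FAIL": one failing environment wins
```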

Version

The "Version" scope allows users to analyze Tests and coverable issues from a version perspective and answer questions such as "How is the requirement on version X?", "How are these Tests performing on version Y?".

Whenever analyzing by version, only Test Executions made for the given version are considered, through the Test Execution's Fix Version field.

As an example, a user story aimed for version 3.0 may be analyzed from the point of view of the executions made for version 3.0 or for the ones made afterwards on version 4.0.
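As a tiny, self-contained illustration with made-up data, only the runs whose Test Execution carries the analysed Fix Version are taken into account:

```python
# Made-up data: the same Test run in two Test Executions with different Fix Versions.
runs = [
    {"fix_version": "3.0", "status": "PASS", "executed_at": "2023-01-10"},
    {"fix_version": "4.0", "status": "FAIL", "executed_at": "2023-02-20"},
]

def runs_for_version(runs, version):
    """Keep only runs from Test Executions whose Fix Version matches the analysed version."""
    return [r for r in runs if r["fix_version"] == version]

print(runs_for_version(runs, "3.0"))  # only the PASS run is considered: the Test looks OK on 3.0
print(runs_for_version(runs, "4.0"))  # only the FAIL run is considered: the Test looks NOK on 4.0
```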

Test Plan

Analysis by Test Plan gives the ability to evaluate the Test status or coverage status based on some planned testing (i.e. on the Tests and related executions made in the scope of the selected Test Plan).

Whenever analyzing by Test Plan, only Test Executions linked to that Test Plan are considered.

This kind of analysis provides the means to evaluate whether a given coverable issue is covered by the Tests of some Test Plan and, if so, how it stands based on the runs performed within the related planned Test Executions.
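The sketch below, again with hypothetical names and deliberately simplified status rules, illustrates the two outcomes this analysis can give: either no Test of the selected Test Plan covers the issue at all, or it is covered and its status follows from the runs of that plan's Test Executions.

```python
# Hypothetical sketch of coverage when analysing by Test Plan (simplified status rules).
def coverage_for_test_plan(covering_tests, plan_tests, plan_run_statuses):
    """covering_tests: Tests linked to the issue; plan_tests: Tests in the Test Plan;
    plan_run_statuses: {test_key: status}, taken only from that plan's Test Executions."""
    relevant = set(covering_tests) & set(plan_tests)
    if not relevant:
        return "UNCOVERED"  # no Test of this plan covers the issue
    statuses = [plan_run_statuses.get(test, "TODO") for test in relevant]
    if "FAIL" in statuses:
        return "NOK"
    if all(s == "PASS" for s in statuses):
        return "OK"
    return "NOTRUN"

# Example: the issue is covered by CALC-10 and CALC-11, but only CALC-10 belongs to the plan.
print(coverage_for_test_plan(["CALC-10", "CALC-11"], ["CALC-10"], {"CALC-10": "PASS"}))  # -> OK
```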

Test Environment

Analysis by Test Environment gives the ability to analyze the Test status or the coverage status of an issue for some Test Environment.

This kind of analysis gives the ability to answer questions such as: "How is this requirement on environment X?" or "How is this Test performing on environment X?"

Whenever analyzing by Test Environment, only Test Executions made for the given Test Environment are considered.


Final Statuses Precedence

Final statuses precedence is used to perform the analysis based on "finished work" (i.e., non-intermediate Test Runs).

The flag "Final statuses have precedence over non-final statuses" gives the additional ability to consider just Test Runs whose status is one configured as being a final status.

This helps answer questions such as: "What is the latest final result for this Test, ignoring runs that are still in progress?"
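A minimal sketch of this flag, using made-up data: when only final statuses count, a run that is still in progress does not hide the last finished result.

```python
# Illustrative only: with the flag on, only runs whose status is configured as final count.
def latest_run(runs, final_only=False):
    """runs: list of (executed_at, status, is_final) tuples; the structure is made up."""
    candidates = [r for r in runs if r[2]] if final_only else runs
    return max(candidates, key=lambda r: r[0]) if candidates else None

runs = [
    ("2023-03-01", "PASS", True),        # finished work
    ("2023-03-05", "EXECUTING", False),  # still in progress
]
print(latest_run(runs))                   # latest overall -> the EXECUTING run
print(latest_run(runs, final_only=True))  # latest "finished work" -> the PASS run
```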


As an example, a user story aimed for version 3.0 may be analyzed from the point of view of the executions made for version 3.0 or of the ones made afterwards for version 4.0; it can also be analyzed by taking into account the Tests and Test Executions linked to some Test Plan. Or, if you don't care about exact versions and just want to see the calculated statuses based on the latest runs, analysis by "Latest" can be used for that purpose. Another analysis dimension is the Test Environment; if one is chosen, then only executions made for that Test Environment are considered for the shown statuses.

Analyzing Xray entities

Xray entities can be analyzed in different places, starting with the issue view screen and also in some specific reports containing those entities. 

Coverable issues

The status of coverable issues can be evaluated directly in the issue view screen and also on some reports, including the Test Coverage report.

Issue screen

Within the issue screen, the coverage status can be evaluated for the specified scope.

Reports

Test Coverage

Coverable issues can be analyzed for some given scope using the Test Coverage report. More information about this report can be found in Test Coverage Report.

Tests

The status of Tests can be evaluated directly in the Test issue view screen and also on some reports, including the Tests List report and, indirectly, the Test Sets List report.

Tests can also be evaluated in the coverable issue screen, within the "Test Coverage" section.

Issue screen

Within the Test issue screen, the Test status can be evaluated for the specified scope. The calculated status is cached for performance reasons; since data can become out of sync between Jira and Xray, it is possible to enforce a recalculation and show an up-to-date value.

Reports

Tests List

The Tests can be analyzed for some given scope using the Tests List report. 

Test Sets

The status of Test Sets (i.e. of the Tests contained within a Test Set) can be analyzed using the Test Sets List report.

Reports

Test Sets List

The Test Sets, and implicitly the Tests within them, can be analyzed for some given scope using the Test Sets List report.

Requirement Test Coverage Section

The Requirement Status and the Test Coverage are presented in the Requirement issue view.

Analysis by version

You can choose a specific analysis version to calculate the Requirement Status. This can be useful when testing multiple versions of your requirements at the same time or if you need to see the requirement coverage for previous versions. The analysis version can be No Version, in which case the requirement status will be calculated based on the latest execution for each Test, regardless of the execution version.

Analysis by Test Plan

You can also choose a specific Test Plan to calculate the Requirement Status. This will calculate the requirement status based on the executions for that Test Plan. This can be useful when you have multiple test plans and need to see the requirement coverage for a specific Test Plan.

Test Coverage Charts

The Test Coverage Report provides an overall view of the current test coverage of a project for a particular version/Test Plan.


The requirement coverage reports allow you to analyze the test status of requirement issues in the current project, providing an overview of how many requirements are OK, NOK, Not Run, and Uncovered, as well as an overview of the requirement status by version or Test Plan.



Analysis by Version

You can choose a specific analysis version to generate the report. This will calculate the requirement status based on executions for that version. This can be useful when testing multiple versions of your requirements at the same time or if you need to see the requirement coverage for previous versions. The analysis version can be None, in which case the requirement status will be calculated based on the latest execution for each Test, regardless of the execution version.

Analysis by Test Plan

You can also choose a specific Test Plan to generate the report. This will calculate the requirement status based on the executions for that Test Plan. This can be useful when you have multiple test plans and need to see the requirement coverage for a specific Test Plan.

Analysis by Environment

Xray also allows you to specify a Test Environment to calculate the status of requirement issues. You can specify an Environment when using either Version or Test Plan analysis. If you want to aggregate the status for all environments, just choose the All Environments option. If you choose a specific environment, then Xray will only consider the Test Execution issues that are within this environment when calculating the Requirement Status.

Grouping Results

You can also choose a grouping field to group requirements within these charts. You can, for instance, generate a report based on requirement Priority, Resolution, or any other field of type "Select List" that you have configured for your requirement issues. 

Requirement Filters

In the Requirement Coverage charts, you can filter requirements to show in the charts. Use the basic filter fields:

  • Fix Version (you can also choose whether to include the previous versions -- equivalent in JQL to fixVersion <= X. The default value for this option can be changed in the Xray configuration.)
  • Workflow Status
  • Resolution
  • Assignee
  • Key or Summary
  • Component

or use a previously "Saved Filter" with Requirement issues.

Each time a user accesses the "Requirement Coverage" project page, the chart will be generated with the user's last chosen options. This includes the report and all the filters for the requirement issues.

There is also an options menu where you can choose the visualization type for the charts:  

  • hierarchical - only the parent requirement issues will be presented in the charts. Sub-requirements can still be visualized in the overall details table.
  • flatten - the requirement issues will not consider the parent/child relationship. All parent and child requirements will be considered for the chart. 

Troubleshooting


"I have an Xray Project to Test my Requirements project. I've created a Test and I have already executed the Test, but I always get requirements status Covered but Not Run."


You might be using environments on Test Executions that are affecting the aggregated status for the Tests. If you have a couple of Test Executions, one with a specific environment (e.g., Android) and another without any environment, then Xray will also consider the empty environment when calculating the aggregated status (this is, of course, if all Test Executions are within the same Version or Test Plan).
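As a tiny made-up example of how this surfaces as "Covered but Not Run" (status names simplified):

```python
# Latest run per environment for the Test covering the requirement (made-up data);
# the Test Execution without an environment was never run.
latest_by_environment = {"Android": "PASS", None: "TODO"}

statuses = set(latest_by_environment.values())
if statuses == {"PASS"}:
    requirement_status = "OK"
elif "FAIL" in statuses:
    requirement_status = "NOK"
else:
    requirement_status = "NOTRUN"  # the un-executed, environment-less run drags the status down

print(requirement_status)  # -> NOTRUN, shown as "Covered but Not Run"
```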

Alternatively, you may not have configured your versions properly. Remember that the coverage status of an issue for some version is calculated based on the Fix Version associated with the Test Execution issues.



