Instead of creating one Test Plan for your release, you can create multiple Test Plans to track different groups of Tests.

This may be useful if you want to have visibility into how certain groups of Tests are progressing.

You may create different Test Plans:

  • for Tests related to new features
  • for regression testing (i.e., for Tests related to features implemented in previous versions)
  • for security-related Tests
  • for performance-related Tests
  • for compliance-related Tests
  • for non-functional Tests


Please note

This approach is independent of the methodology being used. You can use the tips described next to extend your methodology and gain additional visibility over specific Tests.


Let's dive into two specific examples that are common in most teams and can be approached differently using the classic Test Plan approach or the new Dynamic Test Plans feature available with Xray Enterprise.

The examples show a possible approach to defining a Regression Test Plan and a Test Plan to validate new features, using either the classic or the Dynamic Test Plans approach.

Examples using Test Plans

In the classic approach, the tester creates empty Test Plans for regression testing and for testing the new feature.

Regression Test Plan

The regression test list is usually the most dynamic of the test lists; because of that, testers need to manage which Tests are part of it in every iteration.

In the traditional approach, the tester creates a new Test Plan; in this example, we will call it a "Regression Test Plan."


Because the tester does this at the beginning of the iteration, some Tests will not exist yet, so the Test Plan starts out empty.


As each Test is created, the tester must manually add it to the Regression Test Plan.


Anytime you need to add or remove Tests, you must do it manually.

Test Plan for new feature

When the team is working towards a delivery, new features are created and need to be tested. In most cases, it matters to understand whether the new feature is appropriately validated; and if the team thinks the feature should be part of the Regression Tests, they should add its Tests to the Regression Test Plan.

Understanding whether the feature's Tests are successful is of great value to a team; this can be achieved by creating a Test Plan that holds all Tests covering that new feature.

When the team starts working, the tester creates the Test Plans, one of which will hold all the feature-related Tests. The Test Plan is created without any Tests, and the tester must manage it as the Tests are being created.

The tester creates a new "Authentication Test Plan" to hold all the Tests covering the Authentication feature.


As the Tests still need to be created, the Test Plan starts out empty.

When Tests are created, the team must add them to the Test Plan manually. The same applies to removing Tests that no longer make sense for this Test Plan; they must be removed manually.

Examples using Dynamic Test Plans

Xray Enterprise introduces a new feature to dynamically manage the Tests that are part of a Test Plan; this functionality is called Dynamic Test Plans.

In the classic approach, the Tests must already exist to be added to the Test Plan, and if you want to remove some from the Test Plan, you also have to do it manually.

Let's look at the same examples as before (Regression Test Plan and Feature Test Plan) and see how you can take advantage of Dynamic Test Plans in these situations.


Dynamic Test Plans is an Xray Enterprise feature



If you do not have Xray Enterprise installed, Dynamic Test Plans are not available in the Test Plan issue, and it is not possible to configure dynamic test lists. When installing Xray Enterprise for the first time, please re-enable Xray to load all properties from Xray Enterprise.


Regression Test Plan

Remember that the Regression Test Plan is usually the Test Plan that requires the most management, to make sure we add and remove the correct Tests to prevent possible regressions.

We start exactly as in the classic approach, by creating a new Test Plan.

The difference from the classic behavior is how we define which Tests are part of the Test Plan.

For this example, and to keep things simple as there are many ways to approach this, we consider that all Tests meant for Regression are labeled "Regression."


This allows us to create a JQL Filter to list all Tests that are part of the Regression.

We are not filtering on issue type = Test because the Test Plan will only consider Tests; if the filter returns other issue types, only the Tests will be added.
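
For illustration, a minimal version of such a filter could be as simple as the following (assuming the "Regression" label convention above and the same BOOK project used in the examples later in this article):

Sample JQL
  project = BOOK and labels = "Regression"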


Once the filter is tuned to your needs, save it and give it a name.

Returning to the Test Plan creation, we can now switch to the "Test Plan Details" tab, add the JQL filter name in the Test Plan Tests Filter field, and save it.


You can see that the two Tests returned by the filter earlier are automatically added to the Test Plan.


Now, whenever you add the "Regression" label to a Test, it will be automatically added to the Test Plan; whenever you remove that label, the Test will be automatically removed from the Test Plan.

Using labels to filter Tests is just one example; you can use whatever makes the most sense to you in the JQL filter.
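
For instance, a hypothetical team that groups its regression candidates by component and release could use something like this instead (the component and version names are purely illustrative):

Sample JQL
  project = BOOK and component = Accounting and fixVersion = "2.0"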


Test Plan for new feature

For this example, and to keep things simple, one Epic represents a new feature, and we want to list all Tests of all User Stories linked to it.

We have created a JQL filter named "Authentication Feature Filter" that lists all Tests of all User Stories linked to the Epic.
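
For illustration, one possible way to build such a filter (the Epic key AUTH-1 and the saved filter name are hypothetical, and the field expressing the Epic relation may differ across Jira versions) is to first save a filter, e.g., "auth_epic_stories", returning the Epic's User Stories:

Sample JQL
  project = BOOK and issuetype = Story and "Epic Link" = AUTH-1

and then define the "Authentication Feature Filter" with the requirementTests JQL function, described later in this article:

Sample JQL
  issue in requirementTests('auth_epic_stories')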

When creating the Dynamic Test Plan to validate the feature, we use the previously created filter, as shown below.

The Tests are automatically added to the Test Plan and are dynamically managed; every time the team adds Tests to or removes Tests from the User Stories, they will be automatically added to or removed from the Test Plan.


Additional tips for building Dynamic Test Plan queries

We can use JQL functions and saved filters to obtain the tests we need. Some examples follow.

Tests that cover requirements assigned to a specific component

Context

  • Tester wants to create a Test Plan with all the tests covering the requirements of a specific component.

How to

  1. In the Issues navigator, create a filter (e.g., "saved_filter") for the following JQL query.
    1. Sample JQL
      project = BOOK and issuetype in (Story, Epic) and component = Accounting
  2. Use the requirementTests JQL function. This will allow you to obtain the issues on the Issue navigator/search page (see also the note after these steps).
    1. Sample JQL
      issue in requirementTests('saved_filter')
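
Note that requirementTests behaves like any other JQL clause, so its result can be combined with further conditions when needed; for example, to keep only the manual Tests covering those requirements (an illustrative combination, not part of the steps above):

Sample JQL
  issue in requirementTests('saved_filter') and "Test Type" = Manual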

Tests that cover critical requirements

Context

  • Tester wants to create a Test Plan with all the tests that cover Critical requirements, or requirements in a given workflow status.

How to

  1. In the Issues navigator, create a filter (e.g., "saved_filter") for the following JQL query.
    1. Sample JQL
      project = BOOK and issuetype in (Story, Epic) and priority = Critical
  2. Use the requirementTests JQL function. This will allow you to obtain the issues on the Issue navigator/search page.
    1. Sample JQL
      issue in requirementTests('saved_filter')

Manual (step-based) tests

Context

  • Tester wants to create a Test Plan with all the step-based test cases to be performed manually.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and "Test Type" = Manual

Automated tests

Depending on how teams implement automated tests, filtering them can be slightly different. In this example, we assume that teams use Generic tests as a way to abstract automated tests.

Context

  • Tester wants to create a Test Plan to track all the test automation results.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and "Test Type" = Generic

Gherkin/BDD (e.g., "Cucumber") tests

Context

  • Tester wants to create a Test Plan with all the Cucumber test scenarios.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and "Test Type" = Cucumber

Tests in a given Test Repository folder

Context

  • The tester wants to create a Test Plan with all tests within a specific folder, and optionally in all of its sub-folders, in the Test Repository of a project.

How to

  1. Use the testRepositoryFolderTests JQL function and a query similar to the following one to obtain all the tests within the "Parent/Child" folder and its sub-folders (the third argument, "true", requests that sub-folders be included).
    1. Sample JQL
      issue in testRepositoryFolderTests("BOOK", 'Parent/Child', "true")

Tests in a given workflow status

Context

  • The tester wants to create a Test Plan with all tests that have been reviewed and are currently approved.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and status = Approved