Instead of creating one Test Plan for your release, you may create multiple Test Plans to track different Tests.

This may be useful if you want to have visibility on how certain groups of Tests are progressing.

You may create different Test Plans for:

  • Tests related to new features
  • regression testing (i.e., Tests related to features implemented in previous versions)
  • security-related Tests
  • performance-related Tests
  • compliance-related Tests
  • other non-functional Tests


Please note

This approach is independent of the methodology being used. You can use the tips described next to extend your methodology and gain additional visibility over specific Tests.


Let's dive into two specific examples that are common to most teams and can be approached either with the classic Test Plan approach or with the new Dynamic Test Plans feature available with Xray Enterprise.

The examples will show a possible approach to define a Regression Test Plan and a Test Plan to validate new features using the classic or Dynamic Test Plans approach.

Examples using Test Plans

In the classic approach, the tester creates empty Test Plans: one for the Regression Tests and one for the new feature.

Regression Test Plan

Of all the Test lists, the regression list is usually the one that changes the most, and testers need to manage which Tests are part of it in every iteration.

In the traditional approach, the tester creates a new Test Plan; in this example, we will call it a "Regression Test Plan."


As the tester does this at the beginning of the iteration, it is expected that some Tests do not exist yet, so the Test Plan will remain empty.


As each Test is created, the tester must manually add it to the Regression Test Plan, either right away or at any later moment based on some criteria.

   


In this case, anytime you need to add or remove Tests, you must always do it manually.

Test Plan for the new feature

When the team is working towards a delivery, new features are created and need to be tested. In most cases, it matters to understand whether the new feature is appropriately validated; and if the team thinks the feature should be part of the Regression Tests, they should add it to the Regression Test Plan.

Understanding whether the feature Tests are successful is highly valuable to a team; this can be achieved by creating a Test Plan that holds all Tests covering that new feature.

When the team starts working, the tester often creates several Test Plans, one of which will hold all the feature-related Tests. The Test Plan is created without any Tests, and the tester must manage it as the Tests are being created.

For example, the tester creates a new Test Plan to hold all the Tests covering the Authentication feature.


As the Tests still need to be created, the Test Plan will be left empty.

When Tests are created, the team will need to add those Tests to the Test Plan manually. The same thing happens if they need to remove Tests that no longer make sense for this Test Plan: they must be removed manually.

Examples using Dynamic Test Plans

Xray Enterprise introduces a new feature to dynamically manage the Tests that are part of a Test Plan; this functionality is called Dynamic Test Plans.

In the classic approach, the Tests must already exist before they can be added to the Test Plan, and if you want to remove some from the Test Plan, you also have to do it manually.

Let's look at the same examples as before (Regression Test Plan and Feature Test Plan) and see how you can take advantage of Dynamic Test Plans in these situations.


Dynamic Test Plans is an Xray Enterprise feature



Dynamic Test Plans is a feature of Xray Enterprise. If you do not have Xray Enterprise installed, Dynamic Test Plans are not available in the Test Plan issue, and you cannot configure dynamic Test lists. When installing Xray Enterprise for the first time, please re-enable Xray so that all properties from Xray Enterprise are loaded.


Regression Test Plan

Remember that the Regression Test Plan is usually the Test Plan that needs the most management, to make sure we add and remove the right Tests to prevent possible regressions.

We start exactly as in the classic approach, by creating a new Test Plan.

The difference from the classic behavior is how we define what Tests are part of the Test Plan.

For this example, and to keep things simple (there are many ways to approach this), we assume that all Tests meant for Regression are labeled "Regression". You can use other criteria as long as they are searchable using a Jira filter and JQL.


This allows us to create a JQL Filter to list all Tests that are part of the Regression. Once the filter is tuned to your needs, save it and give it a name.

We can, but do not need to, explicitly filter by the issue type Test, because the Test Plan will only consider Tests when adding issues to it.
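
As an illustration, the saved filter could be as simple as the query below (the project key CALC is just an example; adjust it and the criteria to your own setup):

Sample JQL
  project = CALC and labels = "Regression"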


On the Test Plan issue screen, we can configure it to use a "Saved Filter", effectively making it dynamic.


You can see that the Tests that appear in the filter screenshot above are automatically added to the Test Plan.


Now whenever you add the Label "Regression" to a Test, it will be automatically added to the Test Plan. It will be automatically removed from the Test Plan whenever you remove that label.

Once again, using Labels to filter Tests is just an example; you can use whatever criteria make the most sense to you in the JQL filter, as illustrated below.
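
For instance, if your team treats the Tests created for previously delivered versions as regression candidates, a fix-version-based filter would work equally well (the project key and version names are just examples):

Sample JQL
  project = CALC and fixVersion in ("1.0", "2.0")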


Test Plan for new feature

For this example and to keep things simple, one Story represents a new feature, and we want to have the list of all Tests linked to it.

We have created a JQL filter named "login_tests" that lists all Tests covering the requirement story XT-5, using the "issueTests" syntax.
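
If the "issueTests" syntax is not available in your environment, a ScriptRunner Enhanced Search filter similar to the ones in the tips further below can achieve the same result; this is just a sketch, assuming the default Xray "is tested by" issue link between the Story and its Tests:

Sample JQL
  issueFunction in linkedIssuesOf("issue = XT-5", "is tested by")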

When creating the Dynamic Test Plan to validate the feature, we use the previously created filter, as shown below.

The Tests are automatically added to the Test Plan and are dynamically managed; every time the team adds or removes the link between Tests and the requirement User Stories, the Tests are automatically added to or removed from the Test Plan.


Additional tips for building Dynamic Test Plan queries

In general, we use saved filters and JQL to obtain the Tests we need. Some examples follow.


JQL features in Jira Cloud

JQL in Jira Cloud has some limitations compared with Jira Data Center; Atlassian has been improving JQL support in Cloud, but there are still some gaps, some of which can be overcome with other Jira apps, as shown below. To achieve better results, we advise you to use the ScriptRunner app (you may also be able to implement this using another app with similar capabilities) to create enhanced filters.


Tests assigned to a specific component

Context

  • A tester wants to create a Test Plan with all the tests assigned to some component.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and Component = 'login'

Tests that cover requirements assigned to a specific component

Context

  • A tester wants to create a Test Plan with all the tests covering the requirements of a given component.

How to

To implement this use case, we'll use the Enhanced Search feature of the ScriptRunner app (you may also be able to implement this using another app with similar capabilities).


  1. In Apps > ScriptRunner Enhanced Search, create a filter (e.g., "tests_for_login_component") for the following JQL query. To filter by status, just add the clause "... and status = '...'".

    1. Sample JQL
      issueFunction in linkedIssuesOf("project = CALC and issuetype = 'Story' and Component = 'login'", "is tested by")
  2. Use the previously saved filter in the Test Plan configuration, as a regular Jira filter.

Tests that cover high-priority requirements

Context

  • A tester wants to create a Test Plan with all the tests that cover high-priority requirements.

How to

To implement this use case, we'll use the Enhanced Search feature of the ScriptRunner app (you may also be able to implement this using another app with similar capabilities).


  1. In Apps > ScriptRunner Enhanced Search, create a filter (e.g., "tests_for_stories_high_priority") for the following JQL query. To filter by status, just add the clause "... and status = '...'".

    1. Sample JQL
      issueFunction in linkedIssuesOf("project = CALC and issuetype = 'Story' and Priority = 'High'", "is tested by")
  2. Use the previously saved filter in the Test Plan configuration, as a regular Jira filter.


Tests that cover "risky" requirements

Context

  • A tester wants to create a Test Plan with all the tests that cover requirements with a certain risk level associated with them, set using a specific custom field.


Note that organizations can implement risk management in different ways.

How to

To implement this use case, we'll use the Enhanced Search feature of the ScriptRunner app (you may also be able to implement this using another app with similar capabilities).


  1. In Apps > ScriptRunner Enhanced Search, create a filter (e.g., "tests_for_stories_medium_risk") for the following JQL query.

    1. Sample JQL
      issueFunction in linkedIssuesOf("project = CALC and issuetype = 'Story' and 'RiskLevel' ~ 'L2: Medium'", "is tested by")
  2. Use the previously saved filter in the Test Plan configuration, as a regular Jira filter.

Manual (step-based) tests

Context

  • A tester wants to create a Test Plan with all the step-based test cases to be performed manually.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and testType = Manual

Automated tests

Depending on how teams implement automated tests, filtering them can be slightly different. In this example we assume that teams are using Generic tests as a way to abstract automated tests.

Context

  • A tester wants to create a Test Plan to track all the test automation results.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and testType = Generic

Gherkin/BDD (e.g., "Cucumber") tests

Context

  • A tester wants to create a Test Plan with all the Cucumber test scenarios.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and testType = Cucumber


Tests in a given workflow status

Context

  • A tester wants to create a Test Plan with all tests that have been reviewed and approved.

How to

  1. Use a JQL similar to the following one.
    1. Sample JQL
      project = BOOK and issuetype = Test and status = Approved