What you'll learn

  • Define tests using Selenium and TestNG
  • Run the test and push the test report to Xray
  • Validate in Jira that the test results are available

Source-code for this tutorial

Overview

In this tutorial we will focus on how to take advantage of the functionalities delivered by TestNG and Selenium.

TestNG is a testing framework inspired by JUnit and NUnit that adds new capabilities on top of them.

We will use an in-house extension that builds on the capabilities provided by TestNG to create richer reports in Xray.

The features available with the extension are:

  • link a test method to an existing Test issue or use auto-provisioning
  • cover a "requirement" (i.e. an issue in Jira) from a test method
  • specify additional fields for the auto-provisioned Test issues (e.g. summary, description, labels)



Prerequisites


In this tutorial, we will show you how to use an extension to produce a report to be pushed to Xray using Selenium and TestNG in Java.

We will need:

  • Access to a demo site that we aim to test
  • JDK 8 and Maven installed (not required if you opt to use the Docker image instead)
  • The extension included in your project dependencies
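As an illustrative sketch (confirm the exact coordinates and current version in the xray-testng-extensions project), the extension can be declared as a Maven dependency along these lines:

```xml
<!-- xray-testng-extensions; the version below is illustrative, check the project for the latest -->
<dependency>
    <groupId>app.getxray</groupId>
    <artifactId>xray-testng-extensions</artifactId>
    <version>0.4.0</version>
    <scope>test</scope>
</dependency>
```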


Code

The tests we have defined to demonstrate these new features validate the login feature (with valid and invalid credentials) of the demo site. For these, we created a page object, LoginPage, that represents the login page.


LoginPage.java
package com.idera.xray.tutorials;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.PageFactory;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import org.openqa.selenium.By; 

public class LoginPage {

    private WebDriver driver;
    private RepositoryParser repo;
    private WebElement usernameElement;
    private WebElement passwordElement;
    private WebElement submitButtonElement;

    public LoginPage(WebDriver driver) {
        this.driver = driver;
        repo = new RepositoryParser("./src/configs/object.properties");
        PageFactory.initElements(driver, this);
    }

    public LoginPage open()
    {
        driver.navigate().to(repo.getBy("url"));
        return this;
    }

    public void setUsername(String username) {
        usernameElement = driver.findElement(By.id(repo.getBy("username.field.id")));
        usernameElement.sendKeys(username);
    }

    public void setPassword(String password) {
        passwordElement = driver.findElement(By.id(repo.getBy("password.field.id")));
        passwordElement.sendKeys(password);
    }

    public WebElement getSubmitButton(){
        submitButtonElement = driver.findElement(By.id(repo.getBy("login.button.id")));
        return submitButtonElement;
    }

    public LoginResultsPage submit()
    {
       getSubmitButton().submit();
       return new LoginResultsPage(driver);
    }

    public LoginResultsPage login(String username, String password)
    {
        setUsername(username);
        setPassword(password);
        return submit();
    }

    public Boolean contains(String text) {
        return driver.getPageSource().contains(text);
    }

    public String getTitle()
    {
        return driver.getTitle();
    }

    public Boolean isVisible()
    {
        WebDriverWait wait = new WebDriverWait(driver, 30); // timeout in seconds
        return wait.until(ExpectedConditions.elementToBeClickable(getSubmitButton())).isDisplayed();
    }

}


We have another class to represent the Login Results Page, that is a representation of the page we will interact with multiple times in the testing activities.


LoginResultsPage.java
package com.idera.xray.tutorials;

import org.openqa.selenium.WebDriver;

public class LoginResultsPage {
    private WebDriver driver;

    public LoginResultsPage(WebDriver driver) {
        this.driver = driver;
    }

    public Boolean contains(String text) {
        return driver.getPageSource().contains(text);
    }

    public String getTitle()
    {
        return driver.getTitle();
    }
    
}


As we can see in the above code snippet, we use an object repository (RepositoryParser) to add an extra layer of abstraction. With it we can change the locators of the elements, or even the endpoint of the application under test, without recompiling, which lets us run the same tests against several different deployments.

To achieve this, we created an object.properties file that holds the key/value pairs loaded at execution time. There are two common ways to store such a repository, an XML file or a properties file; in our case we have chosen a properties file.

This object repository file holds information that can change but does not require code changes, so a change to it does not trigger a recompilation. Instead of hardcoding these values, we load them at execution time. In our case they are the locators used to find the page elements and the expected messages returned by each operation.


object.properties
url=http://robotwebdemo.herokuapp.com/
password.field.id=password_field
username.field.id=username_field
login.button.id=login_button
expected.login.title=Welcome Page
expected.login.success=Login succeeded
expected.error.title=Error Page
expected.login.failed=Login failed
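The RepositoryParser class itself is not listed in this tutorial. As a rough sketch of the idea (an illustrative implementation on top of java.util.Properties, not the actual class), it could look like:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Illustrative sketch of an object repository backed by a .properties file;
// the real RepositoryParser may differ.
public class RepositoryParser {

    private final Properties properties = new Properties();

    public RepositoryParser(String path) {
        // Load the key/value pairs at execution time, so locators and URLs
        // can change without recompiling the tests.
        try (FileInputStream in = new FileInputStream(path)) {
            properties.load(in);
        } catch (IOException e) {
            throw new RuntimeException("Could not load object repository: " + path, e);
        }
    }

    // Returns the raw value for a key, e.g. "login.button.id" -> "login_button".
    public String getBy(String key) {
        return properties.getProperty(key);
    }
}
```

With something like this in place, repo.getBy("username.field.id") returns the locator string that the page objects then pass to By.id(...).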


In order to demonstrate this functionality we have defined two tests: a valid login test and an invalid login test, as you can see in the file below.


LoginTest.java
package com.idera.xray.tutorials;

import org.testng.Reporter;
import org.testng.Assert;
import org.testng.annotations.Test;
import org.testng.annotations.Listeners;
import org.testng.annotations.AfterTest;
import org.testng.annotations.BeforeTest;
import app.getxray.xray.testng.annotations.Requirement;
import app.getxray.xray.testng.annotations.XrayTest;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

@Listeners({ app.getxray.xray.testng.listeners.XrayListener.class })
public class LoginTest 
{
    WebDriver driver;
    RepositoryParser repo;

    @BeforeTest
    public void setUp() throws Exception {
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--no-sandbox"); // Bypass OS security model, to run in Docker
        options.addArguments("--headless");
        driver = new ChromeDriver(options);
        repo = new RepositoryParser("./src/configs/object.properties");
    }

    @AfterTest
    public void tearDown() throws Exception {
        driver.quit();
        driver = null;
        repo = null;
    }

    @Test
    @XrayTest(key = "XT-563")
    @Requirement(key = "XT-41")
    public void successLogin()
    {
        LoginPage loginPage = new LoginPage(driver).open();
        Assert.assertTrue(loginPage.isVisible());
        LoginResultsPage loginResultsPage = loginPage.login("demo", "mode");
        Assert.assertEquals(loginResultsPage.getTitle(), repo.getBy("expected.login.title"));
        Assert.assertTrue(loginResultsPage.contains(repo.getBy("expected.login.success")));
    }

    @Test
    @XrayTest(key = "XT-564", summary = "invalid login test", description = "login attempt with invalid credentials", labels = "authentication")
    public void invalidLogin()
    {
        LoginPage loginPage = new LoginPage(driver).open();
        Assert.assertTrue(loginPage.isVisible());
        LoginResultsPage loginResultsPage = loginPage.login("demo", "invalid");
        
        Assert.assertEquals(loginResultsPage.getTitle(), repo.getBy("expected.error.title"));
        Assert.assertTrue(loginResultsPage.contains(repo.getBy("expected.login.failed")));
    }
}


Let's look into the above code in more detail. The first highlight is the Listeners annotation at the top of the class:

LoginTest.java
...
@Listeners({ app.getxray.xray.testng.listeners.XrayListener.class })
...


This annotation registers the Xray listener for the annotated test class (or test method); in our case, the listener provided by the extension xray-testng-extensions (as referred to in the Prerequisites section).


Next we are initializing the driver with the following options:

LoginTest.java
...    
	@BeforeTest
    public void setUp() throws Exception {
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--no-sandbox"); // Bypass OS security model, to run in Docker
        options.addArguments("--headless");
        driver = new ChromeDriver(options);
        repo = new RepositoryParser("./src/configs/object.properties");
    }
...


Adding two arguments to the driver options:

  • --no-sandbox, to bypass the OS security model and be able to run in Docker
  • --headless, to execute the browser instance in headless mode

Also notice that these operations run once before any test method in the suite's <test> tag (behaviour given by the @BeforeTest annotation; @BeforeMethod would run them before each test method instead). Here we also initialise the object repository RepositoryParser, which only needs the path of the file to load; this enables us to change the file content without changing the code.

In the test definitions we have added special annotations that trigger extra behaviour when processed by Xray. For this first test we are using XrayTest and Requirement.

LoginTest.java
...    
	@Test
    @XrayTest(key = "XT-563")
    @Requirement(key = "XT-41")
    public void successLogin()
	{
...


This links the test method to the Test XT-563 in Xray and marks it as covering the requirement XT-41 on the Xray side. We will see this information added to the generated report further ahead.

On the second Test, invalidLogin, we have other examples of annotations; this time, within XrayTest, we are adding a specific summary and description:

LoginTest.java
...
    @Test
    @XrayTest(key = "XT-564", summary = "invalid login test", description = "login attempt with invalid credentials", labels = "authentication")
    public void invalidLogin()
    {
        LoginPage loginPage = new LoginPage(driver).open();
        Assert.assertTrue(loginPage.isVisible());
        LoginResultsPage loginResultsPage = loginPage.login("demo", "invalid");
        
        Assert.assertEquals(loginResultsPage.getTitle(), repo.getBy("expected.error.title"));
        Assert.assertTrue(loginResultsPage.contains(repo.getBy("expected.login.failed")));
    }
...

For more information about the features available with this new extension please check xray-testng-extensions.


Execution

To execute the tests use the following command:

mvn test


We also provide the option of executing the code inside a Docker container (note that a local directory should be mounted so that the TestNG XML results are stored locally).

docker build . -t tutorial_java_testng_selenium
docker run --rm -v $(pwd)/reports:/source/reports -t tutorial_java_testng_selenium


Once the execution has ended the results are immediately available in the terminal. 


Report

The execution will also produce a TestNG report that will look like this one:

testng-results.xml
<?xml version="1.0" encoding="UTF-8"?>
<testng-results ignored="0" total="2" passed="2" failed="0" skipped="0">
  <reporter-output>
  </reporter-output>
  <suite started-at="2022-09-22T15:24:51 WEST" name="Xray-TestNG" finished-at="2022-09-22T15:25:01 WEST" duration-ms="9944">
    <groups>
    </groups>
    <test started-at="2022-09-22T15:24:51 WEST" name="xray" finished-at="2022-09-22T15:25:01 WEST" duration-ms="9944">
      <class name="com.idera.xray.tutorials.LoginTest">
        <test-method is-config="true" signature="setUp()[pri:0, instance:com.idera.xray.tutorials.LoginTest@71c8becc]" started-at="2022-09-22T15:24:51 WEST" name="setUp" finished-at="2022-09-22T15:24:54 WEST" duration-ms="2435" status="PASS">
          <reporter-output>
          </reporter-output>
        </test-method> <!-- setUp -->
        <test-method signature="invalidLogin()[pri:0, instance:com.idera.xray.tutorials.LoginTest@71c8becc]" started-at="2022-09-22T15:24:54 WEST" name="invalidLogin" finished-at="2022-09-22T15:25:01 WEST" duration-ms="7119" status="PASS">
          <reporter-output>
          </reporter-output>
          <attributes>
            <attribute name="summary">
              <![CDATA[invalid login test]]>
            </attribute> <!-- summary -->
            <attribute name="test">
              <![CDATA[XT-564]]>
            </attribute> <!-- test -->
            <attribute name="description">
              <![CDATA[login attempt with invalid credentials]]>
            </attribute> <!-- description -->
            <attribute name="labels">
              <![CDATA[authentication]]>
            </attribute> <!-- labels -->
          </attributes>
        </test-method> <!-- invalidLogin -->
        <test-method signature="successLogin()[pri:0, instance:com.idera.xray.tutorials.LoginTest@71c8becc]" started-at="2022-09-22T15:25:01 WEST" name="successLogin" finished-at="2022-09-22T15:25:01 WEST" duration-ms="253" status="PASS">
          <reporter-output>
          </reporter-output>
          <attributes>
            <attribute name="summary">
              <![CDATA[]]>
            </attribute> <!-- summary -->
            <attribute name="test">
              <![CDATA[XT-563]]>
            </attribute> <!-- test -->
            <attribute name="description">
              <![CDATA[]]>
            </attribute> <!-- description -->
            <attribute name="requirement">
              <![CDATA[XT-41]]>
            </attribute> <!-- requirement -->
            <attribute name="labels">
              <![CDATA[]]>
            </attribute> <!-- labels -->
          </attributes>
        </test-method> <!-- successLogin -->
        <test-method is-config="true" signature="tearDown()[pri:0, instance:com.idera.xray.tutorials.LoginTest@71c8becc]" started-at="2022-09-22T15:25:01 WEST" name="tearDown" finished-at="2022-09-22T15:25:01 WEST" duration-ms="112" status="PASS">
          <reporter-output>
          </reporter-output>
        </test-method> <!-- tearDown -->
      </class> <!-- com.idera.xray.tutorials.LoginTest -->
    </test> <!-- xray -->
  </suite> <!-- Xray-TestNG -->
</testng-results>



Notice that in the above report some attributes were added to support the annotations we talked about previously, namely:

testng-results.xml
...
<attributes>
  <attribute name="summary">
      <![CDATA[invalid login test]]>
  </attribute> <!-- summary -->
  <attribute name="test">
      <![CDATA[XT-564]]>
  </attribute> <!-- test -->
  <attribute name="description">
       <![CDATA[login attempt with invalid credentials]]>
  </attribute> <!-- description -->
  <attribute name="labels">
       <![CDATA[authentication]]>
  </attribute>
...


We will not go into details as the names are self-explanatory (as they directly link to the annotations that we described previously).

Although all these attributes are added to the report, Xray Cloud currently only supports linking to a Test or a requirement (through the test and requirement attributes) and defining labels.
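As an aside, since the report is plain XML, these attributes can be inspected with standard tooling. The sketch below (a hypothetical helper, not part of the tutorial's code) extracts the attribute name/value pairs written under a given test-method element:

```java
import java.io.StringReader;
import java.util.LinkedHashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

// Hypothetical helper: collects the <attribute> name/value pairs that the
// Xray TestNG listener writes under each <test-method> element.
public class ReportAttributes {

    public static Map<String, String> attributesOf(String xml, String methodName) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        Map<String, String> result = new LinkedHashMap<>();
        NodeList methods = doc.getElementsByTagName("test-method");
        for (int i = 0; i < methods.getLength(); i++) {
            Element method = (Element) methods.item(i);
            if (!methodName.equals(method.getAttribute("name"))) continue;
            NodeList attrs = method.getElementsByTagName("attribute");
            for (int j = 0; j < attrs.getLength(); j++) {
                Element attr = (Element) attrs.item(j);
                // getTextContent also returns CDATA content; trim report whitespace.
                result.put(attr.getAttribute("name"), attr.getTextContent().trim());
            }
        }
        return result;
    }
}
```</suite></testng-results>";
java.util.Map<String, String> attrs;
try {
    attrs = ReportAttributes.attributesOf(xml, "invalidLogin");
} catch (Exception e) {
    throw new RuntimeException(e);
}
if (!"XT-564".equals(attrs.get("test"))) throw new AssertionError();
if (!"invalid login test".equals(attrs.get("summary"))) throw new AssertionError();
</test>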




Integrating with Xray

As we saw in the above example, we are producing TestNG reports with the test results; it is now a matter of importing those results into your Jira instance. This can be done by submitting the automation results to Xray through the REST API, by using one of the available CI/CD plugins (e.g. for Jenkins), or through the Jira interface.


API

Once you have the report file available you can upload it to Xray through a request to the REST API endpoint for TestNG. The first step is to follow the authentication instructions for v1 or v2 (depending on your usage) to obtain the token that will be used in the subsequent requests.


Authentication

The request made will look like:

curl -H "Content-Type: application/json" -X POST --data '{ "client_id": "CLIENTID","client_secret": "CLIENTSECRET" }'  https://xray.cloud.getxray.app/api/v1/authenticate

The response of this request will return the token to be used in the subsequent requests for authentication purposes.


TestNG XML results

Once you have the token, use it in the API request together with some common fields for the Test Execution, such as the target project, project version, etc.

curl -H "Content-Type: text/xml" -X POST -H "Authorization: Bearer $token" --data @"./target/surefire-reports/testng-results.xml" "https://xray.cloud.getxray.app/api/v2/import/execution/testng?projectKey=XT&testPlanKey=XT-504"

With this command we are injecting the results back to Xray in project XT associated to the TestPlan XT-504.


Jenkins

As you can see below, we are adding a post-build action using the "Xray: Results Import Task" (from the available Xray plugin). Among its options we will focus on two: "TestNG XML" (simpler) and "TestNG XML multipart" (which requires two extra files); both are explained below.


TestNG XML

In this option we define:

  • the Jira instance (where your Xray instance is installed)
  • the format as "TestNG XML"
  • the test results file we want to import
  • the Project key of the project, in Jira, where the results will be imported

Once the step is saved and you execute your pipeline, the test results will be injected into Xray. For more details, check the section Xray imported results.


Jira UI

Create a Test Execution for the tests that you have, or create it from within a Test.

Fill in the necessary fields and press "Create".

Open the Test Execution and import the TestNG report.

Choose the results file and press "Submit".

The Test Execution is now updated with the imported test results.

Details about the import can be found in the next section.



Xray imported results

Given that we are using an in-house extension to add extra details to the execution results, let's take a closer look at what this means on the Xray side, starting with the parameters we added in the request to import the execution results:

  • testPlanKey=XT-504
  • projectKey=XT

With these parameters we are injecting the results back to Xray in the project XT associated to the TestPlan XT-504.

This execution is linked to Tests, so the Tests were automatically added to the Test Plan, as we can see:

Two Tests were added:

  • XT-564 - Unsuccessful login
  • XT-563 - Successful Login


Let's look closer at each Test and the properties we added in the code, starting with the "Successful Login" Test. In code we have:



LoginTest.java
 	@Test
    @XrayTest(key = "XT-563")
    @Requirement(key = "XT-41")
    public void successLogin()
    {
        LoginPage loginPage = new LoginPage(driver).open();
        Assert.assertTrue(loginPage.isVisible());
        LoginResultsPage loginResultsPage = loginPage.login("demo", "mode");
        Assert.assertEquals(loginResultsPage.getTitle(), repo.getBy("expected.login.title"));
        Assert.assertTrue(loginResultsPage.contains(repo.getBy("expected.login.success")));
    }




In this Test we are using two annotations that allow us to set information on the Xray side, namely:

  • @XrayTest(key = "XT-563"), which associates this test method (successLogin()) with the Test in Xray with identifier XT-563
  • @Requirement(key = "XT-41"), which defines the requirement that this Test covers (creating the relation between them)


We can check that the above information is present in Xray by opening the Test XT-563:


We also have a Test Execution associated with the above Test (added when we uploaded the results):


On the second Test we have a different use of the annotations to add extra information, as we can see below:


LoginTest.java
	@Test
    @XrayTest(key = "XT-564", summary = "invalid login test", description = "login attempt with invalid credentials", labels = "authentication")
    public void invalidLogin()
    {
        LoginPage loginPage = new LoginPage(driver).open();
        Assert.assertTrue(loginPage.isVisible());
        LoginResultsPage loginResultsPage = loginPage.login("demo", "invalid");
        
        Assert.assertEquals(loginResultsPage.getTitle(), repo.getBy("expected.error.title"));
        Assert.assertTrue(loginResultsPage.contains(repo.getBy("expected.login.failed")));
    }

In more detail, we have:

  • @XrayTest(key = "XT-564", summary = "invalid login test", description = "login attempt with invalid credentials", labels = "authentication"), which adds a summary, a description, and a label to the Test that will be created when the results are uploaded (if the Test already exists, this information is added to it).

The summary and description passed in the attributes are not yet supported in the Cloud version of Xray; there, the summary is extracted from the name of the method.


In Xray, if we open the Test Plan and look at the details of that Test (by clicking the link over XT-564), we can see that the labels match the ones we sent in the report:

Tips

  • after results are imported, Tests in Jira can be linked to existing requirements/user stories (or, as in this case, via the annotations in the code), so you can track the impact on their coverage
  • results from multiple builds can be linked to an existing Test Plan, to facilitate the analysis of test result trends across builds
  • results can be associated with a Test Environment, in case you want to analyze coverage and test results by environment later on. A Test Environment can be a testing stage (e.g. dev, staging, pre-prod, prod) or an identifier of the device/application used to interact with the system (e.g. browser, mobile OS).




References