What you'll learn

  • Define tests using Pytest with Selenium
  • Run the test and push the test report to Xray
  • Validate in Jira that the test results are available

Overview

Pytest is a popular Python testing framework that originated from the PyPy project. It supports various types of tests, including unit, functional, integration, and end-to-end.

Key benefits of pytest include its ease of use, readable test code, and the ability to scale from simple unit tests to complex functional suites.

Prerequisites

For this example we will use Pytest 8.1.1, which requires Python 3.8+ or PyPy3. We will use Python 3.12.2 on Windows, so be on the lookout for the usual differences in command syntax between Windows and macOS, especially around special characters such as quotes.
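
You can quickly confirm the Python and pytest versions in your environment with:

    python --version
    pytest --version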

Next, we will need two pytest plugins.

You can install the necessary dependencies using the requirements.txt file and the command

    pip install -r requirements.txt
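
As a reference, a minimal requirements.txt for the code used later in this tutorial would need at least pytest itself, pytest-asyncio (for the @pytest.mark.asyncio tests), and selenium (for the WebDriver); your exact file may list additional plugins:

requirements.txt
pytest==8.1.1
pytest-asyncio
selenium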

Lastly, we will need access to the demo site that we aim to test.

Define the test suite

Prepare the page object and the test settings

Our test suite will validate the login feature (with valid and invalid credentials) of the demo site. For that we will create a Python file describing the page object that represents the login page (login_page.py):


login_page.py
from selenium.webdriver.common.by import By


class LoginPage:
    # Locators for the relevant elements on the demo site's login page
    USERNAME_FIELD = (By.ID, "username_field")
    PASSWORD_FIELD = (By.ID, "password_field")
    LOGIN_BUTTON = (By.ID, "login_button")

    def __init__(self, driver):
        self.driver = driver

    async def login(self, username, password):
        # Fill in the credentials and submit the login form
        self.driver.find_element(*self.USERNAME_FIELD).send_keys(username)
        self.driver.find_element(*self.PASSWORD_FIELD).send_keys(password)
        self.driver.find_element(*self.LOGIN_BUTTON).click()

    async def get_title(self):
        # The page title tells us whether the login succeeded or failed
        return self.driver.title


Next, for a more robust integration with Xray, we will add custom markers for properties such as requirement linkage. That is done in the conftest.py file, which pytest shares across the whole directory. Please note that this step is optional, but recommended for a more complete data transfer.


conftest.py
import pytest

def pytest_collection_modifyitems(session, config, items):
    # For every collected test, copy the argument of each "requirements" marker
    # into user_properties, so it is emitted as a <property> in the JUnit report
    for item in items:
        for marker in item.iter_markers(name="requirements"):
            requirements = marker.args[0]
            item.user_properties.append(("requirements", requirements))


The "name" attribute - in our example "requirements" - determines which property we will be handling with the marker. Please see the Tips section below for extra information.


To avoid the "unknown marker" warning during execution, we will register our custom marker by creating a configuration file called "pytest.ini" in the root directory and adding the following lines:

pytest.ini
[pytest]
markers =
    requirements: link the test with some existing requirements in Jira by its issue key.


With those 3 files defined, we are ready to create the tests.

Create the tests

These are simple tests that validate the login functionality by accessing the demo site, entering the username and password (one test with valid credentials, another with invalid credentials), clicking the login button, and checking that the resulting page title matches our expectation.


tutorial_test.py
import pytest

from selenium import webdriver

from login_page import LoginPage


ENDPOINT = "https://robotwebdemo.onrender.com/"

@pytest.fixture(params=["edge"], autouse=True)
def initialize_driver(request):
    # Start the Edge browser, expose it to the test class and open the demo site
    driver = webdriver.Edge()
    request.cls.driver = driver
    driver.get(ENDPOINT)
    driver.maximize_window()
    yield
    print("Close Driver")
    # quit() ends the WebDriver session (close() would only close the current window)
    driver.quit()

@pytest.mark.requirements("FIN-121")
class TestLoginClass():
    
    @pytest.mark.asyncio
    async def test_LoginValidCredentials(self):
        app = LoginPage(self.driver)
        await app.login("demo", "mode")
        result = await app.get_title()
        assert result == "Welcome Page"

    @pytest.mark.asyncio
    @pytest.mark.skip(reason="Tutorial")
    async def test_LoginInvalidCredentials(self):
        app = LoginPage(self.driver)
        await app.login("demo", "mode1")
        result = await app.get_title()
        assert result == "Error Page"


A few parts to highlight:

  • You can define some fixtures in the test file rather than in conftest.py. In our example, initialize_driver handles starting the Edge browser (you can add more options there) and navigating to the demo site. Passing "autouse=True" means you do not need to explicitly request the fixture in every test function.


  • Since both tests validate the login feature, we group them in the class "TestLoginClass". Keep in mind pytest's default naming conventions for test files, test functions, and test classes to make sure your tests are collected.


  • Further, as both tests belong to the same requirement, we apply our custom marker from conftest.py - @pytest.mark.requirements("FIN-121") - at the class level for efficiency. We could also apply it to each test individually.


  • Tests that use "await" need to be explicitly marked with @pytest.mark.asyncio (see the pytest.ini variation after this list if you prefer not to mark each test).


  • We will skip the second test with the built-in "skip" marker, just to show the difference in statuses on the Xray side.
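
If you prefer not to mark each async test explicitly, pytest-asyncio also offers an auto mode that can be enabled in pytest.ini (a variation we do not use in the rest of this tutorial):

pytest.ini
[pytest]
asyncio_mode = auto
markers =
    requirements: link the test with some existing requirements in Jira by its issue key.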



With everything in place, we are ready to execute our suite. We will do so via this command:


pytest --junitxml=xrayreport.xml tutorial_test.py


The results are immediately available in the terminal.

You can control the verbosity via optional arguments described in the pytest user guide (see References).
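
For example, to get a detailed per-test listing while still producing the same report:

pytest -v --junitxml=xrayreport.xml tutorial_test.py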



The corresponding JUnit XML report will look like this. Note how our custom marker is converted into a testcase property.

JUnit report
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="pytest" errors="0" failures="0" skipped="1" tests="2" time="27.000" timestamp="2024-03-21T11:06:19.795960" hostname="Default">
    <testcase classname="tutorial_test.TestLoginClass" name="test_LoginValidCredentials[edge]" time="26.955">
      <properties>
        <property name="requirements" value="FIN-121" />
      </properties>
    </testcase>
    <testcase classname="tutorial_test.TestLoginClass" name="test_LoginInvalidCredentials[edge]" time="0.000">
      <properties>
        <property name="requirements" value="FIN-121" />
      </properties>
      <skipped type="pytest.skip" message="Tutorial">tutorial_test.py:33: Tutorial</skipped>
    </testcase>
  </testsuite>
</testsuites>


The JUnit XML report is the key piece for the Xray integration, so we are ready for the next step.



Integrating with Xray via API

It is now a matter of importing those results to your Xray instance, which can be done in several ways. We will focus on submitting automation results to Xray through the REST API endpoint for JUnit in this tutorial and mention other methods in the Tips section.

Authenticate

The first step is to follow the instructions (in our example for v2) to obtain the $token we will be using in the subsequent requests. The request will look like this:

curl -H "Content-Type: application/json" -X POST --data '{ "client_id": "CLIENTID","client_secret": "CLIENTSECRET" }'  https://xray.cloud.getxray.app/api/v2/authenticate

Submit JUnit XML results

Once we have the $token, we will use it in the API request, specifying the target project for the Test Execution and the key of the Test Plan it should be associated with.

curl -H "Content-Type: text/xml" -X POST -H "Authorization: Bearer $token"  --data "@xrayreport.xml" "https://xray.cloud.getxray.app/api/v2/import/execution/junit?projectKey=FIN&testPlanKey=FIN-117"


With this command we create a new Test Execution under the referenced Test Plan, containing two tests whose summaries are based on the test names from pytest.
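
If you prefer to script the upload instead of using curl, a rough Python equivalent of both requests (a sketch using the requests library, with the same placeholder credentials and issue keys) could look like this:

import requests

XRAY_BASE = "https://xray.cloud.getxray.app/api/v2"

# 1. Authenticate: the endpoint returns the token as a JSON string
auth = requests.post(
    f"{XRAY_BASE}/authenticate",
    json={"client_id": "CLIENTID", "client_secret": "CLIENTSECRET"},
)
token = auth.json()

# 2. Submit the JUnit report, creating a Test Execution in project FIN
#    linked to the FIN-117 Test Plan
with open("xrayreport.xml", "rb") as report:
    response = requests.post(
        f"{XRAY_BASE}/import/execution/junit",
        params={"projectKey": "FIN", "testPlanKey": "FIN-117"},
        headers={"Authorization": f"Bearer {token}", "Content-Type": "text/xml"},
        data=report,
    )

print(response.status_code, response.text)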



We can also see our tests on the requirement story we linked with the custom marker.


Tips

  • Other common ways of importing automation results are CI/CD pipelines (e.g. Jenkins with the Xray plugin and Python solutions) and the context menu option in the Test Execution Xray issue. These methods work in a similar manner for pytest or other automation frameworks; you can check out the Playwright example or other existing Xray tutorials.


  • You have several ways of enhancing the integration with custom properties in the report: the arguments in the call to the Xray REST API, custom markers, the record_property fixture (sketched below), or custom plugins (e.g. https://pypi.org/project/pytest-jira-xray/). You can see other JUnit examples in this section and the report structure in the API documentation in References, as some attributes mirror our "requirements" example (<property name="requirements" value="FIN-121"/>) while others need a different syntax (e.g. <property name="test_description">    </property>).
    • Disclaimer: the pytest plugins are created by the community and are not officially supported by the Xray team.
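
For instance, pytest's built-in record_property fixture writes into the same user_properties that end up as <property> entries in the JUnit report, without requiring any conftest.py hook (a minimal sketch, separate from the tutorial's test suite):

def test_quick_example(record_property):
    # Produces <property name="requirements" value="FIN-121"/> for this testcase
    record_property("requirements", "FIN-121")
    assert True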


Example - link a test to an existing Test issue in Jira, by its issue key

  • modify the conftest.py file by adding a second marker called "test_key"
conftest.py
import pytest

def pytest_collection_modifyitems(session, config, items):
    for item in items:
        for marker in item.iter_markers(name="requirements"):
            requirements = marker.args[0]
            item.user_properties.append(("requirements", requirements))
        for marker in item.iter_markers(name="test_key"):
            test_key = marker.args[0]
            item.user_properties.append(("test_key", test_key))
  • add the new marker to pytest.ini
pytest.ini
[pytest]
markers =
    requirements: link the test with some existing requirement in Jira by its issue key.
    test_key: link the test code to an existing Test issue in Jira by its issue key.
  • add the @pytest.mark.test_key("FIN-120") marker to the second test function
tutorial_test.py
...

@pytest.mark.requirements("FIN-121")
class TestLoginClass():
    
    @pytest.mark.asyncio
    async def test_LoginValidCredentials(self):
        app = LoginPage(self.driver)
        await app.login("demo", "mode")
        result = await app.get_title()
        assert result == "Welcome Page"

    @pytest.mark.asyncio
    @pytest.mark.skip(reason="Tutorial")
    @pytest.mark.test_key("FIN-120")
    async def test_LoginInvalidCredentials(self):
        app = LoginPage(self.driver)
        await app.login("demo", "mode1")
        result = await app.get_title()
        assert result == "Error Page"



  • Pytest supports BDD and parameterized tests, which would follow an integration process similar to the one described above (see the short parametrize sketch after these tips).


  • Results from multiple builds can be linked to an existing Test Plan, to facilitate the analysis of test result trends across builds.


  • Results can be associated with a Test Environment, in case you want to analyze coverage and test results by that environment later on. A Test Environment can be a testing stage (e.g. dev, staging, preprod, prod) or an identifier of the device/application used to interact with the system (e.g. browser, mobile OS).
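
As an illustration of the parameterization mentioned above, a data-driven variant of the invalid-login check could be added to tutorial_test.py like this (a sketch; the parameter sets are only examples, and each combination shows up as a separate testcase in the JUnit report, with its id in brackets, much like the [edge] suffix seen earlier):

@pytest.mark.requirements("FIN-121")
class TestLoginParameterized():

    @pytest.mark.asyncio
    @pytest.mark.parametrize("username,password", [("demo", "wrong"), ("demo", "")])
    async def test_LoginInvalidCombinations(self, username, password):
        # Relies on the same autouse initialize_driver fixture for self.driver
        app = LoginPage(self.driver)
        await app.login(username, password)
        result = await app.get_title()
        assert result == "Error Page"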



References
