Overview
JMeter is an open-source tool used for performance and load testing.
Normally used to measure web site performance, it can also be used in broader contexts.
Its native features provide a reasonable set of samplers and reports; these may be extended using plugins.
By default, JMeter does not provide an SLA/SLO mechanism, although basic SLAs may be implemented using assertions (e.g. a Duration or Response Assertion, or a custom assertion).
JMeter has a GUI but it can also be run from the command line. It can produce JTL (CSV-based) reports or XML-based reports; the latter provide additional information.
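For instance, a command-line run can force the richer XML output format through the jmeter.save.saveservice.output_format property; the file names below are just placeholders.

```bash
# run in non-GUI mode; -J overrides a JMeter property for this run only
jmeter -n -t testplan.jmx -l results.jtl -Jjmeter.save.saveservice.output_format=xml
```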
JMeter concepts
The following table provides an overview of JMeter concepts; if you're already familiar with JMeter, you can probably skip it.
With these concepts in mind, we can reflect on how they map to Xray.
JMeter concept | What it means |
---|---|
Test Plan | a high-level testing scope, consisting of multiple "users"/threads doing multiple actions |
Thread Group | users |
Controller | what drives the actions and flow of tests |
Sampler (controller) | request |
Logic (controller) | a way to group and determine which samplers to run |
Transaction (controller) | one type of logic controller that groups multiple samplers and their samples (i.e. requests) in order to obtain an additional sample based on them |
Sample | the obtained sample (i.e. the "response") |
(sampler) Assertion | performs additional checks on samplers, validating samples against a given criterion and marking them as successful or not |
Listener | test results/samples listener (e.g. for producing reports) |
Mapping of concepts to Xray
JMeter is not a functional testing tool; it's essentially a load testing tool that simulates multiple users (threads) performing several actions as they would in a typical usage scenario.
Mapping these concepts may not be straightforward though.
If we aim to have visibility of the performance testing results, we need to think about the following questions:
- What can we consider the Test?
- How can we assess whether it was successful or not?
- What information is relevant for analysis?
Test
The Test could be the whole JMeter test plan; this is a valid and simple approach, and it depends on how you use the test plan.
A Test could also represent each user/thread on that test plan; however, this would create a large number of meaningless Tests, as they would not clearly identify anything in particular and could not be reused.
Another approach would be to use each sampler as a Test. However, samplers are normally grouped and nested under other controllers. Thus, a better approach would be to represent all controllers (samplers and logic controllers) as Tests.
Test status
Determining whether a test was successful or not, first depends on what you define as being the "Test".
In this tutorial we'll consider each controller as a Test in Xray. Classifying it as failed or not can be done based on the nested assertion results or simply on the sampler's implicit (un)successful classification.
Other relevant performance test results
As part of performance testing, the following metrics are common:
- errors (count, %)
- total elapsed time (e.g. average, min, max, std dev, 90th percentile)
- latency time/TTFB (e.g. average, min, max)
- connect time (e.g. average, min, max)
- requests throughput/requests per time unit (e.g. average)
- received bytes (total, throughput)
- sent bytes (total, throughput)
- requests (count)
Some of these may be considered KPIs and used to define SLAs/SLOs; JMeter, however, does not provide a built-in way to implement SLAs.
SLAs are usually marked as successful/met, warning, or failed/unmet.
Requirements
- JMeter
- JMeter Plugins Manager and some plugins (jmeter-http, jpgc-httpraw, jpgc-graphs-basic, jpgc-graphs-additional, jpgc-synthesis, jpgc-cmd)
- Jenkins (optional)
Description
The overall approach to have visibility of the performance results in Xray will be as follows (see the command-line sketch after this list):
- run JMeter in command line
- generate results in JTL (CSV based) format
- post-process results to
- generate a JUnit XML report, mapping each controller as a Test
- generate dashboard report, containing multiple reports/charts
- produce aggregate report or similar (e.g. synthesis report) in CSV
- produce one or more charts
- submit results to Xray along with the previously generated report assets
- fill out the "Description" field of the corresponding created Test Execution issue with
- link to project/job in Jenkins
- link to dashboard HTML report in Jenkins workspace
- aggregate report content formatted as a table
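A minimal command-line sketch of these steps could look as follows; the file names, the chosen chart, and the converter.sh arguments are assumptions for illustration, and JMeterPluginsCMD comes from the jpgc-cmd plugin listed in the requirements.

```bash
# 1. run the test plan in non-GUI mode, producing a JTL results file and an HTML dashboard report
jmeter -n -t jpetstore.jmx -l results.jtl -e -o dashboard

# 2. post-process the JTL: aggregate report as CSV plus one chart as PNG
JMeterPluginsCMD.sh --generate-csv aggregate.csv --input-jtl results.jtl --plugin-type AggregateReport
JMeterPluginsCMD.sh --generate-png response_times.png --input-jtl results.jtl --plugin-type ResponseTimesOverTime --width 800 --height 600

# 3. convert the JTL results to a JUnit XML report, mapping each controller to a <testcase>
./converter.sh results.jtl jmeter.jpetstore
```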
JPetStore example
In this example, we're load testing a fictitious pet store site named JPetStore (this site is kindly provided by Octoperf for demo purposes).
The testing scenario exercises 20 users, with a ramp-up period of 240s, doing a standard user path/scenario: go to the site, login, view a category, then a product, add to cart, buy it and logout.
There are several transactions, grouping one or more HTTP requests (i.e. using the HTTP Request sampler).
However, there are no explicit assertions; thus, all failures (i.e. samples marked as being unsuccessful) will be based on the standard HTTP response codes.
Tests can be run using the JMeter GUI or using the jmeter command line tool, which is the preferred approach if you wish to make them part of your CI.
We'll use Jenkins as our CI tool and we'll configure a freestyle project for running our tests.
Setup: checking out the JMeter project and setup of auxiliary variables
We need to set up some variables related to the Jira instance so that we can later attach some files to the Test Execution issue, if we want to, using the attach_files_to_issue.sh shell script.
These are somewhat redundant with the Xray instance configuration but are necessary if we wish to expose them.
We start by defining one variable for the Jira server base URL as a build parameter.
Using the Credentials Binding plugin, we will populate two variables for the Jira instance's username and password; these will be, in turn, obtained from the credentials already stored and linked to the Xray instance configuration in Jenkins.
The "code" will be checked out from our source code versioning system (e.g. Git), which contain the JMeter project(s) saved in .jmx format along with some additional scripts.
Configuring the Build steps
The "build" is composed of several steps, starting with the one that runs JMeter.
We need to process the JTL file and produce a report that can be submitted to Xray; we'll use a JUnit XML based report that will be generated using a specific tool.
About JMeter to JUnit XML converters
There are several JMeter JTL to JUnit XML converters out there. However, most of them neither implement a useful mapping of concepts nor provide additional information about the failures.
This tutorial uses a modified version (pre-built JAR) of the jmeter-junit-xml-converter code.
It will produce a JUnit XML report containing:
- one test suite per thread
- multiple <testcase> elements, one per controller
- information about the duration (i.e. the "time" attribute) on each <testcase>
- a failure message, if available
The modified jmeter-junit-xml-converter utility will produce a junit.xml and an alternate_junit.xml file; we want the latter as it better suits our needs. We'll call it using the converter.sh shell script along with a parameter that will allow us to uniquely identify the Tests afterwards (e.g. "jmeter.jpetstore").
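As a sketch, the invocation could look like this; the exact arguments depend on the converter script, and the JTL file name and prefix below are assumptions based on the description above.

```bash
# convert the JTL results into junit.xml and alternate_junit.xml,
# prefixing test names with "jmeter.jpetstore" so the Tests can be uniquely identified in Xray
./converter.sh results.jtl jmeter.jpetstore
```

The resulting alternate_junit.xml would then contain entries roughly along these lines (illustrative excerpt, not actual output):

```xml
<testsuite name="Thread Group 1-1" tests="2" failures="1">
  <testcase classname="jmeter.jpetstore" name="AddToCart" time="0.42"/>
  <testcase classname="jmeter.jpetstore" name="Checkout" time="1.31">
    <failure message="Expected response code 200 but was 500"/>
  </testcase>
</testsuite>
```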
Optionally, we'll add two build steps to store the tabular aggregate report in an environment variable (e.g. AGGREGATE_TABLE) as a string. This requires the Environment Injector plugin.
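A rough sketch of such steps, assuming the aggregate CSV produced earlier and a helper properties file that the Environment Injector plugin then loads (names are illustrative):

```bash
# "Execute shell" step: flatten the aggregate CSV into a single line and expose it as AGGREGATE_TABLE;
# a subsequent "Inject environment variables" step would load env.properties
echo "AGGREGATE_TABLE=$(tr '\n' ' ' < aggregate.csv)" > env.properties
```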
Configuring the Post-build actions
Bonus tip!
The Jenkins Performance plugin can optionally be used to create some trend charts in Jenkins and also as a means to mark the build as failed or unstable, depending on absolute or relative thresholds.
Test results can be submitted to Xray either by using a command line tool (e.g. curl) or by using a specific CI plugin, which in our case will be the "Xray – Test Management for Jira Plugin".
We could choose "JUnit XML" as the format in the "Xray: Results Import Task"; that would be simpler to set up.
However, if we use the "JUnit XML multipart" format, we can further customize the Test Execution issue. We'll use this as a means to provide a link to the Jenkins build along with a link to the dashboard report generated by JMeter. We may also provide the aggregate report table stored previously as an environment variable.
If using this format, you'll need to provide the Test Execution's issue type name (or the id) and the project key.
You may also specify the Test Plan, Revision and Test Environments fields but you'll need to obtain their custom field ID from Jira's administration. Note that these IDs are specific to each Jira instance. In the following example, "customfield_10033" corresponds to the Revision CF, "customfield_11805" to the Test Environments CF and "customfield_11807" to the Test Plan CF.
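For illustration, a sketch of the multipart "info" JSON with such fields might look like the following; the project key, summary, and field values are placeholders, the custom field IDs are the ones from the example above, the exact value format expected by each custom field may vary with your Xray version, and it assumes Jenkins expands the referenced environment variables (e.g. BUILD_URL, AGGREGATE_TABLE) in this content.

```json
{
  "fields": {
    "project": { "key": "PROJ" },
    "summary": "JMeter performance results - JPetStore",
    "issuetype": { "name": "Test Execution" },
    "description": "Jenkins build: ${BUILD_URL}\n\nDashboard report: ${BUILD_URL}ws/dashboard/index.html\n\n${AGGREGATE_TABLE}",
    "customfield_10033": "r123",
    "customfield_11805": ["staging"],
    "customfield_11807": ["PROJ-100"]
  }
}
```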
Bonus tip!
You may also attach some files (e.g. charts, reports) to the created Test Execution issue.
The Jenkins plugin exports the XRAY_TEST_EXECS variable containing the issue key of the Test Execution that was created.
For the time being, the Jenkins plugin can't upload other files; however, we can make a basic shell script (e.g. attach_files_to_issue.sh) for that.
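A minimal sketch of such a script, using Jira's standard attachments REST endpoint; the script arguments and the JIRA_* variable names (matching the ones defined earlier in the job) are assumptions.

```bash
#!/bin/bash
# attach_files_to_issue.sh <issue key> <file> [<file> ...]
# e.g. ./attach_files_to_issue.sh $XRAY_TEST_EXECS dashboard.zip aggregate.csv
ISSUE_KEY=$1
shift
for FILE in "$@"; do
  curl -s -u "$JIRA_USERNAME:$JIRA_PASSWORD" \
       -H "X-Atlassian-Token: no-check" \
       -F "file=@$FILE" \
       "$JIRA_BASE_URL/rest/api/2/issue/$ISSUE_KEY/attachments"
done
```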
After running the Jenkins job, we may track some performance trend charts directly on the project's page. This requires prior configuration of the Performance plugin, as mentioned earlier.
As we submitted the processed test results to Xray (alternate_junit.xml), we can now track them in Jira.
A Test Execution will be created containing a summary of results along with some useful links to access additional information in Jenkins.
Using the link provided in the description field of the Test Execution, we can access an extensive dashboard report generated by JMeter and stored in Jenkins project's workspace.
In order to correctly view it, you may need to change one setting in Jenkins: go to Manage Jenkins > Script console and execute:
System.setProperty("hudson.model.DirectoryBrowserSupport.CSP", "")
Finally, we should be able to correctly display the HTML based dashboard report.
The attachments section on the Test Execution issue provides direct access to some reports and also to a zipped file containing the dashboard report generated by JMeter.
The execution details of a specific Test Run show multiple entries, each one representing a sample.
The following screenshot showcases the details of the sample produced by the Transaction Controller named "AddToCart". We can see that it was executed multiple times, in the context of different "users" (i.e. JMeter's threads).
JPetStore with assertions example
This example (JMeter project file) is similar to the previous one, except that it contains some assertions: one standard Size assertion and a custom BeanShell assertion that looks at the duration and marks the sample as unsuccessful after "maxErrors" failures.
We'll use a set of variables defined at JMeter's test plan-level to assist in the assertion logic.
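As a rough sketch of what such a BeanShell assertion could look like (only "maxErrors" is mentioned above; the maxDuration and errorCount variables, and the overall logic, are assumptions and may differ from the actual example project):

```java
// BeanShell Assertion: flag the sample as failed once the number of slow samples
// exceeds the "maxErrors" threshold defined at test plan level
long duration = SampleResult.getTime();                      // elapsed time of this sample, in ms
long maxDuration = Long.parseLong(vars.get("maxDuration"));  // assumed test plan variable
if (duration > maxDuration) {
    int errors = Integer.parseInt(vars.get("errorCount")) + 1;
    vars.put("errorCount", String.valueOf(errors));
    if (errors > Integer.parseInt(vars.get("maxErrors"))) {
        Failure = true;
        FailureMessage = "Sample took " + duration + " ms (limit: " + maxDuration + " ms), error #" + errors;
    }
}
```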
After results are imported to Xray, we can see each sample result in the Test Run associated with the controller (i.e. HTTP Request sampler).
Room for improvement
- abstract the whole JMeter test plan as a Test
- use Robot Framework XML report instead of JUnit to provide more granular details
- provide the possibility of linking test(s) to an existing requirement in Xray
- implement SLAs on top of results