What you'll learn

  • Define a static analysis with SonarQube
  • Define the default Quality Gate in SonarQube
  • Convert the Quality Gate result into a JUnit XML report and ship it to Xray

Source-code for this tutorial

  • code is available on GitHub

Overview

SonarQube is an open source static code quality and security testing tool that can be integrated with your CI/CD tool or executed locally.

SonarQube helps deliver Clean Code.

Clean Code is code that's easy to read, maintain, understand and change through structure and consistency yet remains robust and secure to withstand performance demands. It lets you get the most value out of your software. - Sonar


Sonar defines Clean Code as code that has the following attributes: consistency, intentionality, adaptability, and responsibility.

The focus is mainly on new code, to ensure it is "clean" and "safe", even though it is possible to set conditions related to the overall code.

To ensure this, SonarQube analyzes the source code against a set of coding rules related to those Clean Code attributes. The applicable coding rules depend on the enabled quality profile and on the source code language being used. An issue is created for each coding rule that is broken; it can be one of: bug, vulnerability, security hotspot, or code smell.

In SonarQube, each project is associated with one Quality Gate, which provides a high-level go/no-go decision on the code changes that were made, based on a code quality policy.

Each Quality Gate is composed of conditions that assess code quality metrics in the context of the project, producing measures. As an example, one of these metrics is "new issues" (i.e., issues found on the new code that break the Clean Code attributes mentioned above).

If all conditions are met, the Quality Gate is "green" (i.e., ok); otherwise, it will be "red" (i.e., failing).


Learn more

To know more about Quality Gates in general, please check the tutorial we have available here.


In this tutorial we'll import the measures, including the number of new issues, code coverage threshold information, and line duplication.

After the SonarQube analysis, a report is produced in the tool; we extract the relevant information using the API and convert it into a JUnit XML-compatible report to be imported into Xray.


Pre-requisites


For this example, we will use SonarQube to assess the code quality and security level of the source code against a Quality Gate.

We will use the Spring Pet Clinic open source code and introduce a security issue in it.

We will need:

  • SonarQube (we use the Docker container of the community edition)
  • the target application source code (Spring Pet Clinic)
  • Python, with the requests and jmespath packages, to convert the results



SonarQube

SonarQube has different execution possibilities; we are using the Docker container of the community edition for this example.

Run SonarQube Container

The first step is to start the SonarQube container; to do so, we followed the instructions to start it locally.

$ docker run --rm \
    -p 9000:9000 \
    -v sonarqube_extensions:/opt/sonarqube/extensions \
    <image_name>


Once started, it is available locally (in our case we started it on port 9000) at: localhost:9000
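
To confirm the instance is up and ready before running an analysis, you can query the system status Web API; a quick optional check, assuming the port mapping above:

curl http://localhost:9000/api/system/status

The response should report "status":"UP" once SonarQube has finished starting.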


Target Application

SonarQube will assess the code quality and security of your source code; for this example, we are using the code from the Spring Pet Clinic GitHub repository.

To clone the code from the Spring Pet Clinic GitHub repository, we used the following command:

git clone https://github.com/spring-projects/spring-petclinic.git


Start SonarQube Analysis

The SonarQube instance has, out of the box, a default Quality Gate with a set of rules based on the Clean as You Code principle. For this example, we have created a local project and we are using the default Quality Gate.

To run the SonarQube analysis we executed the following command in the root directory of the project:

mvn clean verify sonar:sonar \
  -Dsonar.projectKey=PetShop \
  -Dsonar.projectName='PetShop' \
  -Dsonar.host.url=http://localhost:9000 \
  -Dsonar.token=<TOKEN>


The above command executes the SonarQube analysis and ships the results back to your SonarQube instance (defined by the URL and Token).
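
The <TOKEN> value is a SonarQube user token, which can be generated in the SonarQube UI (under your account's security settings). Alternatively, as a sketch using the user_tokens Web API, assuming the default admin credentials and a token name of our choosing (petshop-analysis):

curl -u admin:admin -X POST "http://localhost:9000/api/user_tokens/generate?name=petshop-analysis"

The JSON response contains the generated token value to pass to the Maven command above.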

In the SonarQube interface we can see the results:

The SonarQube analysis of the code was passing, so we introduced a security vulnerability to make it fail. In this case, we added a hard-coded password definition in the code.


Extract results using the SonarQube API

We extracted the analysis results using the SonarQube API and converted them into an Xray-compatible report, in this case a JUnit XML file.
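
Concretely, the Quality Gate result is exposed by the /api/qualitygates/project_status Web API endpoint, which returns the gate status and its conditions for a given project key. It can be checked directly with curl, passing the token as the Basic Auth username with an empty password (the same scheme the script below uses):

curl -u <TOKEN>: "http://localhost:9000/api/qualitygates/project_status?projectKey=PetShop"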

To achieve that, we created a Python script that accesses the SonarQube API to extract the results of the Quality Gate and creates a JUnit XML report with them.

fetchDataSonarQube.py
import requests
import base64
import logging
import datetime
import json
import jmespath
import sys, getopt

def generateJUnitXML(jsonContent):
  junitXML = '<?xml version="1.0" encoding="UTF-8" ?>\n'
  junitXML += '<testsuites>\n'
  junitXML += '  <testsuite name="ProjectMetrics" failures="' + str(len(jmespath.search("projectStatus.conditions[?status=='ERROR']", jsonContent))) + '" timestamp="' + datetime.datetime.now().isoformat() + '">\n'
  
  for condition in jsonContent['projectStatus']['conditions']:
    junitXML += '    <testcase name="' + condition['metricKey'] + '" status="' + condition['status'].lower() + '">\n'


    if condition['status'] == 'ERROR':
      junitXML += '      <failure message="Metric ' + condition['metricKey']+ " is " + condition['status'].lower() + '">\n'
      junitXML += '        Metric Key:'+ condition['metricKey']+ '\n'
      junitXML += '        Comparator:'+ condition['comparator']+'\n'
      junitXML += '        Error Threshold:'+ condition['errorThreshold']+'\n'
      junitXML += '        Actual Value:'+ condition['actualValue']+'\n'
      junitXML += '      </failure>\n'

    junitXML += '      <system-out>' + json.dumps(condition) + '</system-out>\n'
    junitXML += '    </testcase>\n'

  junitXML += '  </testsuite>\n'
  junitXML += '</testsuites>\n'

  return junitXML

def fetchData(authToken, endpoint):
  try:
    # Build a Basic Auth header: the SonarQube token is used as the username, with an empty password
    headers = {
      'Authorization': 'Basic ' + base64.b64encode(bytes(authToken + ':', 'utf-8')).decode('utf-8')
    }
    print('Fetching data from SonarQube')

    response = requests.get('http://'+ endpoint +'/api/qualitygates/project_status?projectKey=PetShop', headers=headers)

    print('Response: ' + response.text)
    jsonContent = response.json()

    # Process jsonContent and generate JUnit XML report
    junitXMLReport = generateJUnitXML(jsonContent)

    # Save the report to a file (closing it when done)
    with open('junit.xml', 'w') as fs:
      fs.write(junitXMLReport)

    print(junitXMLReport)
  except Exception as error:
    logging.error('Error fetching data: %s', error)

def main(argv):
    # Parse the -a (authorization token) and -e (endpoint) command-line options
    authToken = ''
    endpoint = ''
    try:
        opts, args = getopt.getopt(argv, "ha:e:", ["afile=", "efile="])
        for opt, arg in opts:
            if opt == '-h':
                print('fetchDataSonarQube.py -a <AUTHORIZATION_TOKEN> -e <ENDPOINT>')
                sys.exit()
            elif opt in ("-a", "--afile"):
                authToken = arg
            elif opt in ("-e", "--efile"):
                endpoint = arg
    except getopt.GetoptError as err:
        print("An exception occurred:", err)
        sys.exit(2)

    fetchData(authToken, endpoint)

if __name__ == "__main__":
   main(sys.argv[1:])
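
The script only depends on the third-party requests and jmespath packages (the other imports come from the standard library). A possible way to install them and run the script against our local instance, with a placeholder for the token (the endpoint is passed without the http:// prefix, since the script adds it):

pip install requests jmespath
python fetchDataSonarQube.py -a <AUTHORIZATION_TOKEN> -e localhost:9000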


The script produces a JUnit XML report with the information from the Quality Gate defined in SonarQube.

junit.xml
<?xml version="1.0" encoding="UTF-8" ?>
<testsuites>
  <testsuite name="ProjectMetrics" failures="1" timestamp="2024-03-21T09:30:03.174683">
    <testcase name="new_coverage" status="ok">
      <system-out>{"status": "OK", "metricKey": "new_coverage", "comparator": "LT", "errorThreshold": "80", "actualValue": "100.0"}</system-out>
    </testcase>
    <testcase name="new_duplicated_lines_density" status="ok">
      <system-out>{"status": "OK", "metricKey": "new_duplicated_lines_density", "comparator": "GT", "errorThreshold": "3", "actualValue": "0.0"}</system-out>
    </testcase>
    <testcase name="new_violations" status="error">
      <failure message="Metric new_violations is error">
        Metric Key:new_violations
        Comparator:GT
        Error Threshold:0
        Actual Value:2
      </failure>
      <system-out>{"status": "ERROR", "metricKey": "new_violations", "comparator": "GT", "errorThreshold": "0", "actualValue": "2"}</system-out>
    </testcase>
  </testsuite>
</testsuites>





Integrating with Xray

As we saw in the example above, we produce a JUnit XML report with the result of the Quality Gate; it is now a matter of importing those results into your Jira instance. This can be done by simply submitting the automation results to Xray through the REST API, by using one of the available CI/CD plugins (e.g., for Jenkins), or by using the Jira interface to do so.

In this case, we will show you how to import them via the API.


API

Once you have the report file available, you can upload it to Xray through a request to the REST API endpoint. For that, the first step is to follow the instructions in v1 or v2 (depending on your usage) to include the authentication parameters in the following requests.


JUnit results

We will use the API request with some additional parameters that set the project to which the results will be uploaded and, optionally, the Test Plan that will hold the execution results.

In the first version of the API, the authentication used a login and password (not the token that is used in Cloud).

curl -H "Content-Type: multipart/form-data" -u admin:admin -F "file=@junit.xml" http://<YOUR_SERVER>/rest/raven/1.0/import/execution/junit?projectKey=XT

With this command, we create a new Test Execution that holds the results of the Tests that were executed.
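
If you also want the results to land in an existing Test Plan, as mentioned above, the same endpoint accepts a testPlanKey parameter; a sketch, where XT-123 is a placeholder Test Plan issue key:

curl -H "Content-Type: multipart/form-data" -u admin:admin -F "file=@junit.xml" "http://<YOUR_SERVER>/rest/raven/1.0/import/execution/junit?projectKey=XT&testPlanKey=XT-123"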

Once uploaded, the Test Execution will look like the example below.


We can see that a new Test Execution was created with 3 Tests, one for each metric. The summary of each Test is extracted from the report.

To check the details, we click the play icon next to each Test, which expands a menu where we choose "Execution Details"; this takes us to the Test Execution details screen.

In the Test Execution details we have the following relevant information:

  • Summary - Matches the measures defined in the SonarQube Quality Gate.
  • Execution Status - Indicates the status of the Quality Gate condition.
  • Context - Defined in the Python script to reflect the contained information.
  • Output - Information on the measure for that context.


Bringing code quality information into your project gives you an overview of the entire testing process and brings that visibility up front, so the team has all the elements necessary to deliver quality products.



Tips

  • after results are imported in Jira, Tests can be linked to existing requirements/user stories, so you can track the impacts on their coverage.
  • results from multiple builds can be linked to an existing Test Plan, to facilitate the analysis of test result trends across builds.
  • results can be associated with a Test Environment, in case you want to analyze coverage and test results using that environment later on. A Test Environment can be a testing stage (e.g. dev, staging, preprod, prod) or an identifier of the device/application used to interact with the system (e.g. browser, mobile OS); see the sketch after this list.
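
As a sketch of the last tip, the same JUnit import endpoint also accepts a testEnvironments parameter; the value below (staging) is a placeholder:

curl -H "Content-Type: multipart/form-data" -u admin:admin -F "file=@junit.xml" "http://<YOUR_SERVER>/rest/raven/1.0/import/execution/junit?projectKey=XT&testEnvironments=staging"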




References