Info
titleWhat you'll learn
  • Define a static analysis with SonarQube
  • Define the default Quality Gate in SonarQube
  • Convert the Quality Gate result into a JUnit report and ship it to Xray
Note
iconfalse
titleSource-code for this tutorial
typeInfo
  • Code is available in GitHub

Overview

SonarQube is an open source static code quality and security testing tool that can be integrated with your CI/CD tool or executed locally.

SonarQube has the concept of Quality Gates available out of the box, which allows the assessment of security rules in your source code in a systematic way. To learn more about Quality Gates, check the tutorial we have available here.

After the SonarQube analysis, a report is produced in the tool; we extract it using the API and convert the output into a JUnit-compatible report to be imported into Xray.



Pre-requisites


Expand

For this example, we will use SonarQube to assess the code quality and security level of the source code against a Quality Gate.

We will use the Spring Pet Clinic open-source code and introduce a security issue in it.

We will need:

  • A running SonarQube instance (in this example, the Community Edition Docker container)
  • The Spring Pet Clinic source code, plus Maven to build it and run the analysis
  • Python 3 (with the requests and jmespath packages) to run the extraction script
  • A Jira instance with Xray to import the results


SonarQube

SonarQube can be executed in different ways; for this example, we are using the Docker container of the Community Edition.

Run SonarQube Container

The first step is to start the SonarQube container; to do so, we followed the instructions to start it locally.

Code Block
languagebash
$ docker run --rm \
    -p 9000:9000 \
    -v sonarqube_extensions:/opt/sonarqube/extensions \
    <image_name>


Once started, it is available locally (in our case, on port 9000) by accessing: http://localhost:9000
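
Before running an analysis, you can confirm the instance is up with a quick check against the system status endpoint (a sketch, assuming the default port 9000):

Code Block
languagebash
curl http://localhost:9000/api/system/status
# expect a JSON payload whose "status" field is "UP" once startup finishes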


Target Application

SonarQube will assess the code quality and security of your source code; for this example, we are using the code from the Spring Pet Clinic GitHub repository.

To get the code from the Spring Pet Clinic GitHub repository, we used the following command:

Code Block
languagebash
git clone https://github.com/spring-projects/spring-petclinic.git


Start SonarQube Analysis

The SonarQube instance has, out of the box, a Quality Gate defined with its default rules. For this example, we created a local project and are using the default Quality Gate.

To run the SonarQube analysis, we executed the following command in the root directory of the project:

Code Block
languagebash
mvn clean verify sonar:sonar \
  -Dsonar.projectKey=PetShop \
  -Dsonar.projectName='PetShop' \
  -Dsonar.host.url=http://localhost:9000 \
  -Dsonar.token=<TOKEN>


The above command executes the SonarQube analysis and ships the results back to your SonarQube instance (defined by the URL and Token).

In the SonarQube interface we can see the results:

Info
iconfalse

The SonarQube analysis of the code was returning success, so we inserted a security issue to make it fail. In this case, we added the definition of a hardcoded password in the code.


Extract Result Using SonarQube API

We extracted those results through the SonarQube API and converted them into an Xray-compatible report, in this case, JUnit.

To achieve that, we created a Python script that accesses the SonarQube API, extracts the results of the Quality Gate, and creates a JUnit report with those results.

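Under the hood, this is a single GET request to the quality gate endpoint; a minimal sketch with curl, assuming the local instance and the PetShop project key used above:

Code Block
languagebash
# Basic auth: the SonarQube token is the username and the password is empty
curl -u <TOKEN>: \
  "http://localhost:9000/api/qualitygates/project_status?projectKey=PetShop"

The script below wraps this call and converts the JSON response into JUnit XML.
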
Code Block
languagepy
themeDJango
titlefetchDataSonarQube.py
collapsetrue
import requests
import base64
import logging
import datetime
import json
import jmespath
import sys, getopt

def generateJUnitXML(jsonContent):
  # Build a JUnit XML document with one testcase per Quality Gate condition;
  # conditions with status ERROR are reported as failures
  junitXML = '<?xml version="1.0" encoding="UTF-8" ?>\n'
  junitXML += '<testsuites>\n'
  junitXML += '  <testsuite name="ProjectMetrics" failures="' + str(len(jmespath.search("projectStatus.conditions[?status=='ERROR']", jsonContent))) + '" timestamp="' + datetime.datetime.now().isoformat() + '">\n'
  
  for condition in jsonContent['projectStatus']['conditions']:
    junitXML += '    <testcase name="' + condition['metricKey'] + '" status="' + condition['status'].lower() + '">\n'


    if condition['status'] == 'ERROR':
      junitXML += '      <failure message="Metric ' + condition['metricKey']+ " is " + condition['status'].lower() + '">\n'
      junitXML += '        Metric Key:'+ condition['metricKey']+ '\n'
      junitXML += '        Comparator:'+ condition['comparator']+'\n'
      junitXML += '        Error Threshold:'+ condition['errorThreshold']+'\n'
      junitXML += '        Actual Value:'+ condition['actualValue']+'\n'
      junitXML += '      </failure>\n'

    junitXML += '      <system-out>' + json.dumps(condition) + '</system-out>\n'
    junitXML += '    </testcase>\n'

  junitXML += '  </testsuite>\n'
  junitXML += '</testsuites>\n'

  return junitXML

def fetchData(authToken, endpoint):
  # Query the SonarQube Quality Gate status for the project and produce a JUnit report
  try:
    # SonarQube uses Basic auth with the token as the username and an empty password
    headers = {
      'Authorization': 'Basic ' + base64.b64encode(bytes(authToken + ':', 'utf-8')).decode('utf-8')
    }
    print('Fetching data from SonarQube')

    response = requests.get('http://'+ endpoint +'/api/qualitygates/project_status?projectKey=PetShop', headers=headers)

    print('Response: ' + response.text)
    jsonContent = response.json()

    # Process jsonContent and generate JUnit XML report
    junitXMLReport = generateJUnitXML(jsonContent)

    # Save the report to a file
    with open('junit.xml', 'w') as fs:
      fs.write(junitXMLReport)

    print(junitXMLReport)
  except Exception as error:
    logging.error('Error fetching data: %s', error)

def main(argv):
    try:
        opts, args = getopt.getopt(argv,"ha:e:",["afile=","efile="])
        for opt, arg in opts:
            if opt == '-h':
                print ('fetchDataSonarQube.py -a <AUTHORIZATION_TOKEN> -e <ENDPOINT>')
                sys.exit()
            elif opt in ("-a", "--afile"):
                authToken = arg
            elif opt in ("-e", "--efile"):
                endpoint = arg
    except Exception as err:
        print ("An exception occurred:", err)

    fetchData(authToken, endpoint)

if __name__ == "__main__":
   main(sys.argv[1:])

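A usage sketch for the script; the -a and -e flags come from the getopt definition above, and the endpoint is passed without the http:// prefix because the script prepends it:

Code Block
languagebash
# -a: SonarQube user token, -e: SonarQube host and port
python fetchDataSonarQube.py -a <TOKEN> -e localhost:9000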

The script produces a JUnit report with the information of the quality gate defined in SonarQube.

Code Block
languagexml
themeDJango
titlejunit.xml
<?xml version="1.0" encoding="UTF-8" ?>
<testsuites>
  <testsuite name="ProjectMetrics" failures="1" timestamp="2024-03-21T09:30:03.174683">
    <testcase name="new_coverage" status="ok">
      <system-out>{"status": "OK", "metricKey": "new_coverage", "comparator": "LT", "errorThreshold": "80", "actualValue": "100.0"}</system-out>
    </testcase>
    <testcase name="new_duplicated_lines_density" status="ok">
      <system-out>{"status": "OK", "metricKey": "new_duplicated_lines_density", "comparator": "GT", "errorThreshold": "3", "actualValue": "0.0"}</system-out>
    </testcase>
    <testcase name="new_violations" status="error">
      <failure message="Metric new_violations is error">
        Metric Key:new_violations
        Comparator:GT
        Error Threshold:0
        Actual Value:2
      </failure>
      <system-out>{"status": "ERROR", "metricKey": "new_violations", "comparator": "GT", "errorThreshold": "0", "actualValue": "2"}</system-out>
    </testcase>
  </testsuite>
</testsuites>

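Optionally, you can sanity-check that the generated file is well-formed XML before importing it (a quick check, assuming xmllint is available):

Code Block
languagebash
xmllint --noout junit.xml && echo "junit.xml is well-formed"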




Integrating with Xray

As we saw in the above example, we are producing a JUnit report with the result of the Quality Gate; it is now a matter of importing those results into your Jira instance. This can be done by simply submitting the automation results to Xray through the REST API, by using one of the available CI/CD plugins (e.g., for Jenkins), or through the Jira interface.

In this case, we will show how to import them via the API.


UI Tab
titleAPI

API

Once you have the report file available, you can upload it to Xray through a request to the REST API endpoint. For that, the first step is to follow the instructions in v1 or v2 (depending on your usage) to obtain the token we will use to authenticate the following requests.

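As a sketch, for Xray cloud the v2 token is obtained by POSTing your API key pair to the authenticate endpoint; the response is the token itself, which we store for the next request (<CLIENT_ID> and <CLIENT_SECRET> are the values of your API key):

Code Block
languagebash
# the endpoint returns the token as a quoted JSON string, so we strip the quotes
token=$(curl -s -H "Content-Type: application/json" -X POST \
  --data '{ "client_id": "<CLIENT_ID>", "client_secret": "<CLIENT_SECRET>" }' \
  https://xray.cloud.getxray.app/api/v2/authenticate | tr -d '"')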

JUnit results

Once you have the token, we will use it in the API request, with the addition of some parameters that set the Project to which the results will be uploaded and, optionally, the Test Plan that will hold the Execution results.

In the first version of the API (used by Xray server/DC), the authentication used a login and password, not the token that is used in Xray cloud; a sketch of that variant is shown after the next code block.

Code Block
languagebash
themeDJango
curl -H "Content-Type: text/xml" -X POST -H "Authorization: Bearer $token"  --data @"Junitmultipart/form-data" -u admin:admin -F "file=@Junit.xml" httpshttp://xray.cloud.getxray.app/api/v2/<YOUR_SERVER>/rest/raven/1.0/import/execution/junit?projectKey=XT

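For reference, a sketch of the equivalent request against a server/DC instance, reconstructed from the v1 endpoint mentioned above (the host and the admin credentials are placeholders):

Code Block
languagebash
# server/DC: basic auth and a multipart upload instead of a Bearer token
curl -H "Content-Type: multipart/form-data" -u admin:admin \
  -F "file=@junit.xml" \
  "http://<YOUR_SERVER>/rest/raven/1.0/import/execution/junit?projectKey=XT"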
Either request creates a new Test Execution that will hold the results of the Tests that were executed.

Once uploaded, the Test Execution will look like the example below.

(screenshot: the Test Execution created from the imported JUnit report)


We can see that a new Test Execution was created with 3 Tests, one for each metric. The summary of each Test is extracted from the report.

To check the details, we click the icon next to each Test, which expands a menu where we choose "Execution Details"; this takes us to the Test Execution details screen.

(screenshots: the Test Execution details screen)

In the Test Execution details we have the following relevant information:

  • Summary - Matches the measures defined in the SonarQube Quality Gate.
  • Execution Status - Indicates the overall status of the execution of the security tests.
  • Context - Defined in the Python script to reflect the information contained in the report.
  • Output - Information on the measure for that context.


Bringing the results of security tests into your project gives you an overview of the entire testing process and brings that visibility up front, so the team has all the elements necessary to deliver quality products.



Tips

  • After results are imported in Jira, Tests can be linked to existing requirements/user stories, so you can track the impact on their coverage.
  • Results from multiple builds can be linked to an existing Test Plan, to facilitate the analysis of test result trends across builds (see the sketch after this list).
  • Results can be associated with a Test Environment, in case you want to analyze coverage and test results for that environment later on. A Test Environment can be a testing stage (e.g. dev, staging, preprod, prod) or an identifier of the device/application used to interact with the system (e.g. browser, mobile OS).
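
As a sketch of the last two tips, linking the imported results to a Test Plan and a Test Environment is a matter of adding optional query parameters to the same import request (XT-10 and staging are illustrative values):

Code Block
languagebash
# testPlanKey links the Test Execution to a Test Plan; testEnvironments tags it with an environment
curl -H "Content-Type: text/xml" -X POST -H "Authorization: Bearer $token" \
  --data @"junit.xml" \
  "https://xray.cloud.getxray.app/api/v2/import/execution/junit?projectKey=XT&testPlanKey=XT-10&testEnvironments=staging"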



