Xray implements the testing process using dedicated Jira issue types (with the exception of Test Runs, which are not issues).

This means you can manually log time on those issues using Jira's out-of-the-box time tracking. If you have other apps or integrations based on this field, the logged time will be available there as well.


Please note

Only some issue types have time tracking enabled by default: Test Execution and Test Plan.

However, you can then easily add Time Tracking features to the other issue types (Test, Precondition, Test Set) by editing the corresponding view/default screens and adding the "Time Tracking" field.


The question then arises: How do I log the time I spent executing Test Runs?

First, Xray automatically calculates the elapsed time using the time recorded by the Timer. For old/legacy Test Runs, from before the Timer existed, the Test Run elapsed time was based on the "Started on" and "Finished on" dates of the Test Run. Some reports (e.g., the Test Runs Report) show this value.
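For legacy Test Runs, the elapsed time is simply the difference between those two dates; a minimal sketch (the timestamps below are made up for illustration):

```python
from datetime import datetime

# Hypothetical "Started on" / "Finished on" dates of a legacy Test Run
started_on = datetime(2023, 5, 2, 9, 15)
finished_on = datetime(2023, 5, 2, 10, 45)

# Elapsed time is the plain difference between the two timestamps
elapsed_minutes = int((finished_on - started_on).total_seconds() // 60)
print(elapsed_minutes)  # → 90
```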

Concerning the logged time: for time tracking purposes, the time you spend executing a Test Run is not logged automatically in Jira. You can log it manually on the Test Execution issue, though, either from the Test Run execution screen (using the logging shortcut or the Timer) or from the corresponding Test Execution issue using Jira's standard time tracking mechanism. Either way, if logged, this time is always recorded on the Test Execution issue and is available in the related "Time Spent" custom field. Logging time is not mandatory; use it if you need it.
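Since the time is logged through Jira's standard mechanism, it can also be added programmatically via Jira's worklog REST endpoint (POST /rest/api/2/issue/{issueKey}/worklog). The sketch below separates payload building from the actual call; the base URL, credentials, and issue key are placeholders, not values from this document:

```python
import json
from urllib import request

JIRA_URL = "https://jira.example.com"  # hypothetical base URL
AUTH_HEADER = "Basic ..."              # placeholder credentials

def build_worklog(time_spent, comment=""):
    # Payload shape expected by Jira's worklog endpoint
    return {"timeSpent": time_spent, "comment": comment}

def log_work(issue_key, payload):
    # POST the worklog onto the Test Execution issue
    req = request.Request(
        f"{JIRA_URL}/rest/api/2/issue/{issue_key}/worklog",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": AUTH_HEADER},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

payload = build_worklog("45m", "Executed the regression Test Run")
print(payload["timeSpent"])  # → 45m
```

Calling `log_work("DEMO-123", payload)` would then record 45 minutes on that (hypothetical) Test Execution issue.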


In sum, the elapsed time (calculated automatically from the Timer) and the actual logged time are two different things. You can choose to log exactly the displayed elapsed time, report a different value, or not log anything at all.


Possible usage scenarios for logging time

Below are possible usage scenarios for logging time in Xray issue types.


  • Test: log time related to the specification. Note: don't log execution time here, since Test issues act as test templates.
  • Precondition: log time related to the specification.
  • Test Set: log time related to the creation of the Test Set, possibly including the time spent specifying the related Tests and Preconditions.
  • (Sub)Test Execution: log time related to the execution of the related Test Runs.
  • Test Plan: log time related to environment setup and planning. Note: don't log execution time twice (i.e., both on the Test Plan and on the related Test Executions).

Possible usage scenarios for using time estimates

Below are possible usage scenarios for using time estimates, through the "Original Estimate" field, in Xray issue types.


  • Test: either the estimated execution duration for this Test, or the estimated specification time for this Test. Note: estimating the execution duration is usually more adequate; if you choose that approach, you have to decide where to report the specification estimate instead (e.g., on a Test Set or on a dedicated specification task).
  • Precondition: either the estimated execution duration or the estimated specification time for this Precondition. Note: see the notes for the Test issue type.
  • Test Set: estimated time for specifying all the Tests grouped within this Test Set.
  • (Sub)Test Execution: estimated time for running all the related Test Runs.
  • Test Plan: either the estimated time related only to environment setup and planning, or the estimated time for all the effort associated with validating the Tests within the plan, including environment setup and multiple iterations. Note: the scope of the estimate should match the scope of the time you log on the Test Plan.
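Because these estimates live in Jira's standard "Original Estimate" field, they can also be set programmatically through Jira's edit-issue endpoint using the `timetracking` field (assuming time tracking is enabled on the screen of the target issue type). The URL and credentials below are placeholders:

```python
import json
from urllib import request

JIRA_URL = "https://jira.example.com"  # hypothetical base URL
AUTH_HEADER = "Basic ..."              # placeholder credentials

def build_estimate_update(original_estimate):
    # Payload shape for PUT /rest/api/2/issue/{issueKey}
    return {"fields": {"timetracking": {"originalEstimate": original_estimate}}}

def set_original_estimate(issue_key, estimate):
    req = request.Request(
        f"{JIRA_URL}/rest/api/2/issue/{issue_key}",
        data=json.dumps(build_estimate_update(estimate)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": AUTH_HEADER},
        method="PUT",
    )
    request.urlopen(req)  # Jira answers 204 No Content on success

payload = build_estimate_update("2h 30m")
print(payload["fields"]["timetracking"]["originalEstimate"])  # → 2h 30m
```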

Recommendations

Whether to use time estimates, log time, or do both depends on your use case. You can choose to use both or just one of them.

General recommendations:

  • Use the "Original Estimate" field on Test issues to estimate the execution duration, so that you can make calculations based on it.
  • If you provide testing services, time estimates on Test Sets can be useful to gauge the overall effort of the Tests you aim to implement for that Test Set.
  • The Timer on the execution screen records the elapsed time; you may pause it whenever you need to. This elapsed time is not logged by default and may differ from the logged time, as mentioned above.
  • Log execution time on Test Executions right after you finish a given Test Run, using the log time shortcut on the Test Run execution screen; it picks up the elapsed time recorded by the Timer by default, but you may also log a custom value.
  • On the Test Executions List report, or when listing Test Execution issues elsewhere, use the "Time Spent" custom field to show the time logged on each Test Execution.
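Outside of Xray's reports, such a listing can be fetched with a JQL search against Jira's search REST endpoint, requesting the "Time Spent" field. The custom field id varies per Jira instance, so the one below is an assumption, as is the base URL:

```python
from urllib import parse

JIRA_URL = "https://jira.example.com"   # hypothetical base URL
TIME_SPENT_FIELD = "customfield_10200"  # assumed id of the "Time Spent" custom field

def build_search_url(project_key):
    # GET /rest/api/2/search with a JQL filter on Test Execution issues,
    # returning only the "Time Spent" field for each issue
    params = {
        "jql": f'project = {project_key} AND issuetype = "Test Execution"',
        "fields": TIME_SPENT_FIELD,
    }
    return f"{JIRA_URL}/rest/api/2/search?{parse.urlencode(params)}"

print(build_search_url("CALC"))
```

Fetching that URL with valid credentials would return the matching Test Executions along with their logged time.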