Xray uses different issue types for implementing the testing process (with the exception of Test Runs, which are not issues).

This means that you can manually log time on those issues using Jira out of the box. If you have other apps or integrations based on this field, the logged time will be available to them as well.


Please note

Only some issue types have time tracking enabled by default: Test Execution, Test Plan.

However, you can easily add time tracking to the other issue types (Test, Pre-Condition, Test Set) by editing the respective view/default screens and adding the "Time Tracking" field.



The question then arises: How do I log the time I spent executing some Test Runs?

...

Concerning time tracking, the time you spend executing a Test Run is not logged automatically in Jira. You can log it manually on the Test Execution issue, though: either from the Test Run execution screen, using a logging shortcut or the Timer, or from the corresponding Test Execution issue, using Jira's standard time tracking mechanism. Remember that, if logged, this time is always logged on the Test Execution issue and is available in the related "Time Spent" custom field. Logging time is not mandatory; use it if you need it.
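As an illustration, logging time through Jira's standard mechanism ultimately comes down to a worklog entry on the Test Execution issue. The sketch below builds the kind of payload Jira's standard worklog endpoint (`POST /rest/api/2/issue/{issueKey}/worklog`) accepts and formats a duration the way Jira's time-tracking fields display it; the issue key and comment are placeholders, not values from this document.

```python
# Sketch: preparing a manual worklog for a Test Execution issue.
# The endpoint named above is Jira's standard worklog API; sending the
# request (credentials, base URL) is left out and would be done with any
# HTTP client.

def to_jira_duration(seconds: int) -> str:
    """Format seconds as a Jira time-tracking string, e.g. '1h 30m'."""
    hours, rem = divmod(seconds, 3600)
    minutes = rem // 60
    parts = []
    if hours:
        parts.append(f"{hours}h")
    if minutes or not parts:
        parts.append(f"{minutes}m")
    return " ".join(parts)

def build_worklog_payload(seconds: int, comment: str = "") -> dict:
    """Request body for POST /rest/api/2/issue/{issueKey}/worklog."""
    return {"timeSpentSeconds": seconds, "comment": comment}

payload = build_worklog_payload(5400, "Executed regression Test Run")
print(to_jira_duration(5400))            # 1h 30m
print(payload["timeSpentSeconds"])       # 5400
```

The "Time Spent" custom field mentioned above reflects the total of such worklogs on the Test Execution issue.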

...

In sum, the elapsed time (which is calculated automatically from the Timer) and the actual logged time are two different things. You can choose to log exactly the displayed elapsed time, report another value, or log nothing at all.

Note that Xray can automatically calculate the elapsed time of each Test Run based on its "Started on" and "Finished on" dates. Some reports (e.g., the Test Runs Report and the Test Executions Report) use this value to calculate the overall elapsed time, or simply display it.
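The calculation described above can be sketched as a simple timestamp difference. The date format below is an assumption for illustration (Jira's display format is configurable); adapt it to your instance.

```python
from datetime import datetime

# Assumed display format, e.g. "06/Sep/21 10:30 AM"; Jira instances can
# be configured differently.
FMT = "%d/%b/%y %I:%M %p"

def elapsed_seconds(started_on: str, finished_on: str) -> int:
    """Elapsed time of a Test Run from its 'Started on'/'Finished on' dates."""
    start = datetime.strptime(started_on, FMT)
    finish = datetime.strptime(finished_on, FMT)
    return int((finish - start).total_seconds())

print(elapsed_seconds("06/Sep/21 10:00 AM", "06/Sep/21 10:45 AM"))  # 2700
```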




Possible usage scenarios for logging time

...

  • Use the "Original Estimate" field on Test issues to estimate the execution duration, so that you can make calculations based on it.
  • If you're providing testing services, time estimates on Test Sets may be useful to get an idea of the effort associated with the Tests you aim to implement for that Test Set.
  • The Timer on the execution screen records the elapsed time; you may pause it whenever you need to. This elapsed time is not logged by default and may differ from the logged time, as mentioned above.
  • Log execution time on Test Executions right after you finish a given Test Run, using the log time shortcut on the Test Run execution screen; by default, it picks up the elapsed time recorded by the Timer, but you may also log a custom value.
  • On the Test Executions List report, or when listing Test Execution issues elsewhere, use the "Time Spent" custom field to show the time logged on each Test Execution.