Xray implements the testing process with dedicated issue types (Test Runs being the exception, since they are not issues).
This means you can manually log time on those issues using JIRA's built-in time tracking. Any other apps or integrations based on that field will also have it available.
Some reports (e.g. Test Runs Report, Test Executions Report) use the elapsed time from each Test Run. This time is not logged in JIRA automatically.
Info: Only some issue types have time tracking enabled by default: Test Execution and Test Plan. If needed, you can easily add time tracking to the other issue types (Test, Pre-Condition, Test Set) by editing the respective view/default screens and adding the "Time Tracking" field.
But then the following question arises: how do I log the time I spent executing test runs?
The time you spend executing a Test Run is not logged in JIRA automatically; you have to log it manually, normally on the corresponding Test Execution issue.
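Besides the issue screen, work can also be logged programmatically through Jira's REST API add-worklog endpoint. The sketch below only builds the request (it does not send it); the base URL, issue key, and comment are placeholder values, authentication is omitted, and the exact API version path may differ across Jira deployments:

```python
import json
from urllib import request

def build_worklog_request(base_url, issue_key, time_spent, comment):
    """Build a POST request for Jira's add-worklog endpoint.

    time_spent uses Jira's duration notation, e.g. "1h 30m".
    """
    url = f"{base_url}/rest/api/2/issue/{issue_key}/worklog"
    payload = json.dumps({"timeSpent": time_spent, "comment": comment}).encode()
    return request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical values for illustration only.
req = build_worklog_request(
    "https://jira.example.com", "DEMO-123",
    "1h 30m", "Executed Test Run for DEMO-45",
)
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen` plus credentials) would add the worklog to the Test Execution issue, just as logging it manually would.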
Note that Xray is able to automatically calculate the elapsed time of each Test Run, based on its "Started on" and "Finished on" dates. Some reports (e.g. Test Runs Report, Test Executions Report) use this value to calculate the overall elapsed time, or simply display it.
Thus, elapsed time (which is calculated automatically) and actually logged time are two different things. You can choose to log exactly the elapsed time shown, report a different value, or not log anything at all.
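To illustrate the calculation, the elapsed time is simply the difference between the two dates; the sketch below derives it and renders it in Jira's "Xh Ym" work-log notation, which you could then log manually. This is a plain illustration of the arithmetic, not Xray's internal implementation, and the ISO timestamp format is an assumption:

```python
from datetime import datetime

def elapsed_as_jira_duration(started_on: str, finished_on: str) -> str:
    """Compute the elapsed time between two ISO timestamps and render it
    in Jira's "Xh Ym" work-log notation (rounded down to whole minutes)."""
    start = datetime.fromisoformat(started_on)
    finish = datetime.fromisoformat(finished_on)
    total_minutes = int((finish - start).total_seconds() // 60)
    hours, minutes = divmod(total_minutes, 60)
    if hours and minutes:
        return f"{hours}h {minutes}m"
    if hours:
        return f"{hours}h"
    return f"{minutes}m"

print(elapsed_as_jira_duration("2023-05-04T10:00:00", "2023-05-04T11:35:00"))  # → 1h 35m
```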
Possible usage scenarios for logging time
The following table presents some possible usage scenarios for logging time in Xray issue types.
Issue type | Purpose | Notes |
---|---|---|
Test | log time related to specification | don't log execution time here, since Test issues act as test templates |
Pre-Condition | log time related to specification | |
Test Set | log time related to the creation of the Test Set, optionally including the time spent specifying the related Tests and Pre-Conditions | |
(Sub)Test Execution | log time related to the execution of the associated Test Runs | |
Test Plan | log time related to environment setup and planning | don't log execution time twice (both in the Test Plan and in the related Test Executions) |
Possible usage scenarios for using time estimates
The following table presents some possible usage scenarios for using time estimates, through the "Original Estimate" field, in Xray issue types.
Issue type | Purpose | Notes |
---|---|---|
Test | | using "estimated execution time" may be more adequate; however, if you choose that approach, you may have to decide where to report the estimate for the specification (e.g. in a Test Set or in some specification task) |
Pre-Condition | | see notes for the Test issue |
Test Set | estimated time for specifying all the Tests grouped within this Test Set | |
(Sub)Test Execution | estimated time for running all the related Test Runs | |
Test Plan | | what your estimate for the Test Plan covers should match what you log time for |
Recommendations
Whether you use time estimates, log time, or both depends on your use case.
The following are some generic recommendations that may or may not apply to you:
- use the "Original Estimate" field on Test issues to estimate execution duration, so you can later base calculations on it;
- if you're providing testing services, time estimates on Test Sets can help you gauge the effort associated with implementing the Tests covered by that Test Set;
- log execution time in Test Executions right after you finish running a given Test Run, based on its elapsed time. To obtain the elapsed time, you can use the Test Runs report. A more seamless way may be implemented in the near future.