The Service level expectation graph displays the average time to integrate code reviews, week by week, for the selected period.
You can view this data by day, week, or month. Note, however, that when the selected period is the last six months or the last year, the data cannot be displayed by day.
Psst! Be careful not to confuse this graph with the one of the same name available in the Process Axis: they do not use the same data (each axis has its own source).
Example of use
For the following graph, the displayed period is the last three months, and the display mode is by week. Overall, we can see that 85% (85th percentile) of code reviews are integrated in three days or less. However, we can also see that some integrations require more time.
A purplish dotted line marks the 85th percentile of the cycle time of the pull requests merged over the selected period. The indicator rounds this value up: an 85th percentile of 2 days and 1 hour is displayed as 3 days.
The indicator is calculated as follows:
- Retrieve the set of merged pull requests for the last three months.
- Evaluate the cycle time of all pull requests.
- Determine the 85th percentile of the resulting set.
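To make these steps concrete, here is a minimal Python sketch of the calculation under stated assumptions: it treats cycle time as the span from when a pull request was opened to when it was merged, uses the nearest-rank percentile method, and rounds up to whole days. The function names and sample data are hypothetical and are not taken from the tool itself.

```python
import math
from datetime import datetime, timedelta

def percentile(values, pct):
    """Return the pct-th percentile using the nearest-rank method
    (other interpolation methods would give slightly different results)."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

def sle_in_days(merged_pull_requests, pct=85):
    """Compute the service level expectation, rounded up to whole days.

    `merged_pull_requests` is assumed to be a list of (opened_at, merged_at)
    datetime pairs for the pull requests merged during the selected period.
    """
    # Step 2: evaluate the cycle time of each merged pull request.
    cycle_times = [merged - opened for opened, merged in merged_pull_requests]
    # Step 3: take the 85th percentile of the resulting set.
    p = percentile(cycle_times, pct)
    # Round up: e.g. an 85th percentile of 2 days 1 hour is shown as 3 days.
    return math.ceil(p / timedelta(days=1))

# Hypothetical data: three pull requests merged during the period.
prs = [
    (datetime(2023, 1, 2, 9), datetime(2023, 1, 4, 10)),  # ~2 days 1 hour
    (datetime(2023, 1, 3, 9), datetime(2023, 1, 4, 9)),   # 1 day
    (datetime(2023, 1, 5, 9), datetime(2023, 1, 23, 9)),  # 18 days (outlier)
]
print(sle_in_days(prs))  # -> 18 with this tiny sample; real data has many PRs
```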
For example, we can see that during one of the weeks, a code review had a cycle time of 18 days. Since the graph is interactive, hovering over an item with your mouse displays more details about that review.
As with the graph of the same name available in the Process Axis, a review with a longer cycle time may reflect a more complex task or a longer-than-average wait for feedback.
Psst! This indicator is handy for spotting outliers, understanding their cause, and fixing them. It also shows how reliable your delivery pipeline is (i.e., whether your pull requests are regularly merged within a similar number of days or whether it is somewhat random).
To learn more about the variation indicator and its calculation, check out this article!