Reports
The Reports feature of Power Pack allows you to gain deep insights into how the code review process is working for your team.
Choosing a report
All reports are available under the Reports link in the top navigation bar. From there, select the desired report:
- Time to First Feedback
This report shows a cumulative graph of the time elapsed between when a review request is first published and when the first code review is published.
- Time to Close
This report shows a cumulative graph of the time elapsed between when a review request is first published and when it is closed.
- Review Request Statistics
This report shows a table of review requests sorted by author, with various statistics about the issues raised.
- Code Review Statistics
This report shows a table of code reviews sorted by author, with various statistics about the issues raised and how often a “Ship It” is given.
- Code Review Relationships
This report visualizes the reviewer relationships within a team.
Selecting data
All reports require selecting some data to display. This is done by specifying a range of dates and a set of users or groups. The controls for this can be found at the top of the report box:
- Select People
Click this item to set which people you’d like to include in the report. You can either click the “Everyone” item to include everyone on the entire Review Board server, or you can add individual users and groups. In the search box, you can enter user names or the names of review groups.
Once you’ve added people, you can click “Save Selection” to save the search. These favorites can then be used by clicking on the star icon on the left of the “Select People” pop-up.
- Report Period
The report period controls allow you to choose what time period you’d like to show data for. The relative selections (such as “Last week” and “Quarter to date”) will always display relative to the time that the report is accessed. This allows you to bookmark the page and reload it for any reports which need to be run on a periodic basis. You can also select “Custom” and enter a specific date range to view.
Reports will run automatically when the parameters are changed.
Time to First Feedback
The “Time to First Feedback” report shows a cumulative graph of the amount of time elapsed between when a review request is first published and when the first code review for that change is published. This can be used to determine roughly how long users have to wait before they get feedback on a code change.
This report can help you gain insight into how quickly developers are able to iterate on a change. In an ideal case, most review requests should have a first review quite quickly (within a few hours, or at most a day or two). If the bulk of this graph falls above that, you may need to make some changes to your internal processes to prioritize code reviews.
The two dashed boxes show two different metrics of particular interest: the percentage of review requests which get an initial code review in less than one day, and the median time users have to wait (meaning that 50% of review requests get a code review within that time or less).
This example shows a very proactive team. Half of all review requests receive an initial code review within 15 minutes, and nearly all have been reviewed within one day.
Specific data points in this graph can be examined by hovering the mouse over the blue line.
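To make the two dashed-box metrics concrete, here is a minimal sketch of how they could be computed, assuming a hypothetical list of wait times in hours; this is an illustration only, not Power Pack’s implementation:

    from statistics import median

    # Hypothetical sample data: hours between publishing a review request
    # and receiving its first code review.
    wait_hours = [0.2, 0.25, 0.5, 1.5, 3, 6, 20, 30, 72]

    # Percentage of review requests whose first review arrived in under one day.
    within_one_day = 100 * sum(1 for h in wait_hours if h < 24) / len(wait_hours)

    # Median wait: 50% of review requests received a first review within this time.
    median_wait = median(wait_hours)

    print(f'{within_one_day:.0f}% reviewed within one day; '
          f'median wait: {median_wait} hours')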
Time to Close
The “Time to Close” report shows a cumulative graph of the amount of time elapsed between when a review request is first published and when it is marked as closed. This can be used to determine roughly how long the entire code review process takes to approve a code change.
This report can help you gain insight into how long code changes take to complete. The ideal time depends greatly on the particulars of the developers and the code base in question. Complex, mission-critical code should take longer to go through code review than simple or internal projects. Likewise, experienced developers should be able to complete the code review cycle faster than junior developers who are being actively mentored. By comparing different subsets of data with this graph, you can make some determinations about the particular goals for your project.
The two dashed boxes show two different metrics of particular interest: the percentage of review requests which are closed in less than one day, and the median time it takes for a code change to go through the code review process (meaning that 50% of review requests are closed within that time or less).
This example shows a team that is a little slower. Half of all review requests are closed within a couple days, but some (about 10%) take 10 days or more.
Specific data points in this graph can be examined by hovering the mouse over the blue line.
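Each point on a cumulative graph like this one reads as “y% of review requests were closed within x days.” Below is a minimal sketch of how such a curve could be derived, again from hypothetical data rather than Power Pack’s implementation:

    # Hypothetical sample data: days from first publish to close.
    days_to_close = [0.5, 1, 1, 2, 2, 3, 4, 6, 10, 14]

    # Build the cumulative curve: after sorting, the i-th smallest close time
    # means i out of len(days_to_close) review requests closed within that time.
    points = []
    for i, days in enumerate(sorted(days_to_close), start=1):
        points.append((days, 100 * i / len(days_to_close)))

    for days, percent in points:
        print(f'{percent:5.1f}% closed within {days} days')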
Review Request Statistics
The “Review Request Statistics” report shows a table of users along with some aggregate metrics on the review requests that they’ve posted. The different columns are:
- Review Requests
The number of review requests posted by the user in the given time-frame.
- Issues/Review Request
The average number of issues reported against each review request posted by the user.
- % of Issues Dropped
The percentage of issues opened against review requests posted by the user which are closed as “Dropped” instead of “Fixed”. This can help determine how many of the issues opened were valid.
Each column can be sorted in either ascending or descending order by clicking on the column header.
Code Review Statistics
The “Code Review Statistics” report shows a table of users along with some aggregate metrics on the code reviews that they’ve done for other users. The different columns are:
- Code Reviews
The number of code reviews done by the user in the given time-frame.
- Ship It!
The percentage of code reviews done by the user which give a “Ship It!”.
- Issues/Code Review
The average number of issues reported in each code review done by the user.
- % of Issues Dropped
The percentage of issues opened by the user which are closed as “Dropped” instead of “Fixed”. This can help determine how many of the issues opened were valid.
Each column can be sorted in either ascending or descending order by clicking on the column header.
This report can give you some insight into the number and quality of code reviews done by each individual user. For example, if someone does a lot of code reviews, but never opens any issues and always clicks “Ship It!”, they’re probably not looking at the code in sufficient depth.
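To make the columns concrete, here is a minimal sketch of the kind of aggregation behind such a table, assuming one hypothetical record per published code review; the field names are illustrative and are not Power Pack’s data model:

    from dataclasses import dataclass

    @dataclass
    class CodeReview:
        # Hypothetical record of one published code review.
        reviewer: str
        ship_it: bool
        issues_opened: int
        issues_dropped: int

    def code_review_stats(reviews):
        """Build per-reviewer rows like those in the Code Review Statistics table."""
        rows = {}
        for review in reviews:
            row = rows.setdefault(review.reviewer,
                                  {'reviews': 0, 'ship_its': 0,
                                   'issues': 0, 'dropped': 0})
            row['reviews'] += 1
            row['ship_its'] += int(review.ship_it)
            row['issues'] += review.issues_opened
            row['dropped'] += review.issues_dropped

        for reviewer, row in rows.items():
            yield {
                'reviewer': reviewer,
                'code_reviews': row['reviews'],                         # Code Reviews
                'ship_it_pct': 100 * row['ship_its'] / row['reviews'],  # Ship It!
                'issues_per_review': row['issues'] / row['reviews'],    # Issues/Code Review
                'dropped_pct': (100 * row['dropped'] / row['issues']    # % of Issues Dropped
                                if row['issues'] else 0),
            }

The Review Request Statistics table above can be thought of as the same kind of aggregation, grouped by the author of each review request rather than by the reviewer.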
Code Review Relationships
The “Code Review Relationships” report shows a visualization of who performs code reviews for whom as a chord graph. Every user is listed along the outside of the circle, and chords are drawn between pairs of users who have done code reviews for each other. The width of the chord at each end represents how many code reviews were done by that user, and the color of the chord is the color of whichever user has done more code reviews for the other.
To get users’ real names, you can hover your mouse over the username blocks at the edge of the circle. For specific counts of code reviews in each direction, hover your mouse over a chord.
In this example, one user (username “nclements”) performs significantly more code reviews than the rest of the team.
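As a rough illustration of the data behind this visualization, the sketch below tallies hypothetical (reviewer, author) pairs, one per code review; only the username “nclements” is taken from the example above, and the other names are invented:

    from collections import Counter

    # Hypothetical input: one (reviewer, author) pair per published code review.
    reviews = [
        ('nclements', 'alice'), ('nclements', 'bob'), ('nclements', 'alice'),
        ('alice', 'nclements'), ('bob', 'alice'),
    ]

    counts = Counter(reviews)

    # For each unordered pair of users, the chord's width at each end is the
    # number of reviews that user did for the other, and the chord takes the
    # color of whichever user has reviewed more.
    for a, b in sorted({tuple(sorted(pair)) for pair in counts}):
        a_for_b = counts[(a, b)]   # reviews a did of b's changes
        b_for_a = counts[(b, a)]   # reviews b did of a's changes
        dominant = a if a_for_b >= b_for_a else b
        print(f'{a} <-> {b}: {a_for_b} / {b_for_a} (color from {dominant})')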
Saving and Sharing Reports
Once a report is run, the URL in the browser can be bookmarked or shared to return to the same results.
Compatibility
The reports feature requires a fairly modern browser with SVG support (Google Chrome, Mozilla Firefox 30 or newer, or Microsoft Internet Explorer 9 or newer).