About this report
The program increment burnup report shows the story burnup for the PI, providing a detailed view of:
- All story types
- Stories with the Type parameter set to Defect
- Stories with the Type parameter set to any value other than Defect
Some teams like to write defects as user stories, and others don't; as with nearly everything, the choice depends on context, environment, industry, company size, and many other variables. This report is especially useful for Portfolio Managers and Release Train Engineers, giving insight into the teams' ability to deliver the in-scope stories as planned while monitoring stories that are based on defects.
The burnup chart shows how much work has been completed as well as the total amount of work in scope. Accepted work, in the form of Level of Effort (LOE) points, is on the vertical axis; time, in the form of sprints in the PI, is on the horizontal axis. The chart is a combined chart, with plotting elements of both a line graph and a bar graph. The display can be filtered by PI, release vehicle, and team by selecting the Extra Configs button.
To navigate to this report:
If you’re using the new navigation:
- Select Programs in the top navigation bar and select the program you want to view information about.
- On the sidebar, select Reports in the list of options.
- Select Program increment burnup; the report displays.
If you’re using the old navigation:
- Select the Reports icon from the left Navigation menu.
- Start typing the report's name in the Search box.
- Once found, select the report.
Note: You can also use the categories on the left to find the reports you need.
- Be sure to select a Portfolio, PI, and Program to generate this report.
- A PI must exist in the system and be tied to a program.
- Features must be created and tied to the PI.
- Stories, with point values, must be created and tied to features.
- The Type parameter should be set for all stories.
How are report values calculated?
- Defect Burnup: Cumulative LOE for accepted stories with the Type parameter set to Defect, by sprint.
- Productive Burnup: Cumulative LOE for accepted stories with Type set to any value other than Defect (and NULL), by sprint.
- Scope w/o Defects: Cumulative LOE for all stories with Type set to any value other than Defect (and NULL) for the PI.
- Scope: Cumulative LOE for ALL stories of all Types for the PI.
- Predicted Burnup: Take the total LOE of the PI for ALL stories with Type set to any value other than Defect (and NULL), and divide that total by the number of sprints to get the predicted number of points for each sprint; the value accumulates sprint by sprint.
- Defect Trend: Take the total LOE of the PI for ALL stories of all Types, and divide that total by the number of sprints to get the predicted number of points for each sprint; the value accumulates sprint by sprint.
- LOE points for a story are credited on the day the parent feature is accepted.
- If there are un-estimated child stories, those stories are not reflected in the scope or accepted numbers, as they do not have any points allocated to them.
- If LOE points are added or removed during the PI, the scope line will adjust upwards or downwards accordingly.
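The calculations above can be sketched in code. This is a minimal, illustrative sketch: the `Story` shape, its field names, and the simplification of crediting points by sprint (rather than on the day the parent feature is accepted, as noted above) are assumptions for illustration, not the tool's actual implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Story:
    points: Optional[int]           # LOE points; None = un-estimated (contributes nothing)
    story_type: Optional[str]       # e.g. "Defect", "User Story", or None (NULL Type)
    accepted_sprint: Optional[int]  # 1-based sprint when credited; None = not accepted

def cumulative(per_sprint: List[float]) -> List[float]:
    out, total = [], 0.0
    for v in per_sprint:
        total += v
        out.append(total)
    return out

def burnup_series(stories: List[Story], num_sprints: int) -> dict:
    defect = [0.0] * num_sprints
    productive = [0.0] * num_sprints
    for s in stories:
        if s.points is None or s.accepted_sprint is None:
            continue  # un-estimated or unaccepted stories add no burnup
        i = s.accepted_sprint - 1
        if s.story_type == "Defect":
            defect[i] += s.points
        elif s.story_type is not None:  # exclude NULL Type, per the definitions above
            productive[i] += s.points

    # Scope totals are plotted as flat lines across the PI.
    scope = sum(s.points or 0 for s in stories)  # ALL stories, all Types
    scope_wo_defects = sum(
        s.points or 0 for s in stories
        if s.story_type not in ("Defect", None)
    )
    # Straight-line predictions: total points spread evenly across sprints, accumulated.
    predicted = [scope_wo_defects / num_sprints * (i + 1) for i in range(num_sprints)]
    defect_trend = [scope / num_sprints * (i + 1) for i in range(num_sprints)]

    return {
        "Defect Burnup": cumulative(defect),
        "Productive Burnup": cumulative(productive),
        "Scope": scope,
        "Scope w/o Defects": scope_wo_defects,
        "Predicted Burnup": predicted,
        "Defect Trend": defect_trend,
    }
```

For example, a PI with 22 total points across 3 sprints yields a Defect Trend of roughly 7.3, 14.7, and 22 points at sprints 1 through 3.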
How to interpret this report
The program increment burnup report is best used as an informational tool for assessing the maturity and stability of product development efforts. At a macro level, the report depends on the life-cycle stage of your program or product: products or programs in the early stages of development characteristically have less PI stability and many more defects than long-running programs or products. Therefore, if your product is young, you should expect to see a broad spread between the Productive Burnup and the Defect Trend, with a goal of narrowing that spread. Organizational maturity with Agile practices affects the chart as well: organizations that are new to Agile are likely to observe fluctuation in sprint velocity (illustrated as the black bars in the chart), and when this happens the spread can widen too. Performance evaluation using this report is best achieved by comparing results PI over PI, with the goal of a progressively narrower spread.
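One way to quantify the spread discussed above, for PI-over-PI comparison, is the gap between the Defect Trend and the Productive Burnup at the final sprint. This is purely an illustrative metric, not one the report itself surfaces; the values below are hypothetical.

```python
from typing import List

def final_spread(productive_burnup: List[float], defect_trend: List[float]) -> float:
    """Gap between the two lines at the last sprint; a narrower gap suggests more stability."""
    return defect_trend[-1] - productive_burnup[-1]

# Hypothetical cumulative series for two consecutive PIs (three sprints each):
pi1 = final_spread([10, 22, 30], [14, 28, 42])  # spread of 12 points
pi2 = final_spread([12, 25, 38], [15, 30, 45])  # spread of 7 points: narrowing PI over PI
```

A shrinking value across consecutive PIs is consistent with the maturing, stabilizing trend the report is meant to reveal.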