New Areas
There are two issues, internal quality audits and measurement and analysis, which we weren't dealing with but which must be covered to reach CMMi Level 2. The first is integrated into the quality assurance area; the second is a new area altogether.
Adopting a formal QA process had a real impact: we had been focused on testing, but QA activities as such were not considered, and we weren't holding internal interviews to check whether things were getting done.
QA asked us to perform regular checks on our practices, so we now run checklists at the end of each sprint to make sure we follow our own rules: repositories are where they should be, backups are scheduled, tasks are correctly linked to their corresponding requisites, and solved tasks have their resolution fields correctly filled in, an associated developer, worked hours, and so on. We also created a plan specifying when these checks have to be performed.
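To give an idea of what these end-of-sprint checks look like, here is a minimal sketch in Python. It assumes solved tasks have been exported from the tracker as simple records; the field names (status, resolution, developer, worked_hours, requisite_id) are illustrative only and are not the actual Defect Control schema.

# End-of-sprint audit sketch. Field names below are illustrative,
# not the real Defect Control schema.

def audit_solved_tasks(tasks):
    """Return (task_id, problem) pairs for solved tasks that break our own rules."""
    findings = []
    for task in tasks:
        if task.get("status") != "solved":
            continue
        if not task.get("resolution"):
            findings.append((task["id"], "missing resolution field"))
        if not task.get("developer"):
            findings.append((task["id"], "no developer assigned"))
        if not task.get("worked_hours"):
            findings.append((task["id"], "no worked hours recorded"))
        if not task.get("requisite_id"):
            findings.append((task["id"], "not linked to a requisite"))
    return findings

if __name__ == "__main__":
    sprint_tasks = [
        {"id": 101, "status": "solved", "resolution": "fixed",
         "developer": "ana", "worked_hours": 3, "requisite_id": "REQ-12"},
        {"id": 102, "status": "solved", "resolution": "",
         "developer": "", "worked_hours": 0, "requisite_id": None},
    ]
    for task_id, problem in audit_solved_tasks(sprint_tasks):
        print(f"Task {task_id}: {problem}")

Running a script like this against the sprint's tasks turns the checklist into a five-minute activity rather than a manual review.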
Measurement and analysis is an area we weren't addressing until the last CMMi adoption phase. We were gathering plenty of data about our development, but we never had time to analyze it. We defined measures aligned with our business practices and project-management concerns: we looked at our internal tracking tool to see how long bug solving took during the last sprint, how much time we spent on new functionality, on design, on coding, and so on. Estimate deviation was also computed and presented to the team during sprint review meetings. We already had this data in various Defect Control reports; we just weren't paying attention to it.
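The measures themselves are simple arithmetic over worked-hours records. The sketch below shows the kind of numbers we review at the sprint review meeting; the record layout (category, estimated_hours, worked_hours) is an assumption for illustration, not Defect Control's report format.

# Sprint metrics sketch: hours per activity and overall estimate deviation.
# The record layout is assumed for illustration.

from collections import defaultdict

def hours_by_category(records):
    """Sum worked hours per activity (bug fixing, new functionality, design...)."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["category"]] += rec["worked_hours"]
    return dict(totals)

def estimate_deviation(records):
    """Deviation of worked hours versus estimated hours, as a ratio."""
    estimated = sum(r["estimated_hours"] for r in records)
    worked = sum(r["worked_hours"] for r in records)
    return (worked - estimated) / estimated if estimated else 0.0

if __name__ == "__main__":
    sprint = [
        {"category": "bug fixing", "estimated_hours": 10, "worked_hours": 14},
        {"category": "new functionality", "estimated_hours": 30, "worked_hours": 26},
        {"category": "design", "estimated_hours": 8, "worked_hours": 9},
    ]
    print(hours_by_category(sprint))
    print(f"Estimate deviation: {estimate_deviation(sprint):+.0%}")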
Because we now analyze our metrics, we have been able to reduce unplanned working time. Time spent on initially unplanned tasks used to be a large share of the total sprint time, and over the last sprints that share has been shrinking.
What Went Right
What has helped us improve? For one thing, we're more confident in our process: we know we are doing what we are supposed to do according to our own procedures. CMMi greatly helped here. Project management tasks were also easy to adapt.
Most of the time we had been using only expert-based estimation, our best guesses. We moved to a more structured approach that makes use of historical data and PERT estimation. Finally, we introduced risks as a new work-item category in Defect Control; this way we didn't need an extra tool to deal with risks, and the available query mechanisms helped here too.
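For readers unfamiliar with PERT, this is the classic three-point formula; the article doesn't detail our exact calculation, so the snippet below is simply the textbook version as an illustration.

# Classic three-point PERT estimate: expected effort and standard deviation.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Expected effort and standard deviation from three-point estimates."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

if __name__ == "__main__":
    # e.g. a task estimated at 4h best case, 6h most likely, 12h worst case
    expected, std_dev = pert_estimate(4, 6, 12)
    print(f"Expected: {expected:.1f}h, std dev: {std_dev:.1f}h")

Feeding the three points from historical data for similar tasks is what makes this more structured than a single expert guess.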
The Tough Points
The really painful points were related to requisite management. Defining dependencies between requisites was a big task, as was getting used to defining and managing fine-grained requirements. The traceability matrix was also tough.
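To show why the matrix and the dependencies take real effort, here is a simplified sketch of what they have to capture. The requisite IDs, work-item IDs, and structure are hypothetical examples, not how Defect Control stores them.

# Simplified requirements traceability sketch; IDs and structure are hypothetical.

# requisite -> work items (tasks, tests) that implement or verify it
traceability = {
    "REQ-01": ["TASK-101", "TEST-12"],
    "REQ-02": ["TASK-102"],
    "REQ-03": [],  # a gap the matrix makes visible
}

# requisite -> requisites it depends on
dependencies = {
    "REQ-02": ["REQ-01"],
    "REQ-03": ["REQ-01", "REQ-02"],
}

def uncovered(matrix):
    """Requisites with no task or test tracing back to them."""
    return [req for req, items in matrix.items() if not items]

def impacted_by(req, deps):
    """Requisites that directly depend on a changed requisite."""
    return [r for r, needs in deps.items() if req in needs]

if __name__ == "__main__":
    print("Uncovered:", uncovered(traceability))
    print("Impacted by REQ-01:", impacted_by("REQ-01", dependencies))

Keeping every fine-grained requisite linked and every dependency up to date is exactly the bookkeeping that made this area painful.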
Conclusion
At the end of the day, CMMi helps us do what we say we are doing, forcing us to follow our own process. It also makes us aware of our own working practices, even the ones we aren't performing on a daily basis.
The effort of adapting to CMMi has triggered an internal process that keeps us watching for best software practices. This is not a consequence of CMMi itself, but of the improvement process. We are now running internal training on patterns and good coding practices, and restarting code inspections and informal reviews, something we did in the past and wanted to pick up again. And we have gained a deeper knowledge of CMMi itself, something that, as toolmakers, helps us better understand our customers.