Project Retrospectives: Post-Mortems Revisited

By Larissa Moss

Remember post-mortems from the olden days? Way back when, we used to dread them because they meant that the system we had just implemented was DOA (dead on arrival). Not that we had to wait until we put the system into production to know that it (and we) were in trouble. But who had the time to address the telltale signs of a system going wrong when a deadline was looming over our heads and managers were breathing down our necks, making not-so-subtle threats about our careers and employment opportunities? So, the system went in, we kept our jobs, and management was content for a day (or two), until the phone rang at 3 AM. The system had crashed. It was time for a post-mortem! Management wanted to know what went wrong, when we knew it, who dropped the ball, and how we could have avoided the situation. Unfortunately, all of the answers came too late to prevent the system from failing in the first place. And the lessons learned were rarely taken into consideration on the next project and were almost never shared with other project teams. So, why bother with post-mortems? What’s done is done. Let’s fix it and move on to the next system.

On the other hand, we could take the opposite approach and actually apply lessons learned during the development of our applications (systems) to improve their quality. This approach is especially appropriate in data warehousing, where we build BI applications in small increments using collectively architected databases in the DW. Since this is a never-ending process of continuous refinement and expansion, we can benefit greatly from periodic project reviews to increase the quality of our deliverables and to improve our system development process.

 

Software Release Concept

A few months ago, in my article Extreme Scoping™: An Agile Project Management Approach, I introduced the software release concept. In that article, I suggested breaking each BI application or DW project into multiple software releases. I wrote:

“The scope of each software release will contain only a small portion of the application requirements, hence the name ‘extreme scoping.’ Each software release becomes a separate project. Each project is developed using an iterative prototyping approach. Each prototype is time-boxed (anywhere from 10 to 120 days). Each software release produces a production-worthy deliverable (even if you decide not to put it into production). After several software releases, you will have a completed and fully-functioning application.”

In other words, the idea of “get it right the first time” has never worked, regardless of how many times we performed post-mortems.

 

Project Retrospectives

The term retrospective is very popular in agile methodologies like Scrum or XP (eXtreme Programming). It refers to a project review that is conducted after every software release of an application that is being developed. Unlike a post-mortem, a retrospective is performed regardless of whether the deliverable from a software release runs perfectly or has problems. It is imperative that lessons be learned during the development of an application in order to improve its quality and to increase the speed of the development process.

Project retrospectives are also excellent forums for IT managers and business executives to become comfortable with the dynamic development process and the software release concept of BI applications. In addition, retrospectives are excellent venues for sharing the lessons learned with other project teams as well as with other business managers in the organization.

 

Topics for Project Retrospectives

As you can see from the list below, no topic is off-limits. Here are some questions to get you started.

Schedule

  • Were we able to deliver all of the requirements we had planned for the last release?
  • If not, why not? 
  • Was the scope realistic?
  • Was the schedule realistic? 
  • What slowed us down?
  • How can delays be prevented in future releases?

Budget

  • Is the application staying within budget?
  • If not, why not?
  • Is the budget realistic?
  • How can cost overruns be prevented in future releases?

Satisfaction

  • Are the users seeing the benefits they expected?
  • Are the BI tools satisfying the analytical business needs?
  • Are users satisfied with the data so far?
  • If not, why not?
  • Are the users satisfied with the functionality so far?
  • If not, why not?

Scope

  • Were scope changes requested during the last release? 
  • Were scope changes made to the overall BI application as a result of the last release?
  • Was the impact analyzed and measured?
  • What was the impact? 
  • Could the changes have been anticipated or avoided?
  • What did we learn about scope changes and the existing change management procedure?

Negotiating Skills

  • Did anything in the last release have to be renegotiated with the users (scope, schedule, resources, budget, quality)? 
  • Was the renegotiating process with the users painless or did it create friction between the business people and IT?
  • What needs to be done to improve the renegotiating process in future releases?

Staffing

  • Did we lose any key people during the last release?
  • Why did they leave?
  • What was the impact of them leaving?
  • How can we avoid that type of loss in future releases?
  • Was the core team staffed properly?
  • Were there too few or too many team members?
  • Were the roles and responsibilities assigned appropriately?
  • Did the team work well together?
  • Was there friction?
  • What was the reason for the friction?
  • How can team spirit and team morale be increased in future releases?

Skills and Training

  • Were the skills of the team members sufficient?
  • Was “emergency training” required during the last release?
  • Was the training effective?
  • What should be done differently next time?

Project Planning and Reporting

  • Did the team track their “actual time” truthfully?
  • If not, why not?
  • Were the activities estimated correctly?
  • If not, do we know why they were overestimated or underestimated?
  • Does our procedure for tracking time and reporting project status work?
  • How can we improve it?
  • What other lessons were learned about project planning, tracking, and reporting?

Development Approach

  • Were the appropriate steps, activities, and tasks selected from the methodology?
  • If not, why not?
  • Were important tasks left out? 
  • Were unnecessary tasks included?
  • Did we use prototyping or other agile development techniques like XP or Scrum?
  • If not, why not?
  • Did the agile development techniques work for us?
  • If not, why not?

Contractors, Consultants, Vendors

  • Were outside consultants or contractors used?
  • Were they used effectively?
  • Did they take time to transfer their knowledge to our staff?
  • What lessons did we learn from negotiating with outside vendors?
  • Are the vendors following the rules or trying to go around them? 
  • How can that situation be controlled or improved in future releases?

General

  • Is communication effective?
  • Were business representatives available when needed?
  • What other lessons did we learn? 
  • What should be done differently in future releases?

 

Organizing Project Retrospectives

Project retrospectives can be formal or informal. During the development of an application, most retrospectives can be informal and limited to the core team and the participating users. However, the final retrospective may be a formal meeting to which additional stakeholders are invited. In either case, take the following items into consideration when organizing a project retrospective.

Preparation

  • The issues log needs to be examined to see which issues were effectively resolved and which ones were not.
  • The change control procedure needs to be assessed for its effectiveness. 
  • The project plan should be reviewed to determine if all the appropriate tasks were included. 
  • The estimated and actual task completion times on the project plan should be studied to determine which tasks were underestimated and which were overestimated. 
  • Any problems with the technology platform should be noted, such as problems with tools, vendors, hardware, network, etc. 
  • The budget should be reviewed to see if the actual expenditures came close to the estimated ones. 
  • The effectiveness of the training sessions should also be assessed. 
  • All of these items are potential topics for discussion at the project retrospective.

Frequency and Timing

  • While the application is being developed, schedule the retrospective the day after the software release is declared done, regardless of whether “done” means that it went into production or not.
  • Once the application is completed (after the final software release), it may be advisable to wait for two or three weeks before holding the final formal review. This will give the project team time to iron out the final glitches in production. It will also give the project manager and the business sponsor time to:
    • Review the project charter, project plan, project reports, project activities, and the budget
    • Collect information and metrics about the usage of the BI application, the DW databases, and the metadata repository
    • Organize the meeting

Venue

  • Most of the project retrospectives can be held onsite in a large conference room. 
  • Pagers and cell phones should be used for emergencies only and should be set to “vibrate.” 
  • The room should be set up as a conference room, and should be supplied with:
    • Several flipcharts or whiteboards
    • An overhead or data projector
    • Markers and masking tape
    • Two laptops, one for the facilitator and one for the scribe
    • Coffee – lots of strong coffee if the last release was difficult
    • Lots of ice cream to celebrate if the last release went well

Duration

  • A well-organized and frequently performed retrospective can be accomplished in four to six hours.
  • The final formal retrospective (and occasionally an interim retrospective) may last a full day, with the option of a follow-up session within a week, if necessary. 

Attendees

  • All team members from the core team, including the development track teams, and the extended team should be invited to participate in the project retrospectives. (Refer to my previous article on Self-Organizing Project Teams for an explanation of the project team structure.)
  • Team members must be prepared to contribute. That means that they must review the agenda and be prepared to discuss the topics listed on it. They must also review any documents that are pointed out to them ahead of time and be prepared to discuss them. In short, every project team member should be an active participant!

Content

  • An agenda should be sent out for each project retrospective. The following items should appear on the agenda.
    • A list of all topics to be discussed, including introduction and wrap-up, with estimated time allocations for each topic. The time estimates must take into account the complexity of the topic and the number of people participating.
    • A list of attendees. Everyone who is invited should be given the opportunity to add to the agenda and to ask for any pertinent documents to be reviewed.

Session Flow for Project Retrospectives

Project retrospectives are highly structured and follow a prescribed procedure that all attendees must observe. Certain people are responsible for conducting certain parts of the meeting.

Business Sponsor

It would be best if the business sponsor could open the meeting with a brief acknowledgement of, and appreciation for, the deliverable from the last software release before turning the meeting over to the project manager.

Project Manager

The project manager should discuss the flow, the rules, and the expectations of the project retrospective, and turn the meeting over to a skilled facilitator.

Facilitator

The facilitator should lead the group through the topics on the agenda. The responsibilities of a facilitator include:

  • Ask the person owning a topic on the agenda to introduce the topic.
  • Solicit comments and feedback from the other participants.
  • Ensure that the meeting doesn’t get bogged down on any given topic.
  • Monitor the allocated time for each topic and interrupt the discussion when the time limit has been reached. At that point, the facilitator must ask the project manager for a decision on whether to schedule a follow-on meeting or to continue the discussion at the expense of dropping another topic on the agenda.

“Scribe”

A scribe is a person who was not involved with the project. The main purpose for having a third-party scribe is to have a knowledgeable but neutral note taker who can:

  • Document the highlights of all conversations and comments.
  • Document identified action items and to whom they were assigned.

 

At the end of each project retrospective, all action items are reviewed, and the person to whom an action item was assigned gives an estimate for a completion date or a reply date (a date on which an estimate will be provided for the effort to complete the action item). The group must decide who will be responsible for following up on the action items, and whether a follow-on working session is necessary and, if so, how soon.

 

Conclusion

As George Santayana once said: “Those who cannot remember the past are condemned to repeat it.” (The Life of Reason: Reason in Common Sense, Scribner’s, 1905, page 284). This statement is as applicable to DW and BI projects as it is to life and politics. In order to know how to improve the next project (the next software release), you have to learn from the mistakes made on the last one. The project retrospective is the vehicle for discovering those mistakes and taking corrective action before completing the application (system). Skipping this step would result in repeating the same mistakes in an environment that is growing rapidly and is affecting more people with each release. We have learned from experience that correcting mistakes on small systems is much easier than correcting mistakes on large systems. The DW environment with all of its dependent BI applications can quickly become a very large system!

About the Author

Larissa Moss is president of Method Focus Inc., and a senior consultant for the BI Practice at the Cutter Consortium. She has 27 years of IT experience, focused on information management. She frequently speaks at conferences worldwide on the topics of data warehousing, business intelligence, master data management, project management, development methodologies, enterprise architecture, data integration, and information quality. She is widely published and has co-authored the books Data Warehouse Project Management, Impossible Data Warehouse Situations, Business Intelligence Roadmap, and Data Strategy. Her present and past associations include Friends of NCR-Teradata, the IBM Gold Group, the Cutter Consortium, DAMA Los Angeles Chapter, the Relational Institute, and Codd & Date Consulting Group. She was a part-time faculty member at the Extended University of California Polytechnic University Pomona, and has been lecturing for TDWI, the Cutter Consortium, MIS Training Institute, Digital Consulting Inc. and Professional Education Strategies Group, Inc. She can be reached at methodfocus@earthlink.net.

References and Additional Reading

Adelman, Sid, and Larissa Terpeluk Moss. Data Warehouse Project Management.  Upper Saddle River, NJ: Addison-Wesley, 2000.

Augustine, Sanjiv. Managing Agile Projects. Upper Saddle River, NJ: Prentice Hall, 2005.

Beck, Kent. Extreme Programming Explained: Embrace Change. Second Edition. Upper Saddle River, NJ: Addison-Wesley, 2005.

Cockburn, Alistair. Agile Software Development. Upper Saddle River, NJ: Addison-Wesley, 2001.

Highsmith, Jim. Agile Project Management: Creating Innovative Products. Boston, MA: Pearson Education, 2004.

Moss, Larissa, and Shaku Atre. Business Intelligence Roadmap. Upper Saddle River, NJ: Addison-Wesley, 2004.

Schwaber, Ken. Agile Project Management with Scrum. Redmond, WA: Microsoft Press, 2004.

Schwaber, Ken, and Mike Beedle. Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall, 2002.

 