Agile Antipattern: Overpromising Review
In software development teams, I keep noticing a recurring and concerning pattern, one I call the “Overpromising Review.”
In agile methodologies like Scrum, a review is the opportunity to showcase the accomplishments of the previous iteration. However, I have observed more than once that some teams focus more on what they intend to deliver than on what they have actually accomplished.
Such a trend is detrimental to all parties involved, and it is imperative that we address this issue head-on.
The Veil of Overpromising
The essence of the Overpromising Review usually shows up in statements like:
“During the last sprint, we adjusted the registration functionality so that users no longer need to enter their personal data. They can create an account with just their email address. Bob will showcase this on the staging system, as we still need to fix two minor bugs, but we’re confident we can deploy everything to production next Thursday.”
The Troublesome Nature of Illusory Progress
The chief concern with this practice is that the team claims credit for work that has yet to be completed. They make a potentially false promise to everyone attending the review: that the feature is finished when, in fact, it is not.
There may be valid reasons why the feature’s release slips beyond “next Thursday.” Perhaps fixing the two “minor bugs” takes longer than anticipated. To an attendee who has eagerly awaited this feature, that delay might be disappointing. They may be on the verge of celebration, only to be told, “Wait just a little while longer.”
For me, this raises an important question: “If the feature is not accessible to our end users, why are we presenting it as ‘done’?”
Charting a New Course
In one of my former teams, we discovered a remarkably simple solution to this conundrum: We committed to showcasing only those features that were genuinely available on the production system. Anything that was “almost done” was simply not presented during the review.
When only a few lingering bugs remain, or deployment to the production system is the final step, there is no harm in waiting and featuring the functionality in the next review instead. After all, one of the fundamental tenets of agility is to foster short development and feedback cycles.
Such an approach proves to be a win-win situation. The development team can genuinely take pride in their achievements during the preceding iteration, while those eagerly awaiting the feature can confidently begin using it or promoting its availability.
Deviating from the Rule, with Care
I can anticipate the outcry from some quarters:
“But Chris, there are scenarios where we genuinely require feedback before deploying to production. Does this mean we should forego gathering that crucial input?”
Of course not.
Collecting feedback is important, and gathering it as early as possible matters even more. However, it is crucial to set appropriate expectations.
If the aim is to present a preliminary solution during the review, seeking feedback to inform subsequent decisions, that approach is perfectly valid. But it is essential to explicitly communicate the purpose to all participants. Avoid showcasing unfinished features as “almost done” solely to initiate conversations or to highlight ongoing efforts.
Instead, preface the presentation with a clear statement: what is being showcased is not yet finished. Explain that it serves as a means to gather insights, aiding the decision-making process and propelling the feature towards completion.
Wrap-up
Honesty is paramount, not only towards the participants of the review but also towards yourself.
If the work is not yet complete, there is no shame in acknowledging it. Saying “we are still working on it” is not a mark of failure but a testament to the iterative nature of software development, where progress is made step by step.