Friday, July 1, 2011

The meaning of Done - my definition

The classic question "What does it mean to be done?" is one of the best-known questions in software delivery. Conference speakers and trainers commonly spend time on it during presentations, as the subject is very fertile and audience reactions are vivid.

Many books define the meaning of done in various contexts, for instance in the context of internal processes, the software delivery model, the business domain, etc. Each developer has their own definition of done, and each Project Manager has his own particular view on it. Project Managers must take care to introduce a uniform definition of done for development Teams so that the understanding among Team members is shared. As a warm-up exercise, let's quote some well-known definitions of done:
  • Code is checked into the source code repository and/or builds successfully
  • Functionality works on the Test environment
  • Functionality has been tested by internal QA
  • Functionality has been tested during UAT
  • Functionality has been presented at the Sprint Demo
I am sure you have your own definition of done, or have at least discussed this subject in the Teams you participate in.

When defining my own definition of done I had two goals:
  1. to create my personal definition of done, and
  2. to incorporate the requirements and consequences of that definition as deeply and smoothly as possible into the software delivery processes I use.

To my understanding, the definition of done needs to be integrated deeply into delivery processes. Without a strong anchor in those processes, even the best definition of done will suffer from people not following it (either consciously or unconsciously).


So finally, here it is - my personal definition of done:

A task is done when all Clients of the task accept its conformance with their requirements.

You might say this is obvious. Well, it is at first glance, but believe me - as simple as it sounds, it is a very powerful definition. Let me share with you the indirect implications and consequences of such a definition.

First, notice that the definition leaves room for arbitrary acceptance criteria by not hardcoding any single general rule into it. This means each task can potentially have different acceptance criteria. In other words, acceptance criteria are defined per task, which is much more flexible and powerful than defining them at the project level as in the examples above. My experience is that there is no single general rule that can be defined at the project level - sometimes a formal Proof of Concept is required, and sometimes, on the contrary, a verbal assurance that a task is completed works fine for a Client. Hence the need not to fix anything too eagerly.
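To make this concrete, here is a minimal sketch in Python (all names are hypothetical, not taken from any real tool) of how the definition could be modeled: each task carries its own list of Clients, each Client brings their own acceptance check, and the task is done only when every Client accepts.

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class Client:
        """Anyone whose requirements the task must conform to."""
        name: str
        # Each Client brings their own acceptance criterion;
        # nothing is hardcoded at the project level.
        accepts: Callable[["Task"], bool]

    @dataclass
    class Task:
        description: str
        clients: List[Client] = field(default_factory=list)

        def is_done(self) -> bool:
            # A task is done when ALL of its Clients accept its
            # conformance with their requirements.
            return bool(self.clients) and all(c.accepts(self) for c in self.clients)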

Second, the way the definition is formulated forces a task executor / implementer to think in terms of the task's Clients. This constitutes a remarkable difference compared to the standard way implementers think of their clients. What usually happens is that implementers are assigned a task (or commit to a task themselves) and consider either the Scrum Master or the Product Owner their client. No-one else is on the list. Nothing could be more wrong, but this is the most common setup I have seen in multiple organizations. Very often the Product Owner designated by the Client side is not present, and a Scrum Master on the vendor's side takes up the role of Product Owner Proxy. As a consequence, task implementers think of him as their Client. It also still happens very often that the development Team gets a spec and implements its requirements, contacting the Product Owner designated by the Client if need be. Usually in such cases the spec does not describe the requirements of a single person, but contains inputs from various members of the Client's personnel according to their fields of specialization. So the Product Owner is de facto a proxy for the real authors of the requirements. Either of the above proxying models shows that the Product Owner is not always the author of the requirements, and thus should not be treated as a Client of the task by definition.

What is even more powerful about my definition of done is that it enforces the moment when Clients need to be identified. An implementer needs to identify the task's Clients as the very first thing, even before starting to implement the task. He cannot start the implementation before he knows the requirements, and he cannot gather requirements before identifying the Clients. In a natural way, identification of Clients becomes the first step of the Task Delivery Process.


So who might be a Client of a task? There are plenty of options. Task Clients may be external - on the client side - or internal: a team-mate (or in general a set of team-mates) whose tasks depend on the task, a QA engineer, a documentation writer, etc. Interestingly, I have seen examples of developers adding themselves to a task's client list. It sounds extreme, but it is a perfectly legal approach - people wear multiple hats and play multiple roles, so it is valid for Myself-Blue-Hat to ask Myself-Red-Hat for requirements. In general, my experience shows that a task often has multiple clients.
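Continuing the sketch above (the client names are invented for illustration, and the acceptance lambdas stand in for real checks), a single task's client list could mix external and internal Clients, including the implementer himself under a different hat:

    task = Task("Export report to PDF", clients=[
        Client("Client-side domain expert", accepts=lambda t: True),  # external
        Client("Team-mate whose task depends on this one",
               accepts=lambda t: True),                               # internal
        Client("QA", accepts=lambda t: True),                         # internal
        Client("Myself-Red-Hat", accepts=lambda t: True),             # self as Client
    ])
    print(task.is_done())  # True only because every Client accepted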


After a task implementer identifies all of the task's Clients, he verifies the list of Clients with the Scrum Master or Product Owner. Once this is done, the task implementer follows the standard path: he gathers requirements from the Clients, implements the task, and gets acceptance from all the task's Clients.
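As a sketch only (building on the Task and Client types from the earlier snippet; the step functions are placeholders for whatever your team actually does), the resulting Task Delivery Process can be expressed as an ordered pipeline in which identifying Clients comes first:

    def deliver(task: Task,
                identify_clients,     # returns the task's Clients
                verify_clients,       # check the list with the SM / PO
                gather_requirements,  # ask the Clients, not a proxy
                implement) -> bool:
        # Step 1: identify the Clients - before any implementation work.
        task.clients = identify_clients(task)
        # Step 2: verify the Client list with the Scrum Master or Product Owner.
        verify_clients(task.clients)
        # Step 3: gather requirements from the Clients themselves.
        requirements = gather_requirements(task.clients)
        # Step 4: implement against those requirements.
        implement(task, requirements)
        # Step 5: the task is done only when all Clients accept.
        return task.is_done()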

2 comments:

  1. Great post. I think that 'done' differs across business models. For me (a QA tester) done means that the deliverables match the requirements before the release of the product. I always look at the stories delivered to QA from the client's perspective; thus I behave as if I want to be treated like the client. But there is always a barrier that somehow breaks this rule: the real client changes his requirements during the implementation / UAT phase, and the internal test results become slightly obsolete.
    The second thing that can change the meaning of 'done' is time pressure. In this business model the real client is delivered stories/features that are implemented but contain some issues. The internal test results should then be attached to the release notes to inform him that we are aware of them and will fix them. But still, from a business perspective it is 'done' and the deadline was not missed.

  2. See also Mitch Lacey's "How Do We Know When We Are Done" @ http://www.scrumalliance.org/articles/107-how-do-we-know-when-we-are-done
