
Saturday, May 3, 2008

Granularity Optimization

[I've been working on this entry for quite a while; I've finally decided to publish it.]

When planning a project it is critically important to find an optimal task size. There are several reasons for this. First and foremost, any estimate is destined to be inaccurate. Additionally, it is impossible to effectively assign more than one resource to a single task. Lastly, if tasks are defined too specifically they become brittle and must be remade more frequently.

With respect to inaccuracy, an estimate made early in a project, when the details are not well understood, or made under pressure to simply publish a schedule, is commonly incorrect by as much as 50%, sometimes more. If the project team iteratively re-estimates tasks, the accuracy of the estimate might progressively improve as the start date for the task approaches. However, the nature of estimating tells us that we are simply making the best educated guess we can about the effort required to achieve some goal. None of us is able to predict the future with consistent accuracy and precision. So, if by using an iterative re-estimation technique we can manage an accuracy of +/-10% on our estimate, we're doing pretty well. But if a task is estimated at 30 days (240 hours) and we are wrong by 10%, we have at least three things to consider.

First, if we finish 10% early (27 days) we may create idle time for the person(s) working the task. Idle time can lead to one or more dangerous effects, gold-plating being the most dangerous. As they say, idle hands are the Devil's tools. There are also resource utilization considerations, but in my opinion they are largely irrelevant here.

Second, if we finish 10% late (33 days) we may slip a delivery or otherwise impact future tasks, which has a cascading effect on the overall project. This is a more pronounced problem, to be sure; I think the consequences should be obvious to the reader.

Third, in either case we overpay for the task in terms of money: for early delivery we pay for idle time (or the Devil's work), and for late delivery we pay for the additional work.
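To make that arithmetic concrete, here is a minimal sketch of the +/-10% scenario above; the 8-hour day and the $100/hour rate are illustrative assumptions, not figures from any real project.

    # A minimal sketch of the +/-10% arithmetic on a 30-day task.
    # The 8-hour day and the $100/hour rate are illustrative assumptions.
    HOURS_PER_DAY = 8
    HOURLY_RATE = 100                                 # hypothetical blended labor rate

    estimate_hours = 30 * HOURS_PER_DAY               # 240 hours
    error = 0.10                                      # +/-10% estimation error

    early_hours = estimate_hours * (1 - error)        # 216 hours, about 27 days
    late_hours = estimate_hours * (1 + error)         # 264 hours, about 33 days

    # Either way we pay for roughly 24 extra hours: idle time (or gold-plating)
    # if we finish early, additional work if we finish late.
    miss_hours = estimate_hours * error
    print(f"early finish: {early_hours / HOURS_PER_DAY:.0f} days")
    print(f"late finish:  {late_hours / HOURS_PER_DAY:.0f} days")
    print(f"cost of the miss either way: ${miss_hours * HOURLY_RATE:,.0f}")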

To make effective assignments of resources, ideally we have one task for one worker. This is true in most kinds of work. Those who choose to employ pair programming are, for all intents and purposes, using two people in the place of one. Oftentimes large tasks are created and assigned to a team member, e.g. "Build sub-system X." But as time passes and sub-system X is not complete, more people may be assigned to do the work; to help. This is not an effective way to bring a task back in line with the schedule. If one team member is in the midst of developing a solution for a task, and suddenly more team members are added to the task, everyone's productivity slows. The original team member must slow down in order to communicate the work he has done, the work that remains, and all the considerations he has made to this point. The new team members can't make any progress until all this information has been communicated. Finally, all of the assignees must now communicate with each other about what has been done and what will be done.

Worse still, if too many people are assigned to a task you can exceed the maximum partition-ability of the task. Think two workers, one shovel. If and when you exceed the maximum partition-ability of a task, the 'extra workers' must sit idle in order to prevent conflicts. Furthermore, communication about the task must also increase, which degrades every worker's performance. The net result is that the task takes even longer to complete than originally scheduled.
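As a rough illustration of how that communication overhead grows, here is a small sketch assuming every person assigned to the task must stay in sync with every other; the number of channels grows quadratically with the head count.

    # A rough sketch of why piling people onto one task inflates communication
    # overhead: if every assignee must stay in sync with every other assignee,
    # the number of pairwise channels is n * (n - 1) / 2.
    def communication_channels(people: int) -> int:
        """Distinct pairs of assignees that must stay in sync."""
        return people * (people - 1) // 2

    for team_size in (1, 2, 3, 5, 8):
        print(f"{team_size} assigned -> {communication_channels(team_size)} channels to maintain")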

The final concern about task granularity is over-analysis and the creation of too many, overly detailed tasks. This becomes a problem simply because changes in the design and architecture specifics of a project will cause these tasks to become invalid. They must then be discarded or replaced. If this occurs too frequently, the overhead of managing the task list outweighs the advantages of the detail. This will impact team performance, creating 'start delays' for tasks and also potentially increasing the amount of waste within a project, especially if some of these detailed tasks are pulled from a backlog to fill gaps in the utilization of team members.

Finding the balance between too much detail and not enough is a significant challenge for a project planner; it is one of the distinguishing characteristics of a great planner. Keeping in mind the impact of inaccuracy, the maximum partition-ability of a task, and the risk of over-analysis while planning can help smooth out a task plan while still providing a solid time/effort management mechanism.
