Alistair Cockburn
(pronounced "Coburn", the Scottish way )


Just-In-Time Methodology Construction

Alistair Cockburn

Humans and Technology
arc@acm.org

Abstract

Projects and teams differ, so one methodology won't fit all of them. People vary and tend toward inconsistency, but they are good at communicating and looking around, and they often take the initiative to do whatever is needed. Is it possible to build methodologies that are habitable and effective, and that accommodate these two statements?

In this paper, I show that one can dynamically construct a flexible methodology that draws upon the native strengths of people, while giving consideration to their native weaknesses. The mechanisms involved help with communication of both technical issues and project dependency tracking. Team morale and cooperation receive first-order attention. Work products are developed just to the point that intelligent colleagues can find their way forward. Each project's process is tailored over the course of the project to suit the needs of that project and team.

Keywords

just-in-time methodology, dynamic methodology, flexible process, lightweight methods, software engineering practices, human factors.

  1. Synopsis of the argument

This paper builds on "A methodology per project" [Cockburn99a] and "Characterizing people as first-order, non-linear components of software development" [Cockburn99b] to discuss creating a lightweight, people-centric methodology tuned just for a particular project team, and to do so during the course of the project.

The heart of the argument is this:

Every project and team is different, but any one organization exhibits common traits, strengths and weaknesses. Through interviews, those can be identified, and used to seed the methodology.

Principles of methodology construction also seed the methodology: People factors can easily dominate process matters on a project [Cockburn99b], [Weinberg]. Light methodologies based on interpersonal communication are strong ones [Cockburn99a]. The optimal "lightness" varies by project, depending on criteria such as staff size and proximity, and the system's damage potential [Cockburn99a].

The power of a light methodology centers on using informal, face-to-face communication channels (rather than written documents) to bind the large amount of information flowing within the project. The team draws on the ability of people to "look around, ask, and discuss" while tracking project dependencies, requirements and design issues. This makes the methodology lighter and more economical, but also more "habitable" -- pleasant to live in. Experienced team leads frequently use these sorts of mechanisms to run a project to successful completion.

Based on the above, it is even possible to construct a "family" of methodologies that adapt to the local team's size and culture. Such a family is outlined at the end of the paper.

The remainder of the paper summarizes the two prior results, describes the sources of the material, presents the technique of dynamic methodology construction, discusses building on people's strengths, and outlines a family of methodologies constructed on these principles.

  2. Methodology per project and people's characteristics

The first paper that this one builds upon is "A methodology per project" [Cockburn99a]. It ends with the summary:

A methodology contains ten basic elements: roles, skills, activities, techniques, tools, teams, deliverables, standards, quality measures and project values. The main result of the paper is that there are necessarily multiple methodologies. Different methodologies are needed depending on the project size (the number of people being coordinated), the criticality of the systems being created, and the priorities of the project. For any point in the size/criticality space, the designers of the methodology select a scope of concerns to address (which project roles, activities, deliverables, and standards to cover) and optimize some quality of the project, working from their personal experiences, including their wishes, fears and base philosophy. Comparison of methodologies should include these dimensions, and their relationship to the needs of the project or organization.

That paper also elaborated four principles of methodology design; the ones most relevant here are restated in the synopsis above: people factors can dominate process matters, light methodologies built on interpersonal communication are strong, and the optimal lightness varies with staff size, proximity, and the system's damage potential.

The second of the two papers that this one builds upon is "Characterizing people as first-order, non-linear components of software development" [Cockburn99b]. It ends with the summary:

"The fundamental characteristics of "people" have a first-order effect on software development, not a lower-order effect. Consequently, understanding this first-order effect should become a first-order research agenda item, and not neglected as a second-order item. I suggest that this field of study become a primary area in the field "software engineering" for the next 20-50 years.

"I presented several characteristics of people that have recognizable effects on methodology design.

"The first is that we are sensitive to communication timing and modalities. The prediction is that physical proximity and ease of communication has dominant effect.

"The second is that people tend to inconsistency. The prediction is that methodologies requiring disciplined consistency are fragile in practice.

"The third is that people vary, not just daily, but from group to group. Methodologies don't currently, but do need to deal with this cultural variation.

"The fourth is that people like to be good citizens, are good at looking around and taking initiative. These combine to form that common success factor, "a few good people stepped in at key moments."

"Being good at communicating and looking around counter inconsistency, leading to the prediction that methodologies can make good use of low-precision artifacts whose gaps are covered by personal communication. Project histories also support this prediction, subject to the normalization of an adequately skilled staff, including management.

"In the title [of 'Characterizing people as first-order, non-linear components of software development'], I refer to people as "components". That is how people are treated in the process / methodology design literature. The mistake in this approach is that "people" are highly variable and non-linear, with unique success and failure modes. Those factors are first-order, not negligible factors. Failure of process and methodology designers to account for them contributes to the sorts of unplanned project trajectories we so often see."

Connecting these two sets of results, we get:

  1. That every project deserves its own, tailored process and methodology.

  2. That lightweight and face-to-face are desirable attributes.

  3. That people communicate best face-to-face.

  4. That daily variability, inconsistency and laziness are the weaknesses to allow for ("high-discipline processes are fragile").

  5. That a process should build upon good citizenship, the ability of people to look around, talk to each other, and take initiative, as the native strengths of people.

  3. Source of the material

This paper is based on interviews, research and on-project work conducted from 1991 to 2000. Much of the interview work is quoted or summarized in [Cockburn98], [Cockburn99a], [Cockburn99b].

Corroboration of the ideas has come from interviews with organizations reporting similar success with similar sorts of programs. The IBM Object Technology Practice in the UK has run pre-project "methodology tuning" workshops since 1998. Siemens in Munich described using a process-tuning activity at the start of projects. A military subcontractor in Florida described how his ISO-9000 certified group set up a specific, light process for each job using process checklists.

The post-project debriefing technique is also described in Kerth's work, where it is called a "post-partum" review [Kerth].

Altering the project's team structure and practices at the end of each increment was first described to me in the project called "Ingrid" in 1992 [Cockburn98]. It was invented out of necessity on that project. Since then I have augmented that practice with a mid-increment methodology check. I explicitly used the technique in project "Winifred" [Cockburn98] and at the Central Bank of Norway [Cockburn99a]. End-of-increment methodology tuning was used explicitly by another consultant on project "Insman" in Germany [Cockburn99a]. All three projects delivered successfully and claimed good results from the technique.

Variable methodologies have been used for years, as described in the IBM OTP and Siemens cases above, and in [MartinOdell] and [Graham97]. As far as I know, Crystal (the methodology family described later in this paper) is the first that is tuned dynamically over the course of the project.

The practice of replacing written documents with face-to-face conversation is very old. It was advocated in the 1960s by Weinberg [Weinberg] and in the 1980s and 1990s by DeMarco [DeMarco99], [DeMarco97]. It has been an underground success tactic for many people, and recently has resurfaced independently, in eXtreme Programming [Beck99], the Crystal light methodology family [Cockburn01a], and Adaptive Software Development [Highsmith99].

  4. Dynamic Methodologies

Once we understand that every project deserves its own methodology, it is clear that any initial suggestion is just a best first guess. Improving that initial suggestion within the life of the project requires that the project use incremental development, with increments generally between one week and four months in length.

The methodology is tailored:

  1. Before the project

  2. At the start of the project

  3. During the first increment

  4. After each increment

  5. During subsequent increments

Let us look at those five periods.

Before the project

To derive the best first guess at the project's specific methodology, interview people from previous projects in the organization. The interviewees are quite likely to describe both the strengths and weaknesses of the organization. Here are two examples from different organizations.

"Successful projects had good, close communication between the developers and the user groups (or management). Unsuccessful ones had poor communication."

This quote implies that establishing good communication patterns and goodwill in communication should be a project management priority right from the start.

"Our user interface team all have Ph.D.s in psychology and sit together three floors above the programmers. We have some difficulty due both to the different education or world-view of the people, and the communication distance."

This second quote implies that there will need to be extra mechanisms in the process to increase contact and reviews between those two groups of people.

The interviewer should not have too many preconceptions about what answers to expect. Interviewees mention things that seem to have nothing to do with "process", but turn out to be significant anyway. Examples include the second quote above, and any mention of people's personalities, room lighting, and group social activities. Suitable questions are: "What was the history of the project?" "What did you like / not like about the way the project executed?" "What would you like to keep the same / change on your next project?" "Which things are more important?" All suggestions need to be recorded.

Collate the interviews and look for common properties. These show the strengths and weaknesses of the organization. My interviewees often put people issues at the top of the list, as the two previous examples show. I also commonly find reference to the use (or dislike) of CASE tools, reviews, requirements flux, frequency of iterations, testing, and coding standards.
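
To show the kind of collation meant here, the following minimal sketch tallies recurring observations across interview notes. It assumes nothing more than free-form notes per interviewee; the roles, observations and counts are invented purely for illustration and are not data from the cited projects.

    from collections import Counter
    from typing import Dict, List

    # Hypothetical interview notes: one list of themed observations per interviewee.
    interviews: Dict[str, List[str]] = {
        "team lead, project A": ["close developer-user contact helped",
                                 "CASE tool disliked",
                                 "room lighting poor"],
        "developer, project B": ["close developer-user contact helped",
                                 "requirements flux hurt"],
        "tester, project B":    ["requirements flux hurt",
                                 "CASE tool disliked"],
    }

    # Collate: count how often each observation recurs across the interviews.
    tally = Counter(note for notes in interviews.values() for note in notes)
    for observation, count in tally.most_common():
        print(f"{count}x  {observation}")
    # Observations mentioned by several people point at organizational strengths
    # and weaknesses worth carrying into the draft methodology.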

At project start

At this point, construct a draft methodology for the project, particularly the teams, standards, work products and milestones. Skip over technique descriptions, as those are too bulky to write and read on the project. Settle these questions:

How long are the iterations and increments?

What reviews are done?

Where do people sit?

What work products need to be produced?

Which standards are mandatory and which are only recommended (for tools, drawings, tests, and code)?

What does time reporting look like?

What can be done to keep communication and morale up?

Review the draft with the team and management, making changes as needed, until consensus is achieved.

The entire process takes two to five days and results in a draft methodology, tuned to the organization and the individuals, that is ready to use on the project; a minimal sketch of how such a draft might be recorded follows.
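
The sketch below shows one way the answers to the questions above could be captured as a simple structured record. The field names and example values are hypothetical, chosen only to make the shape of a project-start draft concrete; they are not a prescribed Crystal work product.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class MethodologyDraft:
        """One possible record of the project-start decisions (illustrative only)."""
        increment_length_weeks: int          # length of each delivery increment
        iteration_length_weeks: int          # length of each internal iteration
        reviews: List[str]                   # which reviews are held
        seating: str                         # where people sit
        work_products: List[str]             # deliverables the team agrees to produce
        mandatory_standards: Dict[str, str]  # standards that must be followed
        recommended_standards: Dict[str, str]
        time_reporting: str                  # how time gets reported
        morale_practices: List[str]          # things done to keep communication and morale up

    # Example draft for a small, co-located team (values invented for illustration):
    draft = MethodologyDraft(
        increment_length_weeks=8,
        iteration_length_weeks=2,
        reviews=["design walkthrough per iteration", "code review on request"],
        seating="developers and usage expert in adjoining offices",
        work_products=["release plan", "running tested features", "user manual"],
        mandatory_standards={"tests": "regression suite runs before check-in"},
        recommended_standards={"drawings": "rough UML; whiteboard photos acceptable"},
        time_reporting="weekly summary to the project coordinator",
        morale_practices=["demo to users at the end of each increment"],
    )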

During the first increment

Approximately in the middle of the first increment, run the same sort of interview with the team members, individually or in a group meeting, depending on the size and locations of the people on the team. The dominant question to address is:

"Are we going to make it, working the way we are working?"

Typically, lack of communication, fluctuating requirements, and questions about standards are the issues that show up in the middle of the first increment. Change only those parts that are easy to change, or catastrophic if not changed. Save any fine tuning for later increments.

This activity takes one to three hours.

After each increment

Hold a team workshop after each increment, and ask:

"What did we learn?"

"What can we do better?"

The answers are likely to cross every boundary of the project, from management intervention, to timecards, to group communication, to seating, to project reviews and standards, to team composition.

As with the initial methodology suggestion, review the results with the development team and managers.

This activity might take four to twelve hours. Some teams do this during regular working hours; some go off-site to combine it with team building, relaxation, and setting up the next increment.

During subsequent increments

Subsequent mid-increment checks can be omitted if the project is using increments of three weeks or less.

The dominant questions in the mid-increment check are:

"Is this working?"

"Do we need to do something better in this increment?"

Sometimes the team tries a new idea on the second or third increment, an idea that simply does not work well. The team cannot afford to stay with such an idea, and needs an opportunity to roll back to the previous mode of working or to try out yet another idea. Project "Winifred" [Cockburn98] went through three different team structures during its third increment. The team had decided that the team structure used in increments one and two was inadequate. The first two new structures tried in increment three, however, did not work well. The third did work, and the team used that structure for the rest of the project.

Expect mid-increment changes to coding and review standards, team structures and informal communication structures.

This activity may take two to six hours, depending on the magnitude of the changes.

The result of twice-per-increment methodology reviews is that the team is able to take its most recent experiences into account. The team produces, during the execution of the project, a methodology specifically tailored to its situation.

  5. Building on People's Strengths

People issues can overshadow process issues [Cockburn99b]. Although the twice-per-increment methodology tuning sessions should eventually settle them, we should hope not to start from scratch on every project. This section discusses how the strengths and weaknesses of people inform some basic advice for the pre-project methodology suggestion.

Lighter is faster, using people as communicators

Light methodologies are plausibly better than heavy methodologies (with certain caveats and costs [Cockburn99a]), and they are particularly well suited when time-to-market is important. The question arises, though: if fewer work products are produced, and they are produced in a more casual fashion, what binds together the information that gets generated on the project? The answer comes from two strengths of people: they are generally interested in being good citizens, and they communicate best in an informal, face-to-face setting [Cockburn99a].

Using these two strengths means that the methodology will rely on some amount of oral tradition and ongoing communication to bind and refine the information being generated within the project. Since spoken communication is more expressive and less expensive than written communication [Cockburn99a], the oral component can be quite large, often much larger than expected. An example is the eXtreme Programming methodology [Beck99], where not even the requirements are written down in detail, but live in ongoing conversations between usage experts who are full participants on the development team. So far, good results are being reported on XP's heavy reliance on this oral tradition [C3Project].

The amount of oral tradition that can be tolerated depends on the project's characteristics (team size, location, project criticality, turnover, etc.) and the team itself. The exact amount to be used is one of the items that gets altered in the methodology tuning sessions.

Managing intra-project dependencies

Intra-project dependencies benefit from being moved to a largely oral binding, just as requirements and design notes do.

Often, the designated process requires someone to create a detailed task plan identifying the dependencies between requirements, architecture and function work products, and then to keep the task-dependency sheet up to date. That last step proves to be extremely difficult, since the work products are continually changing state. With even just four people, this is a difficult task; it ranges from cumbersome to impossible for larger projects.

Based on the ideas we have established so far, the alternative strategy is to rely on two strengths of people: being good at discussing, and being good at looking around. Let the team leads track and resolve their dependencies pairwise. On a weekly, semi-weekly or daily basis, the project coordinator collects their dependency needs in a list and gets the team leads together to discuss it. The purpose of the discussion is to discover dependency needs that are not being met (and irreconcilable differences). Note that while the project coordinator summarizes the state of the dependencies for risk management reporting purposes, it is the team leads' individual communication that binds the dependency information on the project.
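
As a concrete illustration of the coordinator's list, here is a minimal sketch of a dependency record and the summary drawn from it. The record structure, field names and example entries are assumptions made for illustration; the binding of the information remains the leads' pairwise conversation, with the list serving only as a summary.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DependencyNeed:
        """One intra-project dependency, as a team lead might state it (illustrative)."""
        needed_by: str          # team that needs something, e.g. "UI team"
        needed_from: str        # team expected to provide it, e.g. "server team"
        description: str        # what is needed
        needed_by_date: str     # when it is needed
        resolved: bool = False  # set True once the two leads agree it is handled

    def unmet_needs(needs: List[DependencyNeed]) -> List[DependencyNeed]:
        """The coordinator's summary: dependencies still lacking agreement."""
        return [n for n in needs if not n.resolved]

    # Usage: leads state and resolve needs pairwise; the coordinator only summarizes.
    needs = [
        DependencyNeed("UI team", "server team", "query API for account search",
                       "week 3", resolved=True),
        DependencyNeed("server team", "DBA", "test database refreshed nightly", "week 2"),
    ]
    for n in unmet_needs(needs):
        print(f"OPEN: {n.needed_by} needs '{n.description}' "
              f"from {n.needed_from} by {n.needed_by_date}")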

People are variable and inconsistent

If a common weakness of people is that they are variable and inconsistent, then the methodology must deal with the issue. Two approaches have been defined up to now, each potentially effective: the first holds selected practices firmly in place through discipline and supporting group mechanisms; the second designs the methodology to tolerate individual variation and to rely less on people being consistent.

Two methodologies that exemplify the first strategy are eXtreme Programming [Beck99] and the Personal Software Process / Team Software Process [Humphreys97] [Humphreys00].

The first strategy can produce an effective methodology - if the mechanisms being held in place are effective. Strict adherence to ineffective practices leads to an ineffective methodology. This is a less trivial statement than it may seem at first, since there is no consensus in the industry as to which practices are effective or ineffective under which circumstances. The project leaders might enforce strict adherence to practices they consider effective, and be surprised at the negative results they encounter (see, for example, project "Reel" in [Cockburn99b]).

For the first strategy to work, it is important that the high-discipline practices be considered "tolerable" by the team. Otherwise, they will avoid or fall away from the practices, losing both the discipline and the benefits. This is what causes high-discipline methodologies to be fragile. Although there is not a lot of data to examine, at least one testimonial exists for each of XP and TSP that they use tolerable mechanisms [XP], [Webb].

Two methodologies that exemplify the second strategy are Adaptive Software Development [Highsmith99] and the Crystal methodology family [Cockburn01a] [Cockburn01b]. In the second strategy, the methodology is designed to be tolerant of individual variations and less reliant on people being consistent. In such a methodology, the team members form consensus on the minimum compliance characteristics needed in the work products and practices. Standardization is encouraged, but not enforced.

What allows the second approach to work is that the developers generally take pride in their work, so that they have personal interest in seeing that their work is acceptable.

It is not yet possible to say that one approach is better than the other, but we can make two predictions. Strict adherence to effective practices should be more productive but harder to attain. Laissez-faire or tolerant practices should be easier to get adopted, but possibly less productive.

  6. Methodology Families

At this point, we have established the principle of binding ongoing project information with a varying amount of oral versus written communication. We have the technique of just-in-time methodology construction. We also have a chart to identify the sort of project being undertaken (Figure 1).


Figure 1. Projects, organized as People x Criticality x Priority. A C3 project is one with up to 3 people, whose potential for damage is loss of comfort; an E30 project is one with up to 30 people, whose potential for damage is loss of essential moneys (from [Cockburn99a]).

The grid shown in Figure 1 comes from [Cockburn99a]. The point of the grid is to highlight that a three-person, loss-of-comfort project should properly use a different methodology than a three-person, loss-of-life project, or a 30-person, loss-of-comfort project. The loss-of-life project should use more methodology elements to ensure correctness, and the 30-person project will need more methodology elements to coordinate the large number of people involved. The third dimension of the chart illustrates that a project working against time-to-market pressures should be set up differently from one working against traceability pressures. The chart can be used to nominate the amount of oral communication and other methodological elements for the project.
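
To make the grid's naming scheme concrete, the following sketch maps a project onto a coordinate such as D40. The example codes come from the paper and [Cockburn99a]; the full set of criticality letters, the list of column boundaries and the rounding rule are assumptions made only for this illustration.

    # Criticality letters as suggested by Figure 1: C = loss of comfort,
    # D = loss of discretionary money, E = loss of essential money, L = loss of life
    # (the letter L is inferred from the text's "life-critical" category).
    # The size boundaries mix values mentioned in the text (3, 6, 8, 14, 20, 30, 40,
    # 50, 80) with rounding up to the nearest one; both choices are illustrative.
    SIZE_COLUMNS = [3, 6, 8, 14, 20, 30, 40, 50, 80]

    def project_category(staff_size: int, criticality: str) -> str:
        """Map a project onto a grid coordinate such as 'D40' (sketch only)."""
        if criticality not in ("C", "D", "E", "L"):
            raise ValueError("criticality must be one of C, D, E, L")
        for bound in SIZE_COLUMNS:
            if staff_size <= bound:
                return f"{criticality}{bound}"
        return f"{criticality}{staff_size}"  # very large projects: fall back to the raw count

    print(project_category(35, "D"))  # -> 'D40', the Crystal Orange territory discussed below
    print(project_category(3, "C"))   # -> 'C3'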

We now have the building blocks to describe families of methodologies that are dynamic and sensitive to the characteristics of people. One such is Crystal. Crystal is being developed in [Cockburn01a] and [Cockburn01b], so I include only the points here that are relevant to the discussion at hand.

All Crystal methodologies are founded on a common value, "strong on communication, light on work products". Methodological elements can be reduced to the extent that running software is delivered more frequently and interpersonal communication channels are richer. These are choices that can be made and altered on any project.

System criticality is generally not at all open for negotiation. Staff size and location should be open for negotiation, but often are not. Both constrain the methodology. More communication mechanisms must be introduced as more people or more geographically dispersed people are employed. Other mechanisms must be introduced as system criticality increases.

The Crystal family partitions projects into life-critical and non-life-critical. It would also differentiate between loss-of-comfort and loss-of-essential-money projects, but project interviews to date do not show a significant difference in working styles across that range of projects. The differences seem to be ones that can be handled in the methodology tuning workshops.

Crystal distinguishes between project sizes at a growth ratio of around 2 or 2.5 (Figure 2). Doubling the number of people on a project changes its communication characteristics significantly enough to call for additional coordination and communication mechanisms, because it changes the number of people who can practically be put within earshot of each other, within easy walking distance of each other, and so on.

Crystal is segmented into color-coded bands with common characteristics: "Clear" for the smallest and lightest, then "Yellow", "Orange", "Red", "Maroon", "Blue", "Violet" and so on, for larger groups using larger methodologies.


Figure 2. The Crystal methodologies are named by color.

Each color brings its own rules and basic elements, each is as light as practical for that range of projects, and each gets tuned to the project at hand using the techniques described earlier in this paper.
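
The following sketch shows the kind of size-to-band mapping implied by Figure 2. Only the Clear and Orange limits are stated in this paper; every other threshold is an assumption, included solely to illustrate the roughly 2-to-2.5 growth ratio between bands, and a real project would also fold in criticality before settling on a methodology.

    # Illustrative mapping from coordinated staff size to a Crystal color band.
    # The paper fixes only two limits directly: Clear for up to 6 people and
    # Orange for up to 40. The remaining thresholds are assumed for illustration.
    CRYSTAL_BANDS = [
        (6, "Clear"),
        (15, "Yellow"),    # assumed
        (40, "Orange"),
        (80, "Red"),       # assumed
        (200, "Maroon"),   # assumed
        (500, "Blue"),     # assumed
        (1000, "Violet"),  # assumed
    ]

    def crystal_color(staff_size: int) -> str:
        """Pick the lightest band whose size limit covers the team (sketch only)."""
        for limit, color in CRYSTAL_BANDS:
            if staff_size <= limit:
                return color
        return "beyond the named bands"

    print(crystal_color(5))   # -> 'Clear'
    print(crystal_color(35))  # -> 'Orange'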

In the following, I illustrate the principles that have been discussed so far, by outlining first a fairly "heavy" member of the Crystal family, Crystal Orange (for projects up to 40 people), then contrasting that to the lightest member of the Crystal family, Crystal Clear (for projects up to 6 people), and finally, contrasting that to XP, which is lighter still (but more disciplined).

Crystal "Orange"

Crystal "Orange" is outlined in [Cockburn98], although without the "Crystal" name there. There, the following project characterization is made:

"..for a medium-sized production project in an industrial setting. The characteristic of such are project are:

It is a common sort of project, requiring trade-offs between complete, extensive deliverables and rapid change in requirements and design. I have kept the number of deliverables low, to reduce the cost of maintaining them, yet included enough to keep the teams communicating. I tailored job assignments and teams to allow the fluidity usually needed on this kind of project. Many other sorts of projects also need provisions for fluidity and can take advantage of this methodology. "

Crystal Orange is a methodology suited for a D40 project: up to 40 people, sitting in one building, working on a system that might cause loss of discretionary moneys (a company's billing or payroll system would fall into this category). It calls out more team structures and more team coordination than are needed on a 20-person project. It lacks the subteaming structures that are needed on an 80-person project, and it is missing the design- and code-verification activities that would be used on life-critical systems. It might be extended to an E50 type of project, depending on the team.

Briefly, Crystal Orange calls for a specific set of roles, teams, work products and standards, all of which are described in more detail in [Cockburn98] (see the section on Methodology).

Crystal "Clear"

For comparison, let us look at the lightest member of the family.

Crystal Clear is for projects in the D6 category: for up to 6 people, sitting in one room or adjoining offices. With only 6 people, there is no need to coordinate multiple subteams, as was the case in "Orange". Given that the people sit close together, there is less work needed to keep the participants aware of their part and status in the project. Clear might be usable on an E8 project.

Crystal Clear calls for only a small set of roles, work products and conventions, described in more detail in [Cockburn01a].

eXtreme Programming

eXtreme Programming can be substituted for Clear. Like Clear, it calls for frequent deliveries, close communications, a user on staff, automated regression test suites, attention to team morale, and a strong oral culture. XP started as a methodology for D6-D8 projects, and has been used successfully on a D14 category project [C3Project]. But XP has one major difference from Clear.

eXtreme Programming and Clear are opposites with regard to discipline and tolerance. XP sits at the end of the band with the fewest written work products and the strictest standards. It relies on strict adherence to design standards, programming standards, pair programming, aggressive code refactoring, and unit tests kept running at 100%. In keeping with a low-tolerance, high-discipline methodology, it includes the role of "coach" and various group mechanisms to keep the practices in place.

XP is designed to optimize productivity, using strict standards, while Clear is designed to tolerate variation across people, requiring a much smaller set of rules.

Given that Clear is prioritized for tolerance, and XP is a high-discipline, low-tolerance methodology, it should be clear that XP practices can be substituted into Clear, but the Clear tolerances cannot be substituted into XP.

  7. Summary

This paper builds on two previous results: that every project deserves its own methodology [Cockburn99a], and that the characteristics of people have a first-order effect on software development [Cockburn99b].

From those, this paper makes three contributions.

First is the technique of dynamic, or just-in-time, methodology construction. The technique uses open-ended interviews about previous projects, plus check-ins on the current project twice per delivery increment, to establish and tune rules and conventions for the project participants, and to track the communication effectiveness and morale of the team. Running this methodology-tuning workshop twice per increment allows the project team to discard ineffective ways of working, trying new ones until a set is found that suits them.

Second, the project can be preloaded with a light, effective, and habitable methodology that uses people's natural ability to communicate informally, face-to-face. It finds a balance incorporating the largest oral and smallest written component needed to hold in place the information generated on the project. The balance point differs depending on various project characteristics and on the team members.

Third, it is possible to construct a family of methodologies, built on common principles, that can also be tuned during the course of the project. Crystal, a color-coded family of methodologies, was outlined. The color bands differ primarily in the communication elements of the methodology, without changing the basic principles: replacing written deliverables with short, rich communication channels and delivery of running software.

A project's personal methodology is seeded with a methodology from the project's color band and is then modified based on pre-project interviews. The result is tuned twice per increment over the course of the project, creating a project-personal methodology within the life of the project.

References

[Beck99] Beck, K., Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999.

[C3Project] The "C3" Team, "Chrysler goes to 'Extremes'", in Distributed Object Computing, October, 1998, pp. 24-28.

[Cockburn98] Cockburn, A., Surviving Object-Oriented Projects, Addison-Wesley, 1998.

[Cockburn99a] Cockburn, A., "A methodology per project," submitted to IEEE Software. Online as Humans and Technology Technical Report, TR 99.04, http://members.aol.com/humansandt/papers/methyperproject/methyperproject.htm.

[Cockburn99b] Cockburn, A., "Characterizing People as Non-Linear, First-Order Components in Software Development", 4th International Multiconference on Systemics, Cybernetics, and Informatics, Orlando, FL, July, 2000. Online as Humans and Technology Technical Report, TR 99.05, at http://members.aol.com/humansandt/papers/nonlinear/nonlinear.htm.

[Cockburn01a] Cockburn, A., Crystal/Clear: A Human-Powered Methodology for Small Teams, Addison-Wesley, 2001, in preparation. Online draft at http://members.aol.com/humansandt/crystal/clear.

[Cockburn01b] Cockburn, A., Software Development as a Cooperative Game, Addison-Wesley, 2001, in preparation, online draft at http://members.aol.com/humansandt/crystal/game.

[DeMarco97] DeMarco, T., The Deadline, Dorset House, 1997.

[DeMarco99] DeMarco, T., Lister, T., Peopleware: Productive Projects and Teams, 2nd Ed., Dorset House, 1999.

[Graham97] Graham, I., Henderson-Sellers, B., Younessi, H., The OPEN Process Specification, Addison-Wesley, 1997.

[Highsmith99] Highsmith, J., Adaptive Software Development, Dorset House, 1999.

[Humphreys97] Humphrey, W., Introduction to the Personal Software Process, Addison-Wesley, 1997.

[Humphreys00] Humphrey, W., Introduction to the Team Software Process, Addison-Wesley, 2000.

[Kerth] Kerth, N., "An approach to postmorta, postparta, and project reviews", online at http://c2.com/doc/ppm.pdf.

[MartinOdell] Martin, J., Odell, J., Object-oriented Methods, Pragmatic Considerations, Prentice Hall, 1996.

[Schwaber] Schwaber, K., "Scrum development process", online at http://www.jeffsutherland.org/oopsla/schwapub.pdf.

[Webb] Webb, D., Humphrey, W., "Using TSP on the TaskView Project", in CrossTalk, The Journal of Defense Software Engineering, Feb 1999, pp. 3-10, online at http://www.stsc.hill.af.mil/crosstalk/1999/feb/webb.asp

[Weinberg] Weinberg, G., The Psychology of Computer Programming, Silver Anniversary Edition, Dorset House, 1998.

[XP] eXtreme Programming, as described on the web: http://extremeprogramming.com.