The late Wayne Stevens, designer of the IBM Consulting Group's Information Engineering methodology in the early 1990s, was well aware of this trap.
Whenever someone proposed a new object-centered / object-based / object-hybrid methodology for us to include in the methodology library, he would say, "Try it on a project, and tell us afterwards how it worked." They would typically object, "But that will take years! It is obvious that this is great!" To my recollection, not one of these obvious new methodologies was ever used on a project.
Since that time, I have used Wayne Stevens' approach and have seen the same thing happen.
How are new methodologies made? Here's how I work when I am personally involved in a project:
· I adjust, tune, and invent whatever is needed to take the project to success.
· After the project, I extract those things I would repeat again under similar circumstances and add them to my repertoire of tactics and strategies.
· I listen to other project teams when they describe their experiences and the lessons they learned.
But when someone sends me a methodology proposal, I ask him to try it on a project first and report back afterwards.
Used Once
The successor to "untried" is "used once." The methodology author, having discovered one project on which the methodology works, now announces it as a general solution. The reality is that different projects need different methodologies, and so any one methodology has limited ability to transfer to another project.
I went through this phase with my Crystal Orange methodology (Cockburn 1998), and so did the authors of XP. Fortunately, each of us had the good sense to create a "Truth in Advertising" label describing our own methodology’s area of applicability.
We will revisit this theme throughout the rest of the book: How do we identify the area of applicability of a methodology, and how do we tailor a methodology to a project in time to benefit the project?
Methodologically Successful Projects
You may be wondering about these project interviews I keep referring to. My work is based on looking for "methodologically successful" projects. These have three characteristics:
· The project was delivered. I don't ask if it was completed on time and on budget, just that the software went out the door and was used.
· The leadership remained intact. They didn't get fired for what they were doing.
· The people on the project would work the same way again.
The first criterion is obvious. I set the bar low for this criterion, because there are so many strange forces that affect how people refer to the "successfulness" of a project. If the software is released and gets used, then the methodology was at least that good.
The second criterion was added after I was called in to interview the people involved with a project that was advertised as being "successful." I found, after I got there, that the project manager had been fired a year into the project because no code had been developed up to that time, despite the mountains of paperwork the team had produced. This was not a large military or life-critical project, where such an approach might have been appropriate, but it was a rather ordinary, 18-developer technical software project.
The third criterion is the difficult one. For the purpose of discovering a successful methodology, it is essential that the team be willing to work in the prescribed way. It is very easy for the developers to block a methodology. Typically all they have to say is, "If I do that, it will move the delivery date out two weeks." Usually they are right, too.
If they don't block it directly, they can subvert it. I usually discover during the interview that the team subverted the process, or else they tolerated it once but wouldn't choose to work that way again.
Sometimes, the people follow a methodology because the methodology designer is present on the project. I have to apply this criterion to myself and disallow some of my own projects. If the people on the project were using my suggestions just to humor me, I couldn't know if they would use them when I wasn't present.
The pertinent question is, “Would the developers continue to work that way if the methodology author were no longer present?”
So far, I have discovered three methodologies that people are willing to use twice in a row. They are
· Responsibility-Driven Design (Wirfs-Brock 1991)
· Extreme Programming (Beck 1999)
· Crystal Clear (Cockburn 2002)
(I exclude Crystal Orange from this list, because I was the process designer and lead consultant. Also, as written, it deals with a specific configuration of technologies and so needs to be reevaluated in a different, newly adapted setting.)
Even if you are not a full-time methodology designer, you can borrow one lesson from this section about project interviews. Most of what I have learned about good development habits has come from interviewing project teams. The interviews are so informative that I keep on doing them.
This avenue of improvement is also available to you. Start your own project interview file, and discover good things that other people do that you can use yourself.
Author Sensitivity
A methodology's principles are not arrived at through an emotionally neutral algorithm but come from the author's personal background. To reverse the saying from The Wizard of Oz, "Pay great attention to the man behind the curtain."
Each person has had experiences that inform his present views and serve as his anchor points. Methodology authors are no different.
In recognition of this, Jim Highsmith has started interviewing methodology authors about their backgrounds. In Agile Software Development Ecosystems (Highsmith 2002), he will present not only each author's methodology but also his or her background.
A person's anchor points are not generally open to negotiation. They are fixed in childhood, early project experiences, or personal philosophy. Although we can renormalize a discussion with respect to vocabulary and scope, we cannot do that with personal beliefs. We can only accept the person's anchor points or disagree with them.
When Kent Beck quipped, "All methodology is based on fears," I first thought he was just being dismissive. Over time, I have found it to be largely true. One can almost guess at a methodology author's past experiences by looking at the methodology. Each element in the methodology can be viewed as a prevention against a bad experience the methodology author has had.
· Afraid that programmers make many little mistakes? Hold code reviews.
· Afraid that users don't know what they really want? Create prototypes.
· Afraid that designers will leave in the middle of the project? Have them write extensive design documentation as they go.
Of course, as the old saying goes, just because you are paranoid doesn't mean that they aren't after you. Some of your fears may be well founded. We found this in one project, as told to us over time by an adventuresome team leader. Here is the story as we heard it in our discussion group:
Don't Touch My Private Variables
A team leader wanted to simplify the complex design surrounding the use of not-quite-private methods that wrote to certain local variables.
Someone in our group proposed making all methods public. This would simplify the design tremendously.
The team leader thought for a moment and then identified that he was operating on a fear that the programmers would not follow the necessary programming convention to keep the software safe. He wanted the programmers to use those public methods only for the particular programming situation that was causing trouble.
He was afraid that in the frenzy of deadlines, they would use them all the time, which would cause maintenance problems. He was willing to try the experiment of making them public and just writing on the team's whiteboard the very simple rule restricting their use.
I said, "Maybe your fears are well founded. How about if you don't just trust the people to behave well, but also write a little script to check the actual use of those methods over time? This way you will discover whether your fears are well founded or not."
The team leader agreed and then went on vacation for two weeks. When he returned, he ran the script and found that the programmers had, in fact, been using the new, public methods, ignoring the note on the whiteboard.
(One person at the table chimed in here, "Well, sure, those were the only documented methods!")
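The story does not say what the team leader's script actually looked like. As a minimal sketch of that kind of usage check, assuming a Java code base and using hypothetical method names purely for illustration, it might be as simple as this:

```python
# Hypothetical sketch of a usage-checking script like the one in the story above.
# The restricted method names and the "src" directory are assumptions for illustration.
import re
from pathlib import Path

# Methods that were made public but were meant to be called only in one special situation.
RESTRICTED_METHODS = ["setInternalState", "writeLocalCache"]  # hypothetical names

def find_restricted_calls(source_root: str) -> list[tuple[str, int, str]]:
    """Scan all .java files under source_root for calls to the restricted methods."""
    hits = []
    pattern = re.compile(r"\b(" + "|".join(RESTRICTED_METHODS) + r")\s*\(")
    for path in Path(source_root).rglob("*.java"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if pattern.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits

if __name__ == "__main__":
    # Print each offending call site so the team can see where the convention was broken.
    for path, lineno, line in find_restricted_calls("src"):
        print(f"{path}:{lineno}: {line}")
```

Run periodically (or on the team leader's return from vacation), a check like this turns "trust the whiteboard note" into something the team can actually verify.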
This story raises an interesting point about trust: As much as I love to trust people, a weakness of people is being careless. Sometimes it is important to simply trust people, but sometimes it is important to install a mechanism to find out whether people can be trusted on a particular topic.
The final piece of personal baggage of the methodology authors is their individual philosophy. Some have a laissez-faire philosophy, some a military control philosophy. The philosophy comes with the person, shaping his experiences and being shaped by his experiences, fears, and wishes.