Software Craftsmanship: Engineering by Coincidence

I was extremely disappointed to read a recent article on Coding Horror reflecting on an IEEE Software editorial written by Tom DeMarco. If you have not already, please read Tom DeMarco’s article now. It’s only two pages and it’s well written.

With all due respect, Tom DeMarco is wrong.

And Jeff Atwood made things worse.

According to Atwood’s interpretation of DeMarco, since we can’t control software projects, there is no sense in trying to engineer software. As Atwood puts it:

> What DeMarco seems to be saying -- and, at least, what I am definitely saying -- is that control is ultimately illusory on software development projects. If you want to move your project forward, the only reliable way to do that is to cultivate a deep sense of software craftsmanship and professionalism around it.
Atwood’s conclusion simply is not supported by DeMarco’s article. DeMarco made two points in his piece:

  1. We don’t have as much control over software as we think we do -- even when we can measure the software on which we work.
  2. We should be focusing more on the upfront "conception" activities than on the area that currently receives the most attention: construction.

My interpretation is that "conception" activities are things like requirements, architecture, and design -- details that ultimately help you figure out whether it makes sense to build the thing you think you want to build. By framing DeMarco’s argument as "craftsmanship" vs. "engineering," Atwood misses the whole point and reopens the tired art-versus-engineering debate. What Atwood overlooks is that DeMarco never questioned the idea that software should be engineered.
DeMarco writes:

> I’m gradually coming to the conclusion that software engineering is an idea whose time has come and gone. I still believe it makes excellent sense to engineer software. But that isn’t exactly what software engineering has come to mean. The term encompasses a specific set of disciplines including defined process, inspections and walkthroughs, requirements engineering, traceability matrices, metrics, precise quality control, rigorous planning and tracking, and coding and documentation standards. All these strive for consistency of practice and predictability.
DeMarco is really saying that the engineering part of software engineering has become overshadowed by a collection of best practices for building software. In my mind this isn’t necessarily a bad thing. All it means is that what has become known as "software engineering" is different from the original definition intended by the 1968 NATO Conference on Software Engineering.

But by discounting current software engineering practices, DeMarco dismisses the real engineering that went into advancing the field to where it is today.

DeMarco seems to imply that what we really want software engineering to be -- the application of systematic, disciplined, quantifiable approaches to the development of software -- and what software engineering has become cannot coexist. Essentially, to reach a state where quantifiable approaches -- metrics and measures -- are used correctly and consistently by the software development community, we must stop using the term "engineering" to describe the current set of practices.

This is backwards thinking.

Engineering is more than something you do; it’s also a way of thinking about problems and solutions. Reaching the point in software engineering that we are at today required systematic, disciplined, and quantifiable thinking. Over time, the results of this thinking have been codified into the set of best practices that most developers now take for granted.

For example, we know that fixing a defect discovered late in the software lifecycle can cost 100x or more than fixing the same defect discovered early. We know that certain practices can effectively remove defects at different costs and at different points in the lifecycle (for example, inspection vs. prototyping vs. unit testing vs. system testing). We also know that historical data is an excellent indicator of future performance on software projects.
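To make the cost-of-discovery gap concrete, here is a rough sketch. The phase multipliers below are hypothetical, chosen only to mirror the commonly cited 1x-to-100x spread; the point is how quickly late discovery dominates total rework cost:

```python
# Hypothetical relative costs of fixing one defect, keyed by the phase
# in which it is discovered (mirroring the oft-cited 1x..100x spread).
FIX_COST = {"requirements": 1, "design": 5, "unit_test": 20, "production": 100}

def rework_cost(defects_by_phase):
    """Total relative rework cost for defects found in each phase."""
    return sum(FIX_COST[phase] * count for phase, count in defects_by_phase.items())

# The same 10 defects, discovered early vs. late in the lifecycle:
early = rework_cost({"requirements": 7, "design": 3})   # 7*1 + 3*5  = 22
late = rework_cost({"unit_test": 4, "production": 6})   # 4*20 + 6*100 = 680
print(early, late)
```

With these invented multipliers, the same ten defects cost roughly thirty times more to fix when they slip through to testing and production.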

Systematic, disciplined, quantifiable thinking was required to make these discoveries.

Because of these codified best practices, it is not always necessary to conduct experiments on a project to trust that they are working. I know unit testing combined with regular system integrations will flush certain defects from my software before those defects become a problem during system testing. I know that statistical analysis of collected task tracking data will help me better predict how long future tasks of a similar size will take. It doesn’t matter whether I completely understand the engineering behind the practice or whether I simply follow the process or use the tool. The benefits will be the same.
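The estimation claim above can be sketched in a few lines. This is not PSP’s actual PROBE procedure -- just a plain least-squares fit of actual hours against estimated size, with every number invented for illustration:

```python
# Hypothetical task history: (estimated size in LOC, actual hours) per past task.
history = [(100, 4.0), (250, 9.5), (400, 16.0), (600, 23.5)]

def fit_line(points):
    """Ordinary least-squares fit: hours ≈ b0 + b1 * size."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in points)
          / sum((x - mean_x) ** 2 for x, _ in points))
    b0 = mean_y - b1 * mean_x
    return b0, b1

b0, b1 = fit_line(history)
estimate = b0 + b1 * 300  # predicted hours for a new task estimated at 300 LOC
print(round(estimate, 1))
```

Even this naive fit turns "tasks of a similar size took about this long before" into a defensible forecast; the real PSP adds prediction intervals so you also know how much to trust the number.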

Does that make me less of an engineer? I don’t think so.

Using best practices codified as processes, methods, or tools on a software project means you are engineering software whether you like it or not. With many of these practices, the control mechanisms are already built in so you don’t realize that you’re already controlling your project. As DeMarco points out, it simply isn’t necessary that every engineering detail be painstakingly scrutinized for a project to be successful. For many projects, the essence of the project is sufficient to overcome the accidents encountered when engineering by coincidence.

But engineering by coincidence doesn’t make you a software craftsman. To prove it, I’m calling Jeff Atwood out. Jeff, I dare you and the Stack Overflow team to take the PSP Challenge. Take a course on the Personal Software Process, honestly give it a try -- use actual software engineering for a few weeks -- then tell me that software engineering is dead. But don’t knock it until you’ve tried it.

