Quality is Conformance to Requirements

by Alan S. Koch, PMP

What is your definition of "quality"? Many people say that a system is of high quality when it meets the needs of its users. But the needs of the users are rather difficult to monitor, and even more difficult to measure. So we in the software industry prefer to measure quality in terms of conformance to requirements.

The requirements specification is a lot easier to deal with than are people's needs. The spec can be read and re-read, and it always says the same thing. It is predictable and consistent, and I can count on it to be there when I need it. Sadly, these things cannot be said about people, and are even less applicable to those people's needs!

But there is one very large and glaring problem with defining quality in terms of the specification. This problem is so significant and so large that you can drive the wrong system through it. This problem is that the specification just might not be right! Anything that people produce may be defective, and that includes the specifications that they write. And if the specification is wrong, then how can we say that a system that conforms to it is "high-quality"?

There is no denying the fact that the quality of our requirements specifications is critically important. After all, everything we do on our projects—absolutely everything—is driven by it. We plan to produce what it describes. We devise an architecture that facilitates it. We design and code to it. And we test to see if the system does indeed satisfy it.

But what is the reality that we experience? If we're lucky, the change requests start flowing early in the project. We haven't even finished the design, and they have already changed their minds! How can we ever be expected to hit a moving target? Why can't they make up their minds? (And if we are unlucky, we finish building the product before the customer gets around to telling us that it's not what they wanted!) No matter where I go, programmers always have the same complaint. When I ask for the biggest problems they must deal with, their customers' changing requirements is always among the top five. Invariably!

But when I talk to those customers, I get a different story. In their minds, actual changes to the requirements are rare. Most of those things that are written in "change" requests do not represent a change in what they need; rather they represent a change that must be made to the system in order to bring it closer to what they actually need. They haven't changed their minds; the developers got it wrong—again!

So, who is lying?

No one is lying. What we are seeing is a breakdown in communication. When the systems folks and the customer were discussing the requirements, they stopped before they had reached a meeting of the minds. Communication is not merely a matter of people throwing words and symbols at each other. The purpose of those words and symbols is for each of the participants to build in his or her mind the meanings that the others have in their minds. Until an actual sharing of meaning has taken place, a meeting of the minds has not yet been achieved.

So, what does "meeting of the minds" have to do with constructing high-quality requirements? Everything! If we agree that a requirements spec that fails to reflect the customers' true and complete needs is defective, then a requirements process that does not bring about a meeting of the minds is equally defective. In other words, requirements defects are caused by a defective requirements process.

If we are to learn to avoid these sorts of problems, we must address and correct our defective requirements process. In order to know what is wrong with our process, we must pay attention to the underlying causes of the failures. There are many reasons why our communication processes might be breaking down, so we first need to identify the nature of the problem.

Perhaps the problem lies in the terminology they use. Because they are subject matter experts in their specialized domain and we are not, we may be misinterpreting their words and phrases. We may apply generic meanings when a more specialized meaning is intended. In these situations, we may believe we understand each other when, in fact, the same words are producing different meanings in our minds. By the same token, our use of specialized software terminology may result in misunderstanding on their part because they apply generic meanings to specialized words.

These types of misunderstandings can be discovered and corrected early if our requirements elicitation process includes the practice of feeding back meaning, rather than merely parroting words. If we capture the words our customer uses, write them down, then get them to review what we wrote, we are doing nothing more than ensuring that we got the words right; it will do nothing to ensure that our meanings are accurate. But if we feed our interpretations of their words back to them in a different form (maybe pictures instead of words), or at least in different words, then a disconnect in the meaning is much more likely to be detected.

Perhaps the problem lies in assumptions they are making. Their familiarity with the things they do and how their business process works can result in some things being so common or so obvious to them that they don't think those things need to be said. They might not even think to mention them, or some things may have become so automatic that they don't even notice them anymore. Most of us, if asked to describe how to drive a car, would fail to mention the fact that we re-position the steering wheel every few seconds to maintain our position in the traffic lane. That action is so automatic that we are generally not conscious of it. When this happens, critical information may be totally missing from the minds of the developers, and communication has broken down.

This is a much harder problem to solve, because neither party may realize that an understanding gap exists. Feedback techniques like those described above may be helpful in detecting these omissions, or they may not. These omissions can be reliably detected only when the customer has the opportunity to examine what is being built. So our process may need to introduce those opportunities much earlier in the development process. Instead of waiting until the product is complete, having the customer review designs, prototypes, or mock-ups will give them the opportunity to detect missing functionality before development has progressed too far.

There are many other ways in which communication may fail; the preceding examples represent two of the most common. You must examine the requirements problems you are experiencing on your projects and track them down to their root causes. Once you understand where and why the breakdown took place, you are in a position to do something about it. Having identified the part of your process that is defective, you can take steps to correct that defect.

A final word of caution: The defective part of the process is not the customer (any more than it is you)! The purpose of any process is to make the people who do the work effective. If things are breaking down, it is the fault of a process that is failing to properly support the people who are doing the work. Examine the process and determine how to fix it. The only people-fix that may be necessary is training for someone who lacks the necessary knowledge or skills.

Requirements elicitation is the most challenging part of most development projects. It is the part where we must deal with those troublesome human interfaces. (Not the interfaces between people and the computer, but our own interfaces with other people.) But like any other challenge, well-engineered processes can help us master it, and achieve a truly high-quality product that meets not only the specification, but also the customer's need.

©Copyright 2000-2018 Emprend, Inc. All Rights Reserved.