How happy are your users or customers with the software you have delivered recently? The statistics across our industry are horrifying – a Standish Group Chaos survey of 8000+ projects in 1998 found that 28% of projects are failures, 46% are challenged and only 26% are considered successful. This is an improvement: in 1994 the failure figure was 40%. This type of return on investment in other industries would probably result in malpractice claims.
What creates customer satisfaction? Let’s call it software quality, something I believe is sadly lacking in many of the systems developed today.
The trouble is that there is no authoritative definition of software quality, which is a broad concept with many components. Perhaps this is symptomatic of the problem – not enough attention has been paid to defining what is required and specifying it in an unambiguous manner.
Both the ISO and IEEE have definitions that refer to various “quality attributes”. Robert Glass, an authority in the field, says software quality is made up of a number of “ities”, including: functionality, reliability, understandability, usability (human engineering), modifiability, testability, portability, efficiency. Others have added characteristics such as flexibility and modularity.
Let’s look at functionality. By this I'm asking whether the software does what is required of it.
The Chaos survey found that of the eight main reasons given for project failures, five are requirements-related. Getting the requirements right is probably the single most important thing we can do to achieve customer satisfaction.
I recall a number of conversations with customers in which they said things like “that’s what I asked for, but now that I see it, it’s not what I want”.
The level of detail and rigour in requirements-gathering can be considered a scale with two extremes – on the one hand you have systems that have high criticality and exposure (medical monitoring equipment, rail service scheduling) and on the other are ad hoc – generally small – systems with minimal risk and exposure (an incorrect address format in the mailing list of the local amateur dramatic society). The requirements gathering and documenting process applicable to a specific project will fall somewhere on the scale between these two extremes.
If you are building a mission-critical, safety-critical system, you must have a tightly defined and documented requirements-gathering process, where every element of functionality is explained with a NASA-like level of rigour that will provide certainty and assurance – and possibly even legal backing – that what gets built is what is actually needed. But even with high levels of process and rigour, some monumental blunders have occurred (there is a difference between inches and centimetres).
A less critical system can have the requirements documented in a more flexible manner, and often development can start from somewhat loosely defined requirements. But the customer or specialist will need to make themselves available throughout the requirements-gathering, design and development phases of the project to provide immediate (or nearly so) documented answers to questions that arise from the loose or ambiguous requirements statements.
Where this process has the potential to fail is when it is not documented clearly enough during the prototyping/build phase. If a requirement is not documented (even as a simple bullet point), it cannot be tested and verified when the system is delivered, and becomes a potential point of contention. Design and build for testing and verification.
Don Mills, the resident quality and testing guru at Software Education, says a requirement that is not specific cannot be considered a specification; it is an “ification” – open to interpretation and ambiguity. If a requirement is not clearly and precisely stated, then the designer and/or developer must decide what is wanted. These people are skilled IT professionals, not business subject matter experts, and the probability is fairly high that they will not interpret an ambiguous or unclear requirement in the same way as the subject matter expert – potentially leading to conflict and eventually to an unhappy customer. And that is assuming, of course, that you have delivered a system that actually meets the requirements.
Hastie is a Wellington-based trainer in systems analysis and design and a practising software developer with over 20 years in the industry. Send letters for publication in Computerworld to Computerworld Letters.