Value working software over comprehensive specifications
This is published under the terms of the licence summarized in the footnote.
All free-to-read materials on the Avancier web site are paid for out of income from Avancier’s training courses and methods licences.
If you find them helpful, please spread the word and link to the site in whichever social media you use.
EA frameworks encourage people to draw up and maintain comprehensive plans and specifications.
Agile software developers discourage them (and encourage reliance on developers).
Extremists in each school (say, John Zachman and Scott Ambler) hold irreconcilable positions.
The challenge for our profession is to work out where the best compromise lies.
That's what makes a study of these matters interesting.
This paper discusses one of the four value statements in the Agile Manifesto – the one in the title.
Does agile development preclude model-driven development?
One of the earliest agile principles was “Code and tests are the primary deliverables and measure of progress”.
Which is to say:
· We focus on program code rather than on specifications
· We focus on tests rather than on specifications
Waterfallists disapprove: they say fixing a problem in requirements analysis is N times cheaper than fixing a problem in development.
Their argument goes: “The cost of code defects is only about 10% of the cost of failure.
About 35% of the cost of failure is due to design defects.
About 55% is due to requirements defects.
So, don't spend as much on code testing as on reviewing design and requirements first.”
Our “Software is not hardware” paper explains why agilists draw exactly the reverse conclusion from the same statistics about test-revealed defects.
Agilists say that completing detailed requirements specifications makes for a rigid process that puts walls between users and developers.
And, given the ever-changing nature of a business and its software, it is doomed to fail.
Agile methods excel where requirements are unclear, or where the instruction is simply make it “faster, cheaper, better”, no matter how.
But even where requirements can be detailed, agilists are against weighty requirements documentation.
Look at what maintenance teams do.
They must maintain the solution configurations used in development and testing.
Few of them maintain the original requirements configuration.
They find the requirements specification pyramid that helped during elaboration becomes less helpful once the system is live.
Maintenance teams find at least some of the early specification and design artifacts to be an overhead during change management.
Agilists say to minimize the weight of documentation.
Their working assumption is that you should not detail every high-level requirement through lower-level requirements specification and design documents.
Look at what happens during development.
Agilists observe that stakeholders and business users often cannot state their requirements completely or correctly.
And on the IS/IT side, analysts, architects and coders don’t always do their job well either.
A requirements configuration is both incomplete and flawed.
A requirements configuration is never as fully detailed as a solution configuration.
Also, the requirements probably contain more flaws than the solution, because requirements are not executable and testable.
It is better to minimise the volume of requirements documentation, and to minimise the weight of specification and design artifacts between the top and bottom of the pyramid.
Even when a weighty hierarchical requirements specification pyramid is called for
(and it may be essential in a large project, during elaboration, during the journey from high level to low level)
the higher levels of the requirements specification pyramid get overtaken by lower levels, and the requirements are overtaken by the solution.
Once the software is coded, built and tested, the value of mid-level specification and design artifacts diminishes.
It isn’t just that some prototypes, design sketches and test cases are redundant once the built system is running.
Or that some detailed specification and design notes are transferred into comments in code or in test cases.
It is that the situation changes dramatically as soon as the customer can see the inputs and outputs of the running solution.
The customer loses interest in the requirements catalogue and presents requirements as change requests to the built solution.
Developers and testers turn these change requests directly into changes to code and tests.
But agile development shouldn’t mean no logical models are maintained.
Agilists talk a lot about the importance of a flexible approach to requirements.
But in the main, they talk about functional requirements.
They don’t talk so much about the non-functional and legal requirements.
These requirements are, by their very nature, more rigid.
And they are the requirements the architect must focus on.
Even in the most agile project, it is a good idea to reverse-engineer a requirements catalogue from the ongoing work –
because it gives you a vehicle for communicating with customers and users, for example, to help you discuss priorities, cost and effort projections.
The larger the project, the more people need specifications that are more abstract than code and tests.
There is a balance to be struck between working on a model or design to get it right and incremental development.
The dialogue below illustrates the point with a short extract from long and wide-ranging discussions with a leading agilist.
Agile developers sit down with users and start coding almost immediately.
They stop frequently to evaluate progress and make necessary changes based on user input without the need for a substantial requirements specification pyramid.
“Graham: Suppose you agilists gathered up all user stories and notes about business rules - iron out the duplications and the conflicts - and collect them into a coherent glossary.
Won’t you find yourself building a domain/data model?
Are agilists against this form of documentation?
Agilist: Agilists are against spending so much time writing and refining text that software doesn't come out of the project in a timely way.
That really is the genesis of the agile movement.
Agilists got fed up with watching projects run for 2 years, collecting business rules, and then getting canceled because no one delivered any software.
Graham: Sure. My sympathy is with you there. Though I have seen sequential projects succeed.
Over a few months of requirements analysis I completed a domain/data model then handed it over to development.
The model sailed through development with no significant change over many more months.
The up-front effort was much appreciated by the development team, since it eliminated the need for iterative restructuring.
The team pored over notes I left on business rules - found those a valuable knowledge source.
Agilist: The agile approach would be to gather as many business rules as needed to make the first delivery –
opinion differs on whether after 1 week, 1 month or 1 quarter, but in that time range; and then move on to the next set of business rules.
It's called "incremental development" and it's been documented as a best practice since the early 1980s. Agilists just make a big fuss about it.
Graham: The fuss started over here about 1994, when DSDM was getting off the ground. I am convinced incremental development is the most cost effective approach.
Agilist: At the end, you may have the same domain model, and even documented in exactly the same way, but it was constructed in parallel with delivering software.
Graham: Yes, often. Probably most times. And sometimes you end up with a lash-up.
The first implemented data model is simply wrong - some relationships are missing for example.
Changing the database schema (or rather changing the test data and the programs) is tedious and costly.
So people work around the bad structure. And live with the pain for years. Sometimes forever. Done properly, models reduce such pain.”
There is sometimes a presumption that the quality of a software product can be assessed by visual inspection of the user interface, which is true of some but not all applications.
Also, agilists like to assume that all team members are highly skilled and multi-skilled, and that reading the code comes naturally.
In practice, stakeholders often need to discuss reader-friendly descriptions of the business rules that are to be coded.
And it is reasonable to maintain some architecture-level models in complete detail - most obviously, a data model.
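As an illustration of a data model kept in complete detail alongside the code, here is a minimal sketch in Python. The order-processing entities and the business rule are invented for this example, not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical domain/data model, for illustration only.
# Entities and relationships would normally be agreed with
# stakeholders and maintained in step with the code.

@dataclass
class Product:
    sku: str
    name: str
    unit_price: float

@dataclass
class OrderLine:
    product: Product   # each order line references exactly one product
    quantity: int

@dataclass
class Order:
    order_id: str
    lines: List[OrderLine] = field(default_factory=list)

    def total(self) -> float:
        # Derived value: the model makes the business rule explicit.
        return sum(line.product.unit_price * line.quantity for line in self.lines)
```

Expressed this way, the relationships (an order has many lines; a line references one product) are as visible to developers as they would be on an architecture-level diagram.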
There are different ways to improve requirements capture.
You can work harder at requirements analysis, complete a deeper hierarchical requirements specification pyramid and put more effort into quality reviews of documentation.
Or you can adopt a more agile approach.
Each approach can work; each has sometimes failed; neither guarantees a project will end in success.
The right answer depends on your circumstance.
In practice, you usually have to do a bit of both and find the right balance between them.
This section discusses one increasingly popular agile approach to requirements capture, which is to design and code your tests before your software.
TDD means more than defining test cases and test data before development. It means coding the tests.
The resulting tests are sometimes called executable specifications.
TDD is an iterative method: code the first test, code the first bit of program, code the second test, code a bit more program, and so on.
So the project moves forward in very short test-code cycles. A bank of regression test data is built up.
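A minimal sketch of such a test-code cycle, using Python’s standard unittest module. The discount function and its business rule are invented for illustration only.

```python
import unittest

# Step 1: code the first test, before any program code exists.
class DiscountTests(unittest.TestCase):
    def test_no_discount_below_threshold(self):
        self.assertEqual(discount(99), 0)

    # Step 3: code the next test, driving out more behaviour.
    def test_discount_at_threshold(self):
        self.assertEqual(discount(100), 10)

# Step 2: code just enough of the program to make the tests pass.
def discount(order_value):
    # Invented rule: 10% discount on orders of 100 or more.
    return order_value // 10 if order_value >= 100 else 0

# The accumulated tests form a regression bank, rerun on every cycle.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each cycle adds one test and just enough code; the growing suite is the regression test bank the text describes.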
Q) Do you have any more of a chance to write an accurate test than you do to write accurate requirements? Yes.
Textual requirements statements are abstract, imprecise and ambiguous.
Test cases and test data are concrete, precise, unambiguous and 'executable'.
TDD is highly iterative: test case - code - test case - code.
So the majority of the test and the code are accurately aligned at any point in time.
Requirements are uncovered through incremental solution development, and software designs are refactored to minimize the code written.
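The precision point can be shown with a small example. Suppose a text requirement says “round amounts to the nearest penny”: it does not say which way exactly half a penny goes. A coded test pins the rule down. The half-up choice below is an assumption made for illustration.

```python
from decimal import Decimal, ROUND_HALF_UP

# The prose requirement "round to the nearest penny" is ambiguous
# about exact halves; this coded rule (half-up, an assumed choice)
# removes the ambiguity.
def to_pennies(amount: str) -> Decimal:
    return Decimal(amount).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Executable specification: precise where the prose was not.
assert to_pennies("2.345") == Decimal("2.35")   # half a penny rounds up
assert to_pennies("2.344") == Decimal("2.34")
```

The test data answers a question the text requirement left open, which is why agilists call such tests executable specifications.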
Q) Is TDD about unit testing or system testing? TDD doesn’t map well to either unit or system testing in a waterfall method.
It is a different approach, with something of both unit testing and system testing about it:
Is TDD about unit testing?
TDD does involve development tools and skills.
And some developers use TDD for unit and link testing alone.
But users and requirements analysts can provide test cases for TDD.
And some agilists speak of using the TDD test cases in regression tests.
Is TDD about system testing?
Test cases are written before the code - to express (and incrementally develop) the requirements.
The test cases are called "executable requirements".
But TDD doesn’t map well to system testing in a waterfall method, because of its need for tests to be coded.
Reservations: Defining test cases and test data as early as possible is surely right.
And customers should help suppliers to do it.
However, there are some questions about TDD that need to be addressed:
· How far does it reduce the need for weighty requirements specification?
· How does it integrate with higher levels of system testing, and regression testing?
· How does it work on a large project with several banks of test data?
· How does it square with swift change management?
· How far does TDD increase the volume of test cases and test data in the configuration that needs to be managed?
See our paper on test-driven design.
Footnote: Creative Commons Attribution-No Derivative Works Licence 2.0 07/02/2015 15:18
Attribution: You may copy, distribute and display this copyrighted work only if you clearly credit “Avancier Limited: http://avancier.website” before the start and include this footnote at the end.
No Derivative Works: You may copy, distribute, display only complete and verbatim copies of this work, not derivative works based upon it.
For more information about the licence, see http://creativecommons.org