Agile 2 – On Agile Software Development
Copyright Graham Berrisford. One of several hundred papers at http://avancier.website. Last updated 17/04/2019 10:58
This is the second in a series of papers.
The previous paper traced the development of agile methods from the 1940s.
Here, we can start from the manifesto published in 2001.
The Agile manifesto – 2001
We value individuals and interactions - over processes and tools
We value working software - over comprehensive documentation
We value customer collaboration - over contract negotiation
We value responding to change - over following a plan
An agile software development method is not only iterative, but also flexible about the requirements, the solution and the process being followed.
· It favours ongoing negotiation over up-front planning, and flexibility about requirements.
· It presumes user involvement and feedback is essential to success.
· It encourages early testing for usability and performance.
· It favours short-cycle iterative development.
· It looks to deliver a “minimum viable product” and extend that incrementally.
· It encourages a team to change, rebuild and re-test a software system on a daily basis.
Traditionally, agile methods were designed for a single application development team.
If agile has an underlying theory, it is at least partly a psycho-bio-sociological one.
Agile methods for a software development team draw from socio-cultural systems thinking.
This section is distilled from a source by Brendan Wovchko:
“Scrum and Kanban aren’t competitors, they are experiments every team should try.
I’m willing to bend my rule about prescription and offer some advice on five conditions under which I’ve found Kanban to be a better fit than Scrum.
1. Low Tolerance for Change
Many organizations [resist] change… even if they aren’t getting results.
Their solution isn’t to rethink how they work, it’s usually to make their teams work longer and harder—which isn’t a real solution.
Scrum is a threat to change-averse organizations primarily because of how quickly it transforms roles and meetings.
Kanban doesn’t use transformative change; it embraces evolutionary change.
Kanban employs a “start with what you do now” mindset that introduces teams to the shallow end of the pool before taking them toward the deep end of maturity.
Kanban doesn’t require any changes to roles or meetings when first getting started.
2. Obvious at All Levels
Kanban… is immediately intuitive to anyone.
A Kanban board is an instant sense-making device; it requires zero explanation to understand.
However, Kanban is far more sophisticated than a simple tool for visualization.
Kanban is built for speed but most teams will never master the behaviors that produce those results because their commitment to learning Kanban stops at visualization.
3. Fluid Priorities
Scrum produces the best results when a team commits to a batch of work - and is empowered to remain focused for the duration of their iteration.
Scrum requires informed and empathetic stakeholders who are bought-in to being agile.
Kanban is able to survive conditions where agile culture doesn’t yet exist because it encourages optionality.
A Kanban team prefers not to commit to work in batches, and doesn’t commit to work until starting it.
This means a Kanban team can… respond to emergencies or changing priorities without needing to renegotiate commitments.
4. Small Teams
The ideal size of a Scrum team is a “two-pizza” team.
This ensures that a team is small enough to be efficient and large enough that the time investment in meetings makes sense.
If your team is smaller, Kanban is the best option.
5. Complex Collaboration
My favorite attribute of Scrum is the cross-functional team… comprised of people from different departments and disciplines.
It’s not just engineers and testers anymore; today, we have user experience, visual design, writing, editing and many other activities.
If your team has a large number of activities, the strategies of sprinting ahead or using a scaling framework may introduce unnecessary complexity and delay.
Kanban doesn’t utilize time boxing to create predictability, it uses lead time—so it is capable of sustainably supporting an unlimited number of activities and collaborations.
I’d continue to encourage you to resist the idea that Scrum and Kanban are competitors or enemies.
Both are a means to help teams and their stakeholders achieve sustainable success.
Don’t assume that whichever you best understand or most frequently use is superior.
Adopt a true agile mindset and experiment with both!”
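Two ideas in the passage above, work-in-progress (WIP) limits and lead time, can be made concrete with a minimal sketch. The column names, limits, tickets and timestamps below are invented for illustration, not taken from any particular Kanban tool.

```python
from datetime import date

# A minimal Kanban board: columns with work-in-progress (WIP) limits.
# Column names and limits are illustrative assumptions.
class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                      # column -> max cards
        self.columns = {name: [] for name in wip_limits}  # column -> cards

    def add(self, card, column):
        # A card may only enter a column with spare WIP capacity; this
        # "pull" constraint is what makes the board more than a picture.
        if len(self.columns[column]) >= self.wip_limits[column]:
            raise RuntimeError(f"WIP limit reached in '{column}'")
        self.columns[column].append(card)

    def move(self, card, src, dst):
        self.columns[src].remove(card)
        self.add(card, dst)

board = KanbanBoard({"To do": 5, "In progress": 2, "Done": 100})
board.add("Fix login bug", "To do")
board.move("Fix login bug", "To do", "In progress")

# Lead time: elapsed time from starting an item to finishing it;
# the predictability measure Kanban uses instead of time boxing.
tickets = [
    {"id": "T-1", "started": date(2019, 4, 1), "done": date(2019, 4, 4)},
    {"id": "T-2", "started": date(2019, 4, 2), "done": date(2019, 4, 8)},
]
lead_times = [(t["done"] - t["started"]).days for t in tickets]
print(sum(lead_times) / len(lead_times))  # average lead time in days
```

Trying to move a third card into “In progress” here would raise an error, which is the point: the limit forces the team to finish work before starting more.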
Gurus in agile development and architecture frameworks promote general principles.
It is important to recognise where principles can be in conflict.
A theme here is that balancing tradeoffs is more important than following uni-directional principles.
For many decades, general system design principles have included:
· Decomposition: modularise a large monolithic system into subsystems.
· Decoupling: strive for tight cohesion within a subsystem and loose coupling between subsystems (Constantine, 1968).
The aim of these principles is to facilitate the design, management and change of subsystems.
Taken too far however, both principles have negative effects, and designers must strike a balance between extremes.
Simple subsystems or simple messaging?
This table contrasts the qualities of dividing a system into larger or smaller subsystems.
Decomposing a system into small, simple subsystems creates complexity in the structure of the system.
It increases the volume and complexity of inter-subsystem messaging.
There are many different ways to implement messaging, ranging from tightly to loosely coupled.
Size matters in the sense that smaller subsystems typically merit tighter coupling than larger subsystems.
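The range from tightly to loosely coupled messaging can be sketched with two extremes: a direct synchronous call at one end, and an asynchronous message queue at the other. The subsystem names below are invented for the example.

```python
from queue import Queue

# Tightly coupled: the caller invokes the billing subsystem directly,
# knows its interface, and waits for the result.
def bill_directly(order_id):
    return f"invoice for {order_id}"

result = bill_directly("order-1")

# Loosely coupled: the caller only posts an event; the billing subsystem
# consumes it later, and neither side need know the other's internals.
events = Queue()
events.put({"event": "order_placed", "order_id": "order-1"})

def billing_consumer(queue):
    msg = queue.get()
    return f"invoice for {msg['order_id']}"

print(result, "|", billing_consumer(events))
```

The direct call is simpler and faster; the queue lets the two subsystems be changed, deployed and scaled independently, at the cost of extra moving parts.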
Local agility or global disintegrities and delays?
Note first that decoupling is not a single concept - our architect classes cover a dozen or more ways to decouple subsystems.
In “Applying UML and Patterns”, OO thinker Craig Larman advised us to pick our battles.
“It is not high coupling per se that is the problem; it is high coupling to elements that are unstable in some dimension, such as their
· interface [the list of processes/services that are provided or required]
· implementation [internal procedures, physical features or technologies]
· mere presence [availability when needed]”
“If we put effort into “future proofing” or lowering the coupling when we have no realistic motivation, this is not time well spent.”
Skillful decoupling of subsystems should help people manage and change each subsystem on its own.
But increasing the agility of small subsystems can make it harder to meet higher/wider goals.
E.g. a principle of “microservices” is to “decentralise data management” by dividing one coherent database structure into smaller ones.
The aim is to divide the work of application development into smaller applications, to be developed by relatively autonomous development teams.
In the right circumstances this is a good approach, but it can lead to downsides.
Generally speaking, designing a system to be flexible can lead to complexities, delays and disintegrities.
Agility or simplicity?
To make a system more agile usually requires a redesign that makes the system more complex.
Setting out to build an agile system adds time and cost to development.
The agile system principle “design for future flexibility” can be contrary to the agile development principles “keep the system simple” and “you ain’t gonna need it”.
Agility or speed?
An agile system is usually slower than a rigid one.
E.g. shifting business rules from procedural instructions into data variables (that end users can change) tends to make a system slower.
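That tradeoff can be sketched: the same rule hard-coded as a procedure, and held in a data table that end users could edit. The rule and figures below are invented; a real system would pay a database read on every lookup as well.

```python
# Hard-coded rule: fast, but changing it means changing and redeploying code.
def discount_hardcoded(order_value):
    if order_value >= 100:
        return 0.10
    return 0.0

# Rule as data: end users could edit this table without a code change,
# but every call now pays for a lookup over the rule entries.
discount_rules = [  # (threshold, rate), highest threshold first
    (100, 0.10),
    (0, 0.0),
]

def discount_from_data(order_value, rules=discount_rules):
    for threshold, rate in rules:
        if order_value >= threshold:
            return rate
    return 0.0

print(discount_hardcoded(150), discount_from_data(150))
```

Both return the same answer today; the difference is who can change the rule tomorrow, and how much each call costs.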
Agility or integrity?
Continual improvement at a microscale risks disintegrity at a macro scale.
E.g. we change our digital course materials during a course.
Inevitably, our printed course material, digital course material and web site get out of step.
The best we can promise customers is “eventual consistency” – probably.
In other businesses (e.g. money handling, safety-critical) data integrity can be mission critical.
So, decoupling subsystems and allowing inconsistency between them can be dangerous.
Vitality Chicago has posted some Standish Group analysis of Agile vs. Waterfall.
The numbers favour agile development.
But there is a problem with reports that compare projects which are not like for like.
People naturally use agile methods on "easy" projects, especially in a product development or maintenance-and-extension context; indeed, some agile methods were reverse engineered from successes in those contexts.
All is well given flexible requirements for a system that can start out small and simple.
By contrast, people are forced towards a more "waterfall" approach on more “difficult” projects: when given rigid requirements for a large and complex system that must be complete and right first time, or when the application requires a large and complex database structure and/or data migration, as is sometimes the case in a legacy replacement situation.
Architects need a score chart to assess projects up front, because a project that scores highly on the measures below is inherently difficult, and will probably need substantially more time and budget.
Obstacles to agility include:
1. Most/all requirements are mandatory.
2. Users/domain experts are not readily available.
3. Developers are not empowered to make decisions.
4. The system’s first release must be large (safety-critical, money-handling, legacy replacement, or regulatory compliance).
5. The complexity is server-side rather than UX side.
6. The complexity is in input and update rather than output and report.
7. The business rules are complex and/or money-handling.
8. The processing is batch rather than transactional.
9. The database schema is immature and evolving.
10. There are many integration points with other systems.
11. There are exceptional speed or volume requirements.
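The obstacle list above could be turned into the score chart the author calls for. The sketch below assumes equal weights and an arbitrary threshold; a real assessment would weight the obstacles and calibrate the threshold against past projects.

```python
# Score a project against the eleven obstacles listed above.
# Equal weights and the threshold of 6 are assumptions for illustration.
OBSTACLES = [
    "all requirements mandatory",
    "users/domain experts unavailable",
    "developers not empowered",
    "large first release required",
    "complexity is server-side",
    "complexity in input/update",
    "complex or money-handling rules",
    "batch processing",
    "immature, evolving database schema",
    "many integration points",
    "exceptional speed/volume requirements",
]

def difficulty_score(answers):
    # answers: dict mapping obstacle -> True/False
    return sum(1 for obstacle in OBSTACLES if answers.get(obstacle, False))

answers = {obstacle: False for obstacle in OBSTACLES}
answers["many integration points"] = True
answers["complex or money-handling rules"] = True
score = difficulty_score(answers)
print(score, "=> inherently difficult" if score >= 6 else "=> agile-friendly")
```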
Building on a sound data structure
Many business applications are built on top of a large persistent data structure that records entities and events of interest to business people.
Agile system development proceeds best when the structure of this state data (this memory) is stable.
So, it helps to get the data structure as complete and right as possible before coding – to minimise refactoring later.
It also helps to implement the logical data structure as directly as possible, to minimise the complexity and processing overhead of any data abstraction layer.
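Implementing the logical data structure directly can be sketched with SQLite: one table per entity type, foreign keys mirroring the logical relationships, and no extra abstraction layer. The entity and column names are invented examples.

```python
import sqlite3

# Implement the logical data structure directly: one table per entity type,
# no data abstraction layer in between. Names are invented examples.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customer (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE sales_order (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id),
        placed_on   TEXT NOT NULL
    );
""")
db.execute("INSERT INTO customer (id, name) VALUES (1, 'Acme Ltd')")
db.execute("INSERT INTO sales_order (id, customer_id, placed_on) VALUES (1, 1, '2019-04-17')")

# Because tables mirror the logical model, queries read like the business domain.
rows = db.execute("""
    SELECT c.name, o.placed_on
    FROM customer c JOIN sales_order o ON o.customer_id = c.id
""").fetchall()
print(rows)  # [('Acme Ltd', '2019-04-17')]
```

A mapping layer between the logical and stored structures buys flexibility, but, as above, every layer adds complexity and processing overhead that a direct implementation avoids.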
How much model-driven engineering?
Why didn’t multi-level model-based systems engineering take off?
There are many reasons, but here, the challenge is the amount of documentation it requires.
Every documentation level is an overhead; every transformation between levels of documentation is an overhead.
The closer the levels are to each other, the more levels are needed, the bigger the documentation overhead, and the less value added by each transformation.
The further apart the levels, the less the documentation overhead, the more valuable but also difficult and costly a transformation.
Model-driven engineering is intended to improve quality and verifiability or auditability.
But user acceptance testing is impossible until the bottom-level executable description is complete.
At which point, the users may clarify or change their minds about what they really wanted the system to do.
So, the agile development principle is to maintain only high-level or “lean” architectural documentation.
It is impossible to get all the advantages of both approaches; you have to find the right balance for each situation.