Structured modular design (1970s and 80s)

This page is published under the terms of the licence summarized in the footnote.

 

Abstract

This paper reviews some of the structured modular design ideas that preceded OO design, and anticipated it.

It was initially written to document points raised in a conversation with Anja Fiegler, along with papers on System Complexity and Agility at http://avancier.website.

Contents

Software architecture as modular design and integration (recap)

On system modularity - Gouthier and Pont (1970)

Stepwise refinement - Niklaus Wirth (1971)

Declarative specification

Encapsulation of complexity – Parnas

Structured program design

Structured Analysis and Design Technique - D. Ross

Structured Design - Yourdon and Constantine

Structured database design

Structured analysis in Information Engineering

Structured Systems Analysis and Design (SSADM)

Remarks

 

Software architecture as modular design and integration (recap)

Software architecture is about understanding data processing needs, and describing how system components will be organised to cooperate in the required processes.

 

"The beginning of wisdom for a computer programmer is to recognise the difference between getting a program to work and getting it right" M.A. Jackson (1975).

 

What makes a software architecture right?

Here, it means the design meets functional and non-functional requirements in a simpler and probably cheaper way than alternative designs.

 

And what is software architecture? In essence, it is modular design.

Modular design divides a system into modules or components that cooperate to perform the required processes.

Here you can assume a component is encapsulated: defined primarily by its input/output interface, by the discrete events it can process and the services it can offer.

 

To design a large and complex system – one that is effective, efficient and easy to manage – architects have to make a series of modular design decisions.

·         What is the right size and scope of a component?

·         How to avoid or minimise duplication between components?

·         How to separate or distribute components?

·         How to integrate components?

 

For 40 years, the IT industry has continually revisited modular design and integration concepts and principles.

And some modular design concepts have seeped into business architecture.

On system modularity - Gouthier and Pont (1970)

As programming became a mainstream activity, people started to write larger and more complex programs.

It soon became apparent that people needed advice on how to divide a large program, or system, into modules.

Modularity – or good modularity - was seen as the key to good design.

 

Gouthier and Pont said that well-defined segmentation of the project effort ensures system modularity.

At design time: “Each task forms a separate, distinct program module.

Each module and its inputs and outputs are well-defined… there is no confusion in the interface with other system modules.”

 

[Diagram: a system as a set of modules/tasks, each with its own inputs and outputs.]

 

At testing time: “The integrity of the module is tested independently, there are few scheduling problems in synchronizing the completion of several tasks before checkout can begin.”

 

At maintenance time: “System errors and deficiencies can be traced to specific system modules, thus limiting the scope of detailed error searching.”

 

(The idea of the module, encapsulated by its inputs and outputs, can be seen as derived from General System Theory.)

Stepwise refinement - Niklaus Wirth (1971)

Given that a system would be divided into many modules, the challenge became how to organise and manage the structure of those modules.

Naturally, people looked to the traditional hierarchy.

 

Wirth proposed: “Program construction consists of a sequence of refinement steps.

In each step a given task is broken up into a number of subtasks.” 

 

He illustrated this using the “8 Queens problem”.

I have designed software to solve this problem several times, in several ways, and now consider the solutions proposed by Wirth and (later) Dijkstra clumsy; which is to say, I endorse Jackson’s comment at the start of this paper.
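For readers who want the shape of the idea, here is a minimal sketch in Python (Wirth worked in an Algol-like notation; this code and its names are mine, not his). The top-level task, place 8 queens, is refined into the subtasks extend a partial solution by one row and test whether a square is safe.

def safe(queens, col):
    # Subtask: may a queen be placed in column `col` on the next row?
    # `queens` holds one column index per already-placed row.
    row = len(queens)
    return all(col != c and abs(col - c) != row - r
               for r, c in enumerate(queens))

def place(queens, n=8):
    # Subtask: extend the partial solution until all n rows are filled.
    if len(queens) == n:
        return queens                       # a complete solution
    for col in range(n):
        if safe(queens, col):
            solution = place(queens + [col], n)
            if solution:
                return solution
    return None                             # dead end; backtrack

print(place([]))    # one of the 92 solutions, e.g. [0, 4, 7, 5, 2, 6, 1, 3]

Each function corresponds to one refinement step; a further step might replace the list with a more efficient board representation without disturbing the structure above it.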

Declarative specification

In “Structured Programming” (1972), Hoare showed that we can declare the specification of an operation using preconditions and postconditions.
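To illustrate, here is a hypothetical operation in Python with its precondition and postcondition written as executable assertions (Hoare expressed such specifications as logical assertions, not runnable checks; the operation and its name are invented).

import math

def integer_square_root(x):
    # Precondition: the caller must supply a non-negative integer.
    assert isinstance(x, int) and x >= 0
    r = math.isqrt(x)
    # Postcondition: r is the largest integer whose square is <= x.
    assert r * r <= x < (r + 1) * (r + 1)
    return r

print(integer_square_root(10))   # 3

The declarative specification is the pair of assertions; the line between them could be replaced by any implementation that satisfies them.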

Encapsulation of complexity – Parnas

There are infinitely many ways to modularise a system. What makes a good module?

In “Criteria to Be Used in Decomposing Systems into Modules”, 1972, Parnas introduced the idea of information hiding.

“One begins with a list of difficult design decisions.

Each module is designed to hide a decision from the others. [e.g., hide:]

·         a data structure,

·         its internal links,

·         accessing procedures and

·         modifying procedures.” 

 

Parnas proposed several other criteria, but his first, encapsulation of a data structure, is surely the most universal and lasting principle.
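A minimal sketch of that principle in Python (the catalogue example and its names are invented for illustration, not taken from Parnas): the module hides its choice of data structure behind accessing and modifying procedures.

class PartsCatalogue:
    # The difficult decision hidden here: how part records are stored.
    # Callers see only the accessing and modifying procedures.

    def __init__(self):
        self._parts = {}                          # hidden data structure

    def add_part(self, number, description):      # modifying procedure
        self._parts[number] = description

    def describe(self, number):                   # accessing procedure
        return self._parts.get(number, "unknown part")

catalogue = PartsCatalogue()
catalogue.add_part("P1", "hex bolt")
print(catalogue.describe("P1"))    # hex bolt

If the dictionary is later replaced by a file or a database table, no caller of add_part or describe needs to change.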

 

There were many attempts to create a shared module library. Success was patchy.

Around 1975, Michael A Jackson reputedly said:

“A module library is the only kind of library that everybody wants to put something in, and nobody wants to take something out of”.

Structured program design

As software engineering matured in the 1970s, people sought to combine these two ideas:

·         The concept of a component that encapsulates its internal data structure and processes.

·         The hierarchical structure - so familiar from social, biological and mechanical systems.

 

Perhaps the most demonstrably successful approach was that of Michael A Jackson.

Jackson modelled the data structures to be processed by a program as hierarchical structures (in the form of regular expressions), then based the process structure on those data structures.

This might be called out-to-in design, since the input/output data structures drive the internal process design.
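A sketch of out-to-in design, assuming an invented input structure: a file is a sequence of customers, and each customer is a name followed by order lines. Following Jackson, the Python process structure below repeats that hierarchy, one loop per iterated component.

# Input structure (invented): file = customer*, customer = name, order*
def process_file(customers):
    for customer in customers:                 # file = customer*
        print("Customer:", customer["name"])   # process the name part
        total = 0
        for order in customer["orders"]:       # customer = order*
            total += order                     # process one order line
        print("  total:", total)               # per-customer summary

process_file([{"name": "Ada", "orders": [10, 20]},
              {"name": "Bob", "orders": [5]}])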

 

Jackson introduced some software design patterns to modularise a system in a rational way.

He taught people to resolve “data structure clashes” by logically separating modules that handle

·         the reading and writing of I/O data structures

·         the reading and writing of database structures (via object-based modules).

 

By 1980, a modular design methodology based on Jackson’s ideas might be described as:

1.      Define the I/O data structures

2.      Define the persistent data structure

3.      Define the events and enquiries applied to the data store

4.      Structure code into layers, such that each layer offers services to the layer above (sketched below).
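The sketch below illustrates step 4 with invented names: three layers, each offering services to the layer above and calling only the layer below. It shows the layering principle, not a reconstruction of any particular 1980 methodology.

class DataLayer:                     # owns the persistent data structure
    def __init__(self):
        self._accounts = {"A1": 100}
    def read(self, acct):
        return self._accounts[acct]
    def write(self, acct, balance):
        self._accounts[acct] = balance

class EventLayer:                    # applies events and enquiries to the store
    def __init__(self, data):
        self._data = data
    def deposit(self, acct, amount):             # an event
        self._data.write(acct, self._data.read(acct) + amount)
    def balance(self, acct):                     # an enquiry
        return self._data.read(acct)

class DialogueLayer:                 # handles the I/O data structures
    def __init__(self, events):
        self._events = events
    def run(self):
        self._events.deposit("A1", 50)
        print("Balance:", self._events.balance("A1"))   # Balance: 150

DialogueLayer(EventLayer(DataLayer())).run()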

Structured Analysis and Design Technique - D. Ross

SADT was developed and field-tested between 1969 and 1973.

Structured analysis (in this and other methods) employs one primary abstraction mechanism, the composition hierarchy.

 

1974: More composition hierarchy - Warnier/Orr diagrams

The analyst works backwards from the system’s outputs, defining each process as a hierarchy of sub-processes.

 

1974: More composition hierarchy - HIPO - IBM

Hierarchical Input Process Output organises the processes of a system in a composition hierarchy.

It documents each module as IPO + Storage (or State).

Ed Yourdon illustrated the core concepts with the diagrams below.

 

[Diagram: hierarchical structure chart]

[Diagram: IPO diagram]

 

Structured Design - Yourdon and Constantine

The main features of this approach (c. 1979) were a hierarchical structure chart and data flow diagrams.

The approach is notable for establishing two ideas about how best to encapsulate subsystems.

They should be internally cohesive and loosely coupled:

·         Cohesion - the degree to which the internal contents of a module are related

·         Coupling - the degree to which a module depends upon other modules.

 

Activities can be coupled by time, location, access to the same data, and in several other ways.

Optimal coupling reduces module interfaces and system complexity.
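A small Python contrast may help (all names invented): the first report function is tightly coupled to another module's internal data layout, while the second depends only on a narrow interface offered by an internally cohesive module.

def report_tight(orders):
    # Tight coupling: reaches into the other module's hidden layout,
    # so any change to that layout breaks this function.
    return sum(qty * price for _, qty, price in orders._lines)

class Orders:
    def __init__(self):
        self._lines = []                  # hidden representation
    def add(self, item, qty, price):
        self._lines.append((item, qty, price))
    def total(self):                      # the whole published interface
        return sum(qty * price for _, qty, price in self._lines)

def report_loose(orders):
    # Loose coupling: depends only on the interface, not the layout.
    return orders.total()

orders = Orders()
orders.add("widget", 2, 10)
print(report_tight(orders), report_loose(orders))    # 20 20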

Structured database design

Codd (1970) and others had shown we could derive the structure of a data store from analysis and “normalisation” of a system’s input and output data structures.

Normalisation was and is often used as a technique to validate data models.
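As a toy illustration (invented data, sketched in Python rather than a database language): a report that repeats the customer's name on every order line is factored into two relations, so each fact is stored exactly once.

# Each report line repeats the customer's details.
report_lines = [
    ("C1", "Ada Lovelace", "O1", 100),
    ("C1", "Ada Lovelace", "O2", 250),
]

# Normalisation factors the repetition into two relations.
customers = {c_id: name for c_id, name, _, _ in report_lines}
orders = {o_id: (c_id, amount) for c_id, _, o_id, amount in report_lines}

print(customers)   # {'C1': 'Ada Lovelace'}
print(orders)      # {'O1': ('C1', 100), 'O2': ('C1', 250)}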

 

Data modelling emerged out of reverse-engineering from database schemas and Bachman diagrams.

Peter Chen (1976) promoted entity-relationship diagrams.

Data models were immediately used to present computer-independent business models (conceptual models or domain models if you prefer).

Some data modellers used generalisation relationships in their data models, as well as association relationships.
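A sketch of those two relationship kinds using Python dataclasses (the Party/Order model is invented): generalisation appears as inheritance, association as a reference from one entity type to another.

from dataclasses import dataclass

@dataclass
class Party:                    # a generalisation of two entity types
    name: str

@dataclass
class Person(Party):            # Person specialises Party
    pass

@dataclass
class Organisation(Party):      # so does Organisation
    pass

@dataclass
class Order:                    # association: an order references a Party
    number: str
    placed_by: Party

order = Order("O1", Person("Ada"))
print(order.placed_by.name)     # Ada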

Structured analysis in Information Engineering

Around 1980, different groups brought a selection of discrete methods together into a “methodology” for the analysis and design of data processing systems.

These methods were developed for enterprise application development projects.

 

For example, Information Engineering brought together:

·         Data flow and data analysis, to analyse current systems.

·         Function analysis, to decompose business functions into elementary business processes.

·         Process dependency diagrams, to show the interdependencies of business processes.

·         Process logic analysis, to define process flows.

·         Entity analysis, to define a data model.

·         Data normalization, to confirm the correctness of the entity model.

·         Entity lifecycle analysis, to ensure all entity update processes have been identified.

 

Information Engineering also featured two very general techniques:

·         Matrices, to cross-check architectural entities, especially to map data entities against functions or processes (see the sketch after this list).

·         Cluster analysis, to scope cohesive groups of whatever architectural entities are being analysed.
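A minimal sketch of the matrix technique (hypothetical entities and processes): each cell records whether a process Creates, Reads, Updates or Deletes a data entity, and simple cross-checks expose gaps in the architecture.

# (entity, process) -> CRUD letters; a blank cell is simply absent.
matrix = {
    ("Customer", "Open account"):  "C",
    ("Customer", "Close account"): "D",
    ("Order",    "Take order"):    "C",
    ("Order",    "Ship order"):    "RU",
}

for entity in sorted({e for e, _ in matrix}):
    ops = set("".join(v for (e, _), v in matrix.items() if e == entity))
    if "C" not in ops:
        print(entity, "is never created")    # a cross-check finding
    if "D" not in ops:
        print(entity, "is never deleted")    # prints: Order is never deleted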

Structured Systems Analysis and Design (SSADM)

The UK government’s analysis and design methodology was a similar amalgam of methods.

In contrast to Information Engineering, it was more object-based, and centred on entity-event modelling.

In fact, SSADM treated the system as a Discrete Event Dynamics System (DEDS).

SSADM modelled a system from three viewpoints.

·         The data model showed the passive structure view.

·         The entity life cycles showed the object-based behaviour view (a lifecycle is sketched after this list).

·         The event process outlines (interaction diagrams) showed the process-oriented behaviour view.
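To make the entity-event idea concrete, here is an invented account lifecycle sketched in Python as a state machine: each event either advances the entity to its next state or is rejected as out of sequence, and deletion (the “D” in CRUD) is an explicit, modelled event.

# (current state, event) -> next state; anything else is invalid.
LIFECYCLE = {
    ("proposed",  "approve"): "open",
    ("open",      "suspend"): "suspended",
    ("suspended", "resume"):  "open",
    ("open",      "close"):   "deleted",    # the terminal state
}

def apply_event(state, event):
    next_state = LIFECYCLE.get((state, event))
    if next_state is None:
        raise ValueError(f"event {event!r} is invalid in state {state!r}")
    return next_state

state = "proposed"
for event in ["approve", "suspend", "resume", "close"]:
    state = apply_event(state, event)
print(state)    # deleted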

 

“It would be handy to find the odd architect from time to time with a decent understanding of entity lifecycles: particularly the "D" in CRUD at both the micro (data entity) level and the macro (portfolio) level.” Dave Cunningham

Remarks

Today, architects, from software architects to enterprise architects, use a mix of principles promoted across the decades, not least, principles established in the 1970s.

I embrace the philosophy of James Rumbaugh.

“It is important not to be dogmatic.

Every approach has its advantages and limitations.

You must use a mixture of approaches in the real world and not mistake any one of them for truth.”

 

Footnote: Creative Commons Attribution-No Derivative Works Licence 2.0

Attribution: You may copy, distribute and display this copyrighted work only if you clearly credit “Avancier Limited: http://avancier.co.uk” before the start and include this footnote at the end.

No Derivative Works: You may copy, distribute, display only complete and verbatim copies of this page, not derivative works based upon it.

For more information about the licence, see http://creativecommons.org