Hardware/software design requirements planning: Part 4 – Computer software approaches

It is not possible for a functioning system to exist that is entirely computer software because software requires a machine medium within which to function. Systems that include software will always include hardware, a computing instrument as a minimum, and most often will involve people in some way.

Software is to the machine as our thoughts, ideas, and reasoning are to the gray matter making up our mind. While some people firmly believe in out-of-body experiences, few would accept a similar notion for software.

A particular business entity may be responsible for creating only the software element of a system and, to them, what they are developing could be construed as a system, but their product can never be an operating reality by itself.

This is part of the difficulty in the development of software; it has no physical reality. It is no wonder then that we might turn to a graphical and symbolic expression as a means to capture its essence.

At the beginning, we face the same problem in software as in hardware. We tend to understand our problems first in the broadest sense. We need some way to capture our thoughts about what the software must be capable of accomplishing and to retain that information while we seek to expand upon the growing knowledge base.

We have developed many techniques to accomplish this end over the period of 50–60 years during which software has been a recognized system component.

The earliest software analytical tool was flow charting, which lays out a stream of processing steps similar to a functional flow diagram (commonly in a vertical orientation rather than horizontal, probably because of the relative ease of printing them on line printers), where the blocks are very specialized functions called computer processes.

Few analysts apply flow diagramming today, having surrendered to data flow diagramming (DFD) used in modern structured analysis, the Hatley-Pirbhai extension of this technique, object-oriented analysis, or the Unified Modeling Language (UML).

Alternative techniques have been developed that focus instead on the data that the computer processes. Reasonable adherents of both the process-oriented and data-oriented schools of software analysis would today accept that both views are required, and some have made efforts to bridge this gap.

All software analysis tools (and hardware-oriented ones as well) involve some kind of graphical symbols (bubbles or boxes) representing data or process entities connected by lines, generally directed ones.

Some of these processes begin with a context diagram formed by a bubble representing the complete software entity connected to a ring of blocks that correspond to external interfaces that provide or receive data.

This master bubble corresponds to the need, or ultimate function, in functional analysis, and its allocation to the thing called a system.
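A context diagram of this kind can be captured as a simple data structure. The following is a minimal sketch, not drawn from the text; all names (the `ContextDiagram` class, the "Flight Planner" system, the "Pilot" and "Display" terminators) are hypothetical illustrations of the one-bubble-plus-external-interfaces pattern described above.

```python
# Hypothetical sketch: a context diagram as data. The single central
# "bubble" is the complete software entity; each terminator is an
# external interface that provides data to it or receives data from it.
from dataclasses import dataclass, field

@dataclass
class ContextDiagram:
    system: str                                    # the one central bubble
    inflows: dict = field(default_factory=dict)    # terminator -> data provided
    outflows: dict = field(default_factory=dict)   # terminator -> data received

# Illustrative names only.
ctx = ContextDiagram(system="Flight Planner")
ctx.inflows["Pilot"] = "route request"
ctx.outflows["Display"] = "computed route"
```

The point of the structure is that the software entity appears exactly once, with every external interface accounted for as a source or sink of data before any internal decomposition begins.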

The most traditional technique was developed principally by Yourdon, DeMarco, and Constantine. It involves expansion of the context diagram bubble into lower-tier processing bubbles that represent subprocesses just as in functional analysis.

These bubbles are connected by lines indicating data that must pass from one to the other. Store symbols are used to indicate a need to temporarily store a data element for subsequent use. These stores are also connected to bubbles by lines to show source and destination of the data.

Since the directed lines represent a flow of data between computer processing entities (bubbles), the central diagram in this technique is often referred to as a data flow diagram.
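The essential elements of such a diagram, processes, stores, and directed data flows between them, can be expressed compactly in code. This is a minimal sketch under assumed names (the `Flow` class, the numbered process labels, and the "D1 Route Log" store are all hypothetical), showing how each flow records a source, a destination, and the data element carried.

```python
# Hypothetical sketch: a data flow diagram as nodes (processes and
# stores) connected by directed, labeled data flows.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    source: str    # process or store the data leaves
    dest: str      # process or store the data enters
    data: str      # name of the data element carried

# Illustrative decomposition of a context bubble into subprocesses.
processes = {"1 Validate Input", "2 Compute Route"}
stores = {"D1 Route Log"}    # temporary storage for subsequent use

flows = [
    Flow("1 Validate Input", "2 Compute Route", "validated request"),
    Flow("2 Compute Route", "D1 Route Log", "computed route"),   # store write
    Flow("D1 Route Log", "2 Compute Route", "prior route"),      # store read
]

# Simple consistency check: every flow connects known nodes.
nodes = processes | stores
assert all(f.source in nodes and f.dest in nodes for f in flows)
```

Holding the diagram as data like this makes the kind of consistency checking shown in the last line routine: every line on the drawing must begin and end at a defined process or store.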

In all software analysis techniques, there is a connection between the symbols used in the diagrammatic portrayal and text information that characterizes the requirements for the illustrated processes and data needs.