Structured analysis
In software engineering, structured analysis and structured design are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual procedures.
Structured analysis and design techniques are fundamental tools of systems analysis. They developed from classical systems analysis of the 1960s and 1970s.
Objectives of structured analysis
Structured analysis became popular in the 1980s and is still in use today. Structured analysis consists of interpreting the system concept into data and control terminology represented by data flow diagrams. The flow of data and control from bubble to data store to bubble can be difficult to track, and the number of bubbles can grow quite large. One approach is to first define events from the outside world that require the system to react, then assign a bubble to each event. Bubbles that need to interact are then connected until the system is defined. Bubbles are usually grouped into higher-level bubbles to decrease complexity. Data dictionaries are needed to describe the data and command flows, and a process specification is needed to capture the transaction/transformation information.
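A minimal sketch of this event-partitioning idea is shown below in Python; the names (Event bubbles, DataStore, the ordering example) are invented for illustration and are not part of any published notation:

```python
from dataclasses import dataclass, field

@dataclass
class DataStore:
    name: str

@dataclass
class Bubble:
    """A process 'bubble' on a data flow diagram."""
    name: str
    reads: list = field(default_factory=list)     # data stores read
    writes: list = field(default_factory=list)    # data stores written
    children: list = field(default_factory=list)  # lower-level bubbles

# 1. Define events from the outside world that the system must react to,
#    and assign one bubble per event.
orders = DataStore("orders")
receive_order = Bubble("receive order", writes=[orders])
ship_order = Bubble("ship order", reads=[orders])

# 2. Group interacting bubbles into a higher-level bubble to cut complexity.
order_handling = Bubble("order handling",
                        children=[receive_order, ship_order])
```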
SA and SD are displayed with structure charts, data flow diagrams and data model diagrams, of which there were many variations, including those developed by Tom DeMarco, Ken Orr, Larry Constantine, Vaughn Frick, Ed Yourdon, Steven Ward, Peter Chen, and others.
These techniques were combined in various published system development methodologies, including structured systems analysis and design method, profitable information by design, Nastec structured analysis & design, SDM/70 and the Spectrum structured system development methodology.
History
Structured analysis is part of a series of structured methods that represent a collection of analysis, design, and programming techniques developed in response to the problems facing the software world from the 1960s to the 1980s. In this timeframe most commercial programming was done in Cobol and Fortran, then C and BASIC. There was little guidance on "good" design and programming techniques, and there were no standard techniques for documenting requirements and designs. Systems were getting larger and more complex, and information system development became harder and harder. As a way to help manage large and complex software, the following structured methods emerged from the end of the 1960s:
- Structured programming circa 1967 with Edsger Dijkstra's "Go To Statement Considered Harmful"
- Stepwise design in 1971 with Niklaus Wirth
- Nassi–Shneiderman diagram in 1972
- Warnier/Orr diagram in 1974 - "Logical Construction of Programs"
- HIPO in 1974 - IBM's hierarchy plus input-process-output
- Structured design around 1975 with Larry Constantine, Ed Yourdon and Wayne Stevens.
- Jackson structured programming circa 1975, developed by Michael A. Jackson
- Structured analysis circa 1978 with Tom DeMarco, Edward Yourdon, Gane & Sarson, and McMenamin & Palmer.
- Structured analysis and design technique developed by Douglas T. Ross
- Yourdon structured method developed by Edward Yourdon.
- Structured analysis and system specification published in 1978 by Tom DeMarco.
- Structured systems analysis and design method first presented in 1983 developed by the UK Office of Government Commerce.
- Essential Systems Analysis, proposed by Stephen M. McMenamin and John F. Palmer
- IDEF0 based on SADT, developed by Douglas T. Ross in 1985.
- Hatley-Pirbhai modeling, defined in "Strategies for Real-Time System Specification" by Derek J. Hatley and Imtiaz A. Pirbhai in 1988.
- Modern Structured Analysis, developed by Edward Yourdon following Essential Systems Analysis, published in 1989.
- Information technology engineering circa 1990 with Clive Finkelstein, popularised by James Martin.
Structured analysis topics
Single abstraction mechanism
Structured analysis typically creates a hierarchy employing a single abstraction mechanism. The structured analysis method can employ IDEF, is process driven, and starts with a purpose and a viewpoint. The method identifies the overall function and iteratively divides functions into smaller functions, preserving the inputs, outputs, controls, and mechanisms necessary to optimize processes. Also known as a functional decomposition approach, it focuses on cohesion within functions and coupling between functions, leading to structured data.

The functional decomposition of the structured method describes the process without delineating system behavior and dictates system structure in the form of required functions. The method identifies inputs and outputs as related to the activities. One reason for the popularity of structured analysis is its intuitive ability to communicate high-level processes and concepts, whether at the single-system or enterprise level. How the resulting functions map onto the objects of commercially prevalent object-oriented development, however, is unclear. In contrast to IDEF, UML is interface driven, with multiple abstraction mechanisms useful in describing service-oriented architectures.
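A rough Python sketch of this single abstraction mechanism is given below, modeling an IDEF0-style activity box with its inputs, controls, outputs, and mechanisms; the Function type and the order-processing names are assumptions made for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Function:
    """An IDEF0-style activity box with its ICOM arrows."""
    name: str
    inputs: List[str] = field(default_factory=list)
    controls: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    mechanisms: List[str] = field(default_factory=list)
    subfunctions: List["Function"] = field(default_factory=list)

    def decompose(self, *subs: "Function") -> "Function":
        # Each level of decomposition should preserve the parent's
        # inputs, outputs, controls, and mechanisms.
        self.subfunctions.extend(subs)
        return self

# The overall function, iteratively divided into smaller functions.
process_order = Function("process order",
                         inputs=["order"],
                         controls=["pricing policy"],
                         outputs=["invoice"],
                         mechanisms=["order clerk"])
process_order.decompose(
    Function("validate order", inputs=["order"], outputs=["valid order"]),
    Function("bill customer", inputs=["valid order"],
             controls=["pricing policy"], outputs=["invoice"]),
)
```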
Approach
Structured analysis views a system from the perspective of the data flowing through it. The function of the system is described by processes that transform the data flows. Structured analysis takes advantage of information hiding through successive decomposition analysis. This allows attention to be focused on pertinent details and avoids confusion from looking at irrelevant details. As the level of detail increases, the breadth of information is reduced. The result of structured analysis is a set of related graphical diagrams, process descriptions, and data definitions. They describe the transformations that need to take place and the data required to meet a system's functional requirements.

[Figure: The structured analysis approach develops perspectives on both process objects and data objects.]
DeMarco's approach consists of the following objects:
- Context diagram
- Data flow diagram
- Process specifications
- Data dictionary
Context diagram
Context diagrams represent the actors outside a system that could interact with that system. Such a diagram is the highest-level view of a system, similar to a block diagram, showing a (possibly software-based) system as a whole together with its inputs and outputs from/to external factors. According to Kossiakoff, this type of diagram usually "pictures the system at the center, with no details of its interior structure, surrounded by all its interacting systems, environment and activities. The objective of a system context diagram is to focus attention on external factors and events that should be considered in developing a complete set of system requirements and constraints". System context diagrams are related to data flow diagrams, and show the interactions between a system and the other actors the system is designed to face. Context diagrams can be helpful in understanding the context in which the system will operate.
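Since a context diagram is essentially a tiny graph, it can be captured in a few lines of code: the system at the center with no interior detail, the external actors around it, and the flows crossing the boundary. The following Python sketch uses an invented payroll example:

```python
# The system at the center, with no details of its interior structure.
system = "payroll system"

external_actors = ["employee", "bank", "tax authority"]

# (source, flow, destination) triples crossing the system boundary.
flows = [
    ("employee",       "timesheet",     "payroll system"),
    ("payroll system", "payment order", "bank"),
    ("payroll system", "tax report",    "tax authority"),
]

for src, flow, dst in flows:
    print(f"{src} --[{flow}]--> {dst}")
```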
Data dictionary
A data dictionary or database dictionary is a file that defines the basic organization of a database. A database dictionary contains a list of all files in the database, the number of records in each file, and the names and types of each data field. Most database management systems keep the data dictionary hidden from users to prevent them from accidentally destroying its contents. Data dictionaries do not contain any actual data from the database, only bookkeeping information for managing it. Without a data dictionary, however, a database management system cannot access data from the database.

Database users and application developers can benefit from an authoritative data dictionary document that catalogs the organization, contents, and conventions of one or more databases. This typically includes the names and descriptions of the various tables and fields in each database, plus additional details, such as the type and length of each data element. There is no universal standard for the level of detail in such a document, but it is primarily a distillation of metadata about database structure, not the data itself. A data dictionary document may also include further information describing how data elements are encoded. One advantage of well-designed data dictionary documentation is that it helps establish consistency throughout a complex database, or across a large collection of federated databases.
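As a rough illustration, such a data dictionary document can be reduced to a catalog of per-field metadata. The Python sketch below (the file names, field names, and types are invented) records only the kind of bookkeeping information described above, never the data itself:

```python
from dataclasses import dataclass

@dataclass
class FieldEntry:
    """Metadata about one data field; never the field's actual values."""
    name: str
    type: str
    length: int
    description: str
    encoding: str = ""  # e.g. how coded values map to meanings

# Data dictionary: the files in the database and the fields of each file.
data_dictionary = {
    "customer": [
        FieldEntry("customer_id", "integer", 10, "unique customer key"),
        FieldEntry("status", "char", 1, "account status",
                   encoding="A=active, C=closed"),
    ],
    "order": [
        FieldEntry("order_id", "integer", 10, "unique order key"),
        FieldEntry("customer_id", "integer", 10,
                   "foreign key into customer"),
    ],
}

for file, fields in data_dictionary.items():
    print(file, "->", [f.name for f in fields])
```

Keeping the entries consistent across files is exactly the kind of discipline a well-designed dictionary document enforces across a complex or federated database.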