Which system analysis tool shows the flow of data from input → processing → output?

Functional decomposition refers broadly to the process of resolving a functional relationship into its constituent parts in such a way that the original function can be reconstructed from those parts.

From: Engineering Systems Acquisition and Support, 2015

Systems design

In Engineering Systems Acquisition and Support, 2015

4.2.2 Functional decomposition

Functional decomposition refers broadly to the process of resolving a functional relationship into its constituent parts in such a way that the original function can be reconstructed from those parts. In general, this process of decomposition is undertaken either to gain insight into the identity of the constituent components or to obtain a compressed representation of the global function, a task that is feasible only when the constituent processes possess a certain level of modularity. In the functional decomposition of engineering systems, a method for analysing engineered systems, the basic idea is to divide a system in such a way that each block of the block diagram can be described without using the words ‘and’ or ‘or’. This exercise forces each part of the system to be assigned a pure function. When a system is composed of pure functions, those functions can be reused or replaced. A usual side-effect of such composition is that the interfaces between blocks become simple and generic, and because the interfaces are simple, it is easier to replace a pure function with a related, similar function. As we have previously stated, a process can be represented as a network of these symbols; such a symbol network of a decomposed process is called a data flow diagram, or DFD.
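The idea that each block holds a single pure function with a simple, generic interface can be sketched in code. The following Python fragment is purely illustrative (the function names and the comma-separated data format are invented, not from the chapter): each block does one thing, and the blocks compose into an input → processing → output flow.

```python
# Minimal sketch: a process decomposed into pure functions, each block
# assigned one job, composed into an input -> processing -> output flow.
# All names are illustrative assumptions.

def parse(raw):                  # pure: input block
    return [int(x) for x in raw.split(",")]

def scale(values, factor=2):     # pure: processing block
    return [v * factor for v in values]

def render(values):              # pure: output block
    return ",".join(str(v) for v in values)

def pipeline(raw):
    # Because the interfaces are simple (lists in, lists out), any block
    # can be swapped for a related pure function without side effects.
    return render(scale(parse(raw)))

print(pipeline("1,2,3"))  # -> "2,4,6"
```

Swapping `scale` for any other list-to-list function leaves the rest of the pipeline untouched, which is exactly the replaceability the text describes.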

A functional flow block diagram (FFBD) is a multi-tier, time-sequenced, step-by-step flow diagram of a system’s functional flow. FFBD notation, which was developed in the 1950s, is widely used in classical systems engineering. FFBDs are one of the classic business-process-modelling methodologies, along with flow charts, data flow diagrams, control flow diagrams, Gantt charts, PERT diagrams and IDEF. FFBDs can be developed in a series of levels: they show the same tasks identified through functional decomposition and display them in terms of logical, sequential relationships. Each block in the first-level diagram can then be expanded into a series of functions, as shown in the second-level diagram for ‘perform mission operations’. Note that the diagram shows both the input (transfer to operational orbit) and the output (transfer to space transportation system orbit), thereby initiating the interface identification and control process. Each block in the second-level diagram can be progressively developed into a series of functions, as shown in the third-level diagram in Figure 4.2.1-1.
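The levelled expansion of an FFBD can be mimicked with a nested data structure. This is a minimal sketch, assuming Python dictionaries; the block numbering and the `expand` helper are illustrative, though the function names follow the mission example in the text.

```python
# Sketch: a multi-tier FFBD as a mapping from a block to its
# next-level, time-sequenced functions. Numbering is illustrative.

ffbd = {
    "1.0 Perform mission operations": [
        "1.1 Transfer to operational orbit",                  # input interface
        "1.2 Perform mission",
        "1.3 Transfer to space transportation system orbit",  # output interface
    ],
}

def expand(diagram, block):
    """Return the next-level, time-sequenced functions of a block."""
    return diagram.get(block, [])

for step in expand(ffbd, "1.0 Perform mission operations"):
    print(step)
```

Each second-level entry could itself become a key with its own list, giving the third-level expansion the text mentions.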

The FFBD also incorporates alternate and contingency operations, which improves the probability of mission success. The flow diagram provides an understanding of the total operation of the system, serves as a basis for the development of operational and contingency procedures, and pinpoints areas where changes in operational procedures could simplify the overall system operation. In certain cases, alternate FFBDs may be used to represent various means of satisfying a particular function until data are acquired that permit selection among the alternatives.


URL: https://www.sciencedirect.com/science/article/pii/B9780857092120000043

Data Models Across the End-State Architecture

W.H. Inmon, ... Mary Levins, in Data Architecture (Second Edition), 2019

Functional Decomposition and Data Flow Diagrams

In the world of applications, there are two basic constructs: the functional decomposition and the data flow diagram.

Fig. 14.1.2 depicts these constructs.


Fig. 14.1.2. The application environment.

The functional decomposition is a depiction of the functions that will be achieved by a system, laid out in a hierarchical fashion. At the top of the decomposition is the general function of what is to be accomplished by the system. At the second level are the main functions to be accomplished. Each second-level function is then broken down into its subfunctions, until the point of basic functionality is reached.
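A hierarchy of this kind is naturally a tree. The sketch below is a hypothetical example (the order-processing functions are invented, not from the text): the general function sits at the top, and the basic functionality appears at the leaves.

```python
# Illustrative sketch: a functional decomposition as a nested hierarchy.
# Function names are invented for the example.

decomposition = {
    "Process orders": {                  # top: general function
        "Capture order": {               # second level: main function
            "Validate customer": {},     # basic functionality (leaf)
            "Record line items": {},
        },
        "Fulfil order": {
            "Pick stock": {},
            "Ship package": {},
        },
    },
}

def leaf_functions(tree):
    """Collect the basic (leaf) functions of the hierarchy."""
    leaves = []
    for name, sub in tree.items():
        if not sub:
            leaves.append(name)
        else:
            leaves.extend(leaf_functions(sub))
    return leaves

print(leaf_functions(decomposition))
```

Walking the leaves like this is one way to check for overlap or omissions, the very uses of the decomposition the next paragraph describes.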

The functional decomposition is useful for seeing what the different activities of a system will be. It is useful for organizing the functions, identifying overlap, and checking whether anything has been left out. When you are setting out on a long trip, it is useful to look at a map of the United States to see which states you will visit and the order in which you will travel through them.

After the functional decomposition is completed, the next step is to create data flow diagrams for each of the functions. The data flow diagram starts with the input to the module and shows how the input data will be processed to achieve the output data. The three major components of a data flow diagram are an identification of the input, a description of the logic that will occur in the module, and a description of the output.
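The three components named here, the input, the processing logic, and the output, can be sketched as a small structure. This is a minimal illustration, assuming Python; the order-total module is invented for the example.

```python
from dataclasses import dataclass
from typing import Callable

# Sketch of the three major DFD components named in the text:
# an identification of the input, the logic of the module, and
# a description of the output. All names are illustrative.

@dataclass
class DataFlowModule:
    input_name: str
    logic: Callable      # how the input data are processed
    output_name: str

    def run(self, data):
        return self.logic(data)

total_orders = DataFlowModule(
    input_name="order amounts",
    logic=lambda amounts: sum(amounts),   # processing step
    output_name="order total",
)

print(total_orders.run([10, 25, 7]))  # -> 42
```

One such module would be written for each function produced by the decomposition, which is the per-function step the paragraph describes.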

If the functional decomposition is like a map of the United States, the data flow diagram is like a detailed map of a state. The data flow diagram tells you how to get across Texas. You start at El Paso, you head east, past McKittrick Canyon, go to Van Horn and Sierra Blanca, go through Pecos, then on to Midland and Odessa, and so forth. The map of Texas shows details that the map of the United States cannot show. By the same token, the map of Texas does not show you how to get from Los Angeles to San Jose or from Chicago to Naperville.

The nature of functional decomposition and data flow diagrams is such that process and data are intimately intertwined. Both process and data are needed in order to build a functional decomposition and data flow diagrams.

Fig. 14.1.3 shows the tight interrelationship of data and process in the functional decomposition.


Fig. 14.1.3. Process and data are in lock step.

Functional decompositions and data flow diagrams are used to define and build applications. As a rule, these constructs can be very complex. One of the tools used to manage that complexity is the definition of the scope of development. At the very beginning, there is an exercise in which the scope of the application must be defined. The scope definition is necessary in order to keep the size of the development reasonable. If a designer is not careful, the scope will become so large that the system will never be built. Therefore, it is necessary to rigorously define the scope before the development effort ever begins.

The result of the definition of scope is that, over time, the organization ends up with multiple applications, each of which has its own functional decomposition and data flow diagrams.

Fig. 14.1.4 shows that over time, each application has its own set of definitions.


Fig. 14.1.4. Each application has its own functional decomposition and set of data flow diagrams.

While the development process that has been described is normal for almost every shop, there is a problem. Over time, a serious amount of overlap between different applications starts to emerge. Because the scope of each application is defined and enforced rigorously, the same or similar functionality starts to appear across multiple applications. When this happens, redundant data start to appear: the same or similar data element appears in multiple applications.


URL: https://www.sciencedirect.com/science/article/pii/B9780128169162000395

Examples of Design Competitions

Philip Kosky, ... George Wise, in Exploring Engineering (Fifth Edition), 2021

31.8.2 Design Milestone 2: Generation of Alternative Concepts

The functional decomposition presented in Fig. 31.16 shows the sequence in which the individual functional requirements are performed. Control signals to and from the Arduino microcontroller are shown as dashed lines.


Figure 31.16. Sequential functional decomposition of the Automatic Air Freshener.

Solutions for each functional requirement were brainstormed and then documented in the form of a classification scheme, as shown in Fig. 31.17. An extra row was added to the classification scheme to display the various motor options. This was to ensure that each motor type is fully considered when selecting mechanisms to depress the nozzle of the air freshener can and to rotate it.


Figure 31.17. Classification scheme for the Automatic Air Freshener.

Finally, compatible solutions appearing in the classification scheme were combined and sketched to form the three alternative concepts shown in Fig. 31.18.


Figure 31.18. Concept sketches of the three alternative designs.


URL: https://www.sciencedirect.com/science/article/pii/B9780128150733000314

Design Step 2

Philip Kosky, ... George Wise, in Exploring Engineering (Fifth Edition), 2021

24.7 Design Milestone #2: Generation of Alternative Concepts

This milestone assumes the system to be designed is sufficiently complex (i.e., at least two subfunctions) to warrant the use of functional decomposition. The following assignment is designed for student design competitions.

Assignment

1.

For the functional decomposition of your design project, brainstorm at least five feasible alternatives for each subfunction and assemble the results in a classification scheme. Include competition strategy as one of the items to be brainstormed in the classification scheme. Search the boundaries of the rules for unusual ideas that could potentially dominate the competition.

2.

Form three promising design concepts by combining compatible subfunction alternatives from your classification scheme. Redraw your concept sketches to enhance clarity and neatness. The quality of the concept drawing, or lack of it, can do much to sway opinions when it comes time to judge the concepts.

3.

Firm up your three design concepts by sketching them up in the form of concept drawings. Functionality (i.e., how it works) should be clearly indicated in the drawings using labeling and text.

Grading Criteria

Technical Communication

Ideas are clearly presented.

Final concept drawings are neatly rendered.

Technical Content

All concepts are feasible, legal, and fundamentally different.

Concepts are presented in sufficient detail.

Requested number of concepts is generated.


URL: https://www.sciencedirect.com/science/article/pii/B9780128150733000247

Functional Decomposition and Mereology in Engineering

Pieter Vermaas, Pawel Garbacz, in Philosophy of Technology and Engineering Sciences, 2009

5.1 The reconciled functional basis

A more recent research project that originates with the foundational work of Pahl and Beitz is the Reconciled Functional Basis project. This Reconciled Functional Basis (RFB, from now on) is the result of an effort towards establishing a standard taxonomy of basic technical functions (see, e.g., [Hirtz et al., 2002]) by reconciling two previous taxonomies: the NIST taxonomy (cf. [Szykman et al., 1999]) and the older versions of the Functional Basis (developed in [Little et al., 1997; Stone et al., 1998; McAdams et al., 1999; Stone et al., 1999; Stone and Wood, 2000]). Each of these taxonomies is the result of empirical generalisation of engineering specifications.

RFB analyses the notion of a functional decomposition against the background of its taxonomy of functions, which is based on a taxonomy of flows. RFB modifies the meaning of the term “flow”: here “flow” does not mean “a process of flowing” (e.g., removing debris) but “a thing that flows” (e.g., debris).34 More precisely, in some papers, e.g., in [Stone and Wood, 2000], this term is used in both meanings, but the RFB taxonomy of flows is based on the latter sense. This shift in meaning is, to be sure, justifiable, since it is hard to see how one might differentiate between a process of flowing and a function given the conception of Pahl and Beitz. The whole RFB taxonomy of flows is depicted in Table 2.

Table 2. The RFB taxonomy of flows [Hirtz et al., 2002]

(Primary flows are flush left; secondary flows are indented; tertiary flows follow a colon.)

Material
    Human
    Gas
    Liquid
    Solid: Object, Particulate, Composite
    Plasma
    Mixture: Gas-gas, Liquid-liquid, Solid-solid, Solid-liquid, Liquid-gas, Solid-gas, Solid-liquid-gas, Colloidal

Signal
    Status: Auditory, Olfactory, Tactile, Taste, Visual
    Control: Analog, Discrete

Energy
    Human
    Acoustic
    Biological
    Chemical
    Electrical
    Electromagnetic: Optical, Solar
    Hydraulic
    Magnetic
    Mechanical: Rotational, Translational
    Pneumatic
    Radioactive/Nuclear
    Thermal

RFB also contains a three-layer classification of what are called basic functions. Each type of function is accompanied by a definition (in natural language), example, and a set of synonymous names. The basic functions are divided in a first layer into eight primary types. Then, some primary basic functions are divided into types of secondary basic functions, and some of these secondary basic functions are in turn divided into types of tertiary basic functions. The whole taxonomy is depicted in Table 3.

Table 3. The RFB taxonomy of functions [Hirtz et al., 2002]

(Primary functions are flush left; secondary functions are indented; tertiary functions follow a colon.)

Branch
    Separate: Divide, Extract, Remove
    Distribute

Channel
    Import
    Export
    Transfer: Transport, Transmit
    Guide: Translate, Rotate, Allow degree(s) of freedom

Connect
    Couple: Join, Link
    Mix

Control magnitude
    Actuate
    Regulate: Increase, Decrease
    Change: Increment, Decrement, Shape, Condition
    Stop: Prevent, Inhibit

Convert
    Convert

Provision
    Store: Contain, Collect
    Supply

Signal
    Sense: Detect, Measure
    Indicate: Track, Display
    Process

Support
    Stabilize
    Secure
    Position

Of course, the RFB taxonomy of basic functions is not a model of functional decomposition. For instance, the fact that Divide and Extract are subtypes of Separate does not mean that the former are subfunctions of the latter. Moreover, the basic functions are not functions in the sense that the overall functions are, since the overall functions are (complex) modifications of specific input flows into specific output flows, whereas the basic functions are modifications generalised over the flows to which they are applied. Hence, in the RFB the basic subfunctions are to be identified with basic functions operating on specific primary, secondary and tertiary flows.

In RFB a functional decomposition is a conceptual structure that consists of an overall function that is decomposed, its subfunctions into which the overall function is decomposed, the flows which are modified by the subfunctions, and a net that links these modifications in an ordered way.

The overall function to be decomposed is defined in terms of the flows it modifies, which are taken from the RFB taxonomy of flows. Each of its subfunctions is defined both in terms of the flows the respective subfunction modifies and in terms of its type of modification, which is taken from RFB taxonomy of basic functions. For instance, the overall function of a screwdriver, which is to tighten/loose screws, is defined by means of the following ten input flows and nine output flows (see also Figure 3).


Figure 3. The RFB modelling of the overall function of a screwdriver [Stone and Wood, 2000, Fig. 2]

input flows for the function tighten/loose screws:

energy flows: electricity, human force, relative rotation and weight;

material flows: hand, bit and screw;

signal flows: direction, on/off signal and manual use signal;

output flows for the function tighten/loose screws:

energy flows: torque, human force, heat, noise and weight;

material flows: hand, bit and screw;

signal flows: looseness/tightness.

On the other hand, one of the subfunctions in the functional decomposition of this overall function tighten/loose screws is called convert electricity to torque (see Figure 4), which means that it is a function of the convert-type (cf. Table 3), and modifies one input flow to three output flows:


Figure 4. The RFB functional decomposition of a screwdriver [Stone and Wood, 2000, Fig. 4]

input flows for the subfunction convert electricity to torque:

energy flows: electricity;

material flows: none;

signal flows: none.

output flows for the subfunction convert electricity to torque:

energy flows: heat, noise and torque;

material flows: none;

signal flows: none.

The task of a designer who performs a functional decomposition is to link any input flow of the overall function to be decomposed with some of the output flows. Any such link that starts with an input flow of the overall function and ends with one of its output flows is called a function chain. In RFB one distinguishes between two types of function chains: sequential and parallel. A function chain is sequential if it is ordered with respect to time, i.e., if any temporal permutation of its subfunction may in principle result in failing to perform the overall function. A parallel function chain is a fusion of sequential function chains that share one or more flows.
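Under this reading, a subfunction pairs a basic function type from Table 3 with specific flows from Table 2, and in a sequential chain each subfunction consumes a flow that the previous one produces. The sketch below is an assumption about how one might encode this: the convert subfunction follows the screwdriver example in the text, while the actuate subfunction and the `is_sequential` check are invented for illustration.

```python
# Sketch: RFB-style subfunctions as basic function types applied to
# specific flows. The "Convert" entry mirrors the screwdriver example;
# the "Actuate" entry and the chain check are illustrative assumptions.

convert = {
    "name": "convert electricity to torque",
    "type": "Convert",                       # from the RFB taxonomy
    "inputs": {"energy": ["electricity"]},
    "outputs": {"energy": ["heat", "noise", "torque"]},
}

actuate = {
    "name": "actuate rotation",              # invented subfunction
    "type": "Actuate",
    "inputs": {"energy": ["torque"]},
    "outputs": {"energy": ["relative rotation"]},
}

def is_sequential(chain):
    # In a sequential chain order matters: each subfunction must
    # consume at least one flow the previous subfunction produces.
    for prev, nxt in zip(chain, chain[1:]):
        produced = {f for flows in prev["outputs"].values() for f in flows}
        consumed = {f for flows in nxt["inputs"].values() for f in flows}
        if not produced & consumed:
            return False
    return True

print(is_sequential([convert, actuate]))  # -> True
```

Reversing the chain breaks the flow linkage, which is the sense in which a temporal permutation may cause the overall function to fail.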

In RFB one assumes that each subfunction of an overall function to be performed by a technical system S is realised by a component of S; however, the relation between subfunctions and components is many-to-many, i.e., one sub-function may be realised by several components and one component may realise more than one subfunction.

The notion of functional decomposition developed within RFB plays an important role in what is called the concept generator, which is a web-based computational tool for enhancing conceptual design.35 The concept generator is intended to present a designer with a number of different solutions to his or her design problem on the basis of previously developed (and stored) high-quality designs. One of the inputs to be provided to this tool is a function chain for a product to be newly developed. The output solutions describe the design solution in terms of the technical systems whose descriptions are loaded into the knowledge base of the concept generator. The functional decomposition links the overall function established by the generator with the conceptual components that compose a general description of the product, which is construed here as a solution of the initial design problem [Strawbridge et al., 2002; Bryant et al., 2004].

The RFB proposal adds precision and a wealth of empirical detail to the methodology of Pahl and Beitz. Its explicit aim to contribute to the standardisation of conceptual models in engineering makes it even more valuable for a specifically mereological analysis of functional modelling.

In our terminology, the overall function of an RFB functional decomposition Decomp(Φ, Org(ϕ1, ϕ2, …, ϕn)) may be any function Φ but the subfunctions ϕ1, ϕ2, …, ϕn are to be identified with RFB basic functions from Table 3 operating on specific RFB primary, secondary and tertiary flows from Table 2. The net of flows between the subfunctions ϕ1, ϕ2, …, ϕn defines their organisation Org(ϕ1, ϕ2, …, ϕn).

In RFB the overall functions Φ and the subfunctions ϕ1, ϕ2, …, ϕn in functional decompositions Decomp(Φ, Org(ϕ1, ϕ2, …, ϕn)) may describe systems S and s1, s2, …, sn that are endurants and perdurants, but, as in the methodology of Pahl and Beitz, the additional assumptions are again made that functions comply with physical conservation laws for flows, and that the subfunctions ϕ1, ϕ2, …, ϕn are to be taken from a set of basic functions. A further assumption seems to be that the functional orderings ϕi → ϕj making up the organisations Org(ϕ1, ϕ2, …, ϕn) of the subfunctions are always asymmetric: flows between two subfunctions in functional decompositions such as the one depicted in Figure 4 always go in one direction. The benefit of philosophical research on functional descriptions to engineering can again lie in making these assumptions explicit and in challenging them. The requirement that functions always have to be decomposable into RFB basic functions operating on specific RFB flows again introduces a tension between the goals of functional decomposition to facilitate designing and to facilitate communication. Consider, for instance, the basic function convert acoustic energy into electrical energy. The identification of this basic function in a decomposition of an overall function may be useful to a shared understanding of this overall function but will not help designers to easily find a corresponding design solution. A requirement that subfunctions are only ordered in one direction may in turn be helpful in engineering for managing the flow of materials, energies and signals, but may also be revealed to be an unnecessary constraint on the decomposition of functions.


URL: https://www.sciencedirect.com/science/article/pii/B9780444516671500148

27th European Symposium on Computer Aided Process Engineering

Xinsheng Hu, ... Gürkan Sin, in Computer Aided Chemical Engineering, 2017

2.1 MFM modeling

MFM is used to model the chemical process by functional decomposition using symbols. The symbols, as seen in Figure 2, express the functional action types and logical action sequences in the chemical process and enable modeling at different abstraction levels. According to the functional decomposition based on first engineering principles and first operational principles, the plant can be divided into functional nodes. Then, by following the syntax of MFM, which can be learned from the studies (Lind, 2013; Wu et al., 2014), the simulation of the plant can be performed using the MFMSuite platform.


Figure 2. The basic MFM symbols


URL: https://www.sciencedirect.com/science/article/pii/B9780444639653501008

Preview of Models

In Human Performance Models for Computer-Aided Engineering, 1990

FRAMEWORK

In selecting and organizing the models reviewed, the authors had in mind the general framework and functional decomposition shown in Figure 2-1. Even though the chapters in Parts II and III and the individual models discussed in these do not follow this framework rigidly, it has been useful for organizing the discussion of this complex field.


FIGURE 2-1. Framework for models of vision and cognition.

The framework of Figure 2-1 is aimed toward a full simulation model of the visual system. This system is modeled as a serial set of processes starting with early vision. Eye fixation, although shown in the figure, is only treated statistically, and the details of the eye movement process are covered superficially. The inputs to the early vision models are direct physical measures of the visual scene, and thus these models and those that build from them are image driven. The framework assumes that the outputs of models at one stage provide the inputs needed by those at the next stage in progression from early vision to form perception, three-dimensional structure through motion, state-variable estimation, object recognition, mental manipulation of information and finally to combination of views. The later stages of vision are recognized as being cognitive, and are shown as being within the envelope of the cognitive system. Later visual and many cognitive processes, especially those that determine what will be attended to, also influence earlier visual processes, although these effects are not shown in the framework. This linear framework has proven to be a useful way of organizing the discussion of vision even though it is clearly an oversimplification.
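The serial assumption, in which the outputs of one stage provide the inputs to the next, can be sketched as function composition. This is a hypothetical Python illustration; the stage names echo the framework, but the toy scene data and per-stage logic are invented.

```python
from functools import reduce

# Sketch of the serial framework: each modelling stage consumes the
# output of the previous stage. The stages and scene data below are
# illustrative assumptions, not the models reviewed in the book.

def compose(*stages):
    """Chain stages so each one feeds the next, in order."""
    return lambda scene: reduce(lambda data, stage: stage(data), stages, scene)

early_vision       = lambda scene: {"edges": scene["contrast"] > 0.5}   # image driven
form_perception    = lambda data: {"form": "disk" if data["edges"] else None}
object_recognition = lambda data: data["form"] or "unknown"             # cognitive stage

visual_system = compose(early_vision, form_perception, object_recognition)
print(visual_system({"contrast": 0.8}))  # -> "disk"
```

The strictly one-way composition also makes the text's caveat concrete: feedback from later cognitive stages to earlier vision cannot be expressed in such a linear chain.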

For cognition we lack a well-developed architecture to structure simply the flow of information and interaction among the functional components of cognitive processing. Rather, we have found it useful to differentiate between models of mechanisms of the human cognitive architecture and models of rational action. The section on cognitive models begins with a review of models for the architecture of human information processing. We then discuss several component mechanisms of the cognitive architecture, namely resource allocation and attention, working memory, and learning. The rest of the section focuses on models of rational action, first addressing models that are based on scenarios consisting of the actions the pilot is required to perform to execute a specified mission. Three other types of rational action models are treated in this section: errors, decisions, and representation of knowledge. The later stages of vision that are included within cognition belong mostly within the rational action grouping. This collection of topics does not provide complete coverage of all the cognitive functions involved in helicopter flight or even of those just dealing with the visual tasks of flight, but it is a large and important subset of those functions.


URL: https://www.sciencedirect.com/science/article/pii/B9780122365300500078

Integrated System Modeling

Peter J. Ashenden, ... Darrell A. Teegarden, in The System Designer's Guide to VHDL-AMS, 2003

Architectural Trade-Offs

The most important decisions in a design are often the choices that are made about the structure and functional decomposition of the system. These decisions can determine whether or not the system requirements can even be met. They can also affect the future extensibility of the system, determining the adaptability of the design to future requirements and trade-offs. Finally, architectural decisions can have a significant effect on the cost structure of the system.

The architecture of the system is best determined early in the design, through the use of the system architecture model. Architectural experiments can be performed and analyzed quickly using high-level models. The trade-offs can be documented in reports using graphs from the simulation analyses. There is value in documenting those architectural attempts that are not selected, since they may be useful in the future when circumstances and requirements change. They can also help avoid repeated exploration of dead-end paths.

VHDL-AMS provides mechanisms to help make architectural exploration easier. The use of structural models (net-lists of more basic building blocks) allows for flexible assembly of higher-level models from basic building blocks. The use of multiple architectures for a given entity also makes it easier to build, configure, use and document the system architecture model with varying functionality.


URL: https://www.sciencedirect.com/science/article/pii/B9781558607491500256

CoWare—A Design Environment for Heterogeneous Hardware/Software Systems

D. VERKEST, ... H. DE MAN, in Readings in Hardware/Software Co-Design, 2002

3.1 Specification of the Pager

Each block in Figure 6 corresponds to a process implementing a specific function of the pager. This functional decomposition determines the initial partitioning. The finest granularity is determined by the functions in the system. It does not make sense to have a finer-grain partitioning: the communication inside each process is very intense, while the communication with external processes is limited. Hence, the partitioning of the system is determined by the functional characteristics of the system.

The arrows in between the processes indicate communication via a Remote Procedure Call (RPC) mechanism. Figure 7 shows the RPC communication in detail for part of the pager design. The blocks in the figure correspond to the processes from Figure 6.a. The small rectangles on the perimeter of the processes are the ports. The shaded ports are master ports, the others are slave ports.


Figure 7. RPC as a basic communication mechanism between processes.

The Sample Clock Generator process contains a time-loop thread. This thread runs continuously. It performs an RPC over its input port ip to the Tracking & Acquisition process to obtain a new value for delta. The time-loop thread of the process adds the delta parameter to some internal variable until a threshold is exceeded. In this way it implements a sawtooth function. When the sawtooth exceeds the threshold an RPC call is issued to the A/D converter process.

The slave thread clock in the A/D converter process samples the analogue input, and sends the result to the Down-conversion process via an RPC call. This in turn will activate the Decimation process via an RPC call, etc.

The Correlator & Noise Estimator process contains a slave thread associated with port ip to compute the correlation values. This slave thread is activated when the Phase Correction process writes data to the Correlator & Noise Estimator process (i.e. when the Phase Correction process performs an RPC to the ip port of the Correlator & Noise Estimator process). The slave thread reads in the data and then performs an RPC to the User Interface process to obtain a new value for the parameter par it requires for computing the correlation values. Finally, the new correlation results are sent to the Tracking & Acquisition process via an RPC call on its op port.

The slave thread in the Tracking & Acquisition process updates the delta value for the sawtooth function implemented by the Sample Clock Generator process. It puts the updated value in the context, where it is retrieved by the slave thread op, which serves RPC requests from the Sample Clock Generator process. In this way the Tracking & Acquisition process influences the frequency of the clock generated by the Sample Clock Generator process. This example shows how the context is used for communication between threads inside the same process, whereas the RPC mechanism is used for communication between threads in different processes. The locking and unlocking of the context is required to avoid concurrent accesses to the variable delta. The lock in the slave thread op locks the context for read: other threads are still allowed to read from the context, but no other thread may write to it. The lock in the slave thread ip locks the context for write: no other thread is allowed to read from or write to the context until it is unlocked again.
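A rough Python analogue of this pattern (an assumption for illustration, not CoWare code): the Sample Clock Generator performs an RPC to obtain a new delta, accumulates it into a sawtooth, and a lock protects the context shared between the two slave threads. A single mutex stands in for the separate read and write locks described above.

```python
import threading

# Illustrative analogue of the RPC-plus-context pattern described in
# the text. Class and method names are assumptions, not CoWare APIs.

class TrackingAcquisition:
    def __init__(self):
        self._lock = threading.Lock()
        self._context = {"delta": 0.1}

    def update_delta(self, value):   # slave thread 'ip': writes the context
        with self._lock:             # lock context for write
            self._context["delta"] = value

    def get_delta(self):             # slave thread 'op': serves RPC requests
        with self._lock:             # lock context for read
            return self._context["delta"]

class SampleClockGenerator:
    def __init__(self, tracking, threshold=1.0):
        self.tracking = tracking
        self.threshold = threshold
        self.sawtooth = 0.0

    def tick(self):
        # RPC over the input port to obtain delta, then accumulate it
        # into the sawtooth until the threshold is exceeded.
        self.sawtooth += self.tracking.get_delta()
        if self.sawtooth >= self.threshold:
            self.sawtooth = 0.0
            return True              # would issue an RPC to the A/D converter
        return False

tracking = TrackingAcquisition()
tracking.update_delta(0.5)           # Tracking & Acquisition tunes the clock
clock = SampleClockGenerator(tracking)
print([clock.tick() for _ in range(4)])  # -> [False, True, False, True]
```

Raising or lowering delta changes how often `tick` fires, which is how the Tracking & Acquisition process influences the generated clock frequency.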

Each process is described in the language that is best fit for the characteristics of the function it implements.4 The data-flow blocks (NCO, Down-conversion, Decimation, Chip Matched Filter, Phase Correction, Correlator & Noise Estimator, and Sample Clock Generator) are described in Silage/DFL. The control-oriented blocks (Tracking & Acquisition, Frame Extraction and User Interface) are described in C. The description of the DSP blocks consists of approximately 900 lines of Silage/DFL code. The description of the control-oriented blocks consists of approximately 100 lines of C code.

At this moment it is not yet decided which process will be implemented on what kind of target processor, nor is it defined how the RPC communication will be implemented. However, the choice of the specification language for each process restricts the choice of the component compiler and in that sense partly determines the target processor. Hence, studying possible alternative mappings of a process onto a target processor may require the availability of a description of the process in more than one specification language, or a clear guess of the best solution.


URL: https://www.sciencedirect.com/science/article/pii/B9781558607026500375

Specifying Synthetic Instruments

C.T. Nadovich, in Synthetic Instruments, 2005

Functional Decomposition and Scope

When I defined a compound ordinate like Gain in terms of atomic ordinates, this was really quite similar to the classic functional decomposition that is used with programming languages like C. You start with a complex function, and then partially factor it into several subroutines. Some of these subfunctions are factored again, and those again, and so on down until you have functions that don't call any subroutines.1 Each function in such a decomposition represents a node in a tree, much in the same way as the <Measurement> elements represent nodes in the XML tree I have presented for describing measurement maps.

In a classic functional decomposition, each function can have parameters, and optionally return values. This allows us to pass information downward and upward in the functional tree. In the measurement map and calibration strategy schema I have outlined, information flow is implicit between the atoms and the compounds, relying on the fact that their interfaces fit. While it would be nice to believe that this fitting would happen spontaneously on its own, realistically I need a way to specify an interface.
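This down-and-up information flow is easy to see in ordinary code. The following sketch is hypothetical (the gain-measurement functions are invented, echoing the Gain example): parameters pass information down the functional tree, and return values pass it back up.

```python
# Illustrative sketch: parameters flow down the functional tree,
# return values flow back up. Function names are invented.

def measure_gain(stimulus_level):                   # root of the tree
    response = apply_stimulus(stimulus_level)       # down: parameter
    return compute_ratio(response, stimulus_level)  # up: return value

def apply_stimulus(level):                          # leaf: calls no subroutine
    return level * 3.0

def compute_ratio(output_level, input_level):       # leaf
    return output_level / input_level

print(measure_gain(2.0))  # -> 3.0
```

The explicit parameter lists here play the role that the `<ParameterList>` element plays in the measurement map: they name what is passed in from outside.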

I have used “name=” identifiers to label things within the measurement. Let's add an explicit <ParameterList> element to say what internal parameters are passed into the measurement from outside. The names associated with these parameters become placeholders for the value passed in. Here's the calibration strategy map for “Gain” restated with an explicit parameter list.

Example 9-7

Parameter list


In addition to their use as parameter labels, I have used element “name=” identifiers as local variables within the measurement, allowing us to refer to specific abscissa values from the <Collapse> block. It would be reasonable to expect that the implied scope of a local identifier is delimited at the <Measurement>, just like the scope of functional parameters.


URL: https://www.sciencedirect.com/science/article/pii/B9780750677837500127

Which data analysis tool shows the flow of information within an information system?

A data flow diagram (DFD) maps out the flow of information for any process or system. It uses defined symbols like rectangles, circles and arrows, plus short text labels, to show data inputs, outputs, storage points and the routes between each destination.

Which phase in the system life cycle involves gathering and analyzing data?

Analysis. This phase involves gathering all the specific requirements for the system to be designed, analyzing any existing system and the drawbacks that led to the decision to develop new software, and conducting a feasibility study of the new software system to be designed.

Which phase in the systems life cycle involves installing the new system?

Implementation Phase. Implementation includes user notification, user training, installation of hardware, installation of software onto production computers, and integration of the system into daily work processes.

What is the the first phase in the systems life cycle?

Phase 1 of the systems development life cycle involves a preliminary analysis: an initial phase at the start of a project that determines whether the concept is viable, identifies any proposed alternative solutions, evaluates costs and benefits, and ends with the submission of a preliminary plan with recommendations.