
Understanding and scoping process problems

Paul Harmon, in Business Process Change (Fourth Edition), 2019

Process Levels and Levels of Analysis

Another key concept is the idea of a process hierarchy and the use of levels to describe the subdivision of processes. We show an abstract process hierarchy in Figure 8.2 and have added notes on the left to suggest how a process analysis effort will tend to vary, depending on whether we are dealing with very large processes, mid-level processes, or specific activities or tasks.


Figure 8.2. Hierarchical decomposition of a value chain suggesting how “level of analysis” corresponds to process level.

As a generalization, we can usually divide the process hierarchy into three parts and associate problems and analysis techniques with specific levels. Broadly, one set of process analysis techniques is used to redesign or improve higher level processes. Another set is used on the types of process problems we find in the middle of the process hierarchy. Still another set of techniques is appropriate for processes at the bottom of the hierarchy. Figure 8.3 provides an overview of this three-part distinction.


Figure 8.3. Overview of the different levels of process analysis.

Thus the top part of the process hierarchy is usually associated with architecture problems and with problems of coordination between departments or functional units. In this case we focus on aligning inputs and outputs and write contracts to specify what Process A will need to deliver to its “customer” Process B.

Midsize problems usually occur in processes managed within a single department or at most a few departments. The problems often require that the processes be simplified or the sequences rearranged. Nonvalue-adding processes or subprocesses need to be removed; some activities need to be automated.

Low-level problems usually involve individual performers or software systems. They usually require a detailed task analysis. In some cases the business rules used by the performers or the systems need to be specified. Often training programs and job descriptions need to be developed.
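To make this three-part distinction concrete, here is a minimal Python sketch. It is not taken from the chapter; the process names, level boundaries, and analysis descriptions are illustrative assumptions. A process node knows its depth in the hierarchy, and a helper maps that depth to the kind of analysis described above.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Process:
    """A node in a process hierarchy: a value chain at the top, tasks at the bottom."""
    name: str
    children: List["Process"] = field(default_factory=list)
    parent: Optional["Process"] = None

    def add(self, child: "Process") -> "Process":
        child.parent = self
        self.children.append(child)
        return child

    @property
    def level(self) -> int:
        """Depth in the hierarchy; the value chain itself is level 0."""
        return 0 if self.parent is None else self.parent.level + 1


def analysis_focus(process: Process) -> str:
    """Suggest an analysis focus by hierarchy depth (boundaries are illustrative)."""
    if process.level <= 1:
        return "architecture; align inputs and outputs between units"
    if process.level <= 3:
        return "simplify, resequence, remove non-value-adding subprocesses, automate"
    return "task analysis, business rules, training and job descriptions"


# Example: a toy widget value chain decomposed down to the task level.
chain = Process("Widget value chain")                     # level 0
core = chain.add(Process("Order fulfillment"))            # level 1
proc = core.add(Process("Order entry"))                   # level 2
sub = proc.add(Process("Check customer credit"))          # level 3
task = sub.add(Process("Look up credit rating"))          # level 4

for p in (chain, core, proc, sub, task):
    print(f"level {p.level}: {p.name} -> {analysis_focus(p)}")
```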


URL: https://www.sciencedirect.com/science/article/pii/B978012815847000008X

Terminology and Definitions

N.A. Shneydor B.Sc., Dipl. Ingénieur, M.Sc., Ph.D., in Missile Guidance and Pursuit, 1998

1.1.3 Scope of this Book

In this book we shall deal with the upper two levels of the guidance process hierarchy, namely geometrical rules and guidance laws, and shall not delve into the topic of body control. There are two reasons for this.

The first reason is the wish to keep the book reasonably small, coupled with the fact that several good texts are available on the control of aircraft and missiles [2–5] and marine vehicles [6].

The second reason stems from the fact that body-control loops are faster, sometimes much faster, than guidance loops; in control-engineering terminology, body-control loops are high-bandwidth inner loops within lower-bandwidth guidance loops (Fig. 1.2). Indeed, in many works it is assumed that the inner loop is so fast that, for the purpose of preliminary outer-loop analysis, it can be ignored altogether. M is then represented in the guidance loop by zero-order dynamics, for which a_M ≡ a_Mc by definition.


Figure 1.2. Body-control loop within the guidance loop

However, when implementation and mechanization of guidance laws are studied, we shall not ignore the fact that body control loops do have dynamics; in some cases, the effects of nonlinearities will also be examined.
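As a rough illustration of the point about inner-loop bandwidth, the following Python sketch, which is not from the book and uses an assumed time constant and step command, contrasts the zero-order approximation a_M ≡ a_Mc with a simple first-order-lag model of the body-control loop.

```python
# Contrast the zero-order approximation a_M = a_Mc with a first-order-lag
# model of the body-control (inner) loop. The time constant tau and the
# commanded-acceleration profile are illustrative assumptions.

def simulate(tau: float, dt: float = 0.001, t_end: float = 1.0):
    """Return a list of (time, a_M) samples for a unit step command a_Mc."""
    a_m, t, samples = 0.0, 0.0, []
    while t <= t_end:
        a_mc = 1.0                          # commanded lateral acceleration (step)
        if tau == 0.0:
            a_m = a_mc                      # zero-order dynamics: a_M follows a_Mc exactly
        else:
            a_m += dt * (a_mc - a_m) / tau  # first-order lag: tau * da_M/dt = a_Mc - a_M
        samples.append((t, a_m))
        t += dt
    return samples


ideal = simulate(tau=0.0)
lagged = simulate(tau=0.2)                  # 0.2 s inner-loop time constant (assumed)
print("a_M at t = 0.2 s: ideal =", round(ideal[200][1], 3),
      ", lagged =", round(lagged[200][1], 3))
```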


URL: https://www.sciencedirect.com/science/article/pii/B978190427537450006X

Working with the Business Process Management (BPM) Life Cycle

Mark von Rosing, ... Anette Falk Bøgebjerg, in The Complete Business Process Handbook, 2015

Step 6: Process Planning and Design

As a direct continuation from defining and describing the overall process goals, planning and design steps are initiated with the purpose of designing new processes from scratch and/or planning the redesign and reengineering requirements of existing processes. The level of detail in this area also increases dramatically, as the design process slowly moves away from the high-level process landscape and into a much more detailed process landscape (Figure 7).


Figure 7. The process workflow connection diagram is a process matrix that shows the connectivity between the services delivered by business processes in the process landscape. This is a very powerful and important tool to use when designing an organization’s business processes as it shows how strategic, tactical, and operational service deliverables relate to one another.

Ref. 9.

The output of step 6 is consumed by step 7.
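A process workflow connection matrix of the kind described for Figure 7 can be prototyped as a simple lookup keyed by producing and consuming process. The sketch below is illustrative only; all process and service names are invented rather than taken from the handbook.

```python
# Rows are producing processes, columns are consuming processes, and each
# connection is labeled with the service deliverable that links them.
# All names here are invented for illustration.

connections = {
    ("Strategic planning", "Product development"): "approved product roadmap",
    ("Product development", "Order fulfillment"): "released product specification",
    ("Order fulfillment", "Customer service"): "delivery confirmation",
}

processes = sorted({p for pair in connections for p in pair})
width = max(len(p) for p in processes) + 2

print("Connectivity matrix (X = service deliverable flows from row to column):")
print(" " * width + "".join(p.ljust(width) for p in processes))
for producer in processes:
    cells = ["X".ljust(width) if (producer, consumer) in connections else "-".ljust(width)
             for consumer in processes]
    print(producer.ljust(width) + "".join(cells))

print("\nService deliverables behind each connection:")
for (producer, consumer), service in connections.items():
    print(f"  {producer} -> {consumer}: {service}")
```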

Typical tasks that are done within this step:

Determine the need for new main, management, and/or supporting (classification of) processes

Organize and structure process hierarchy

Determine and define each required process level

Gather and categorize process steps, process activities, and events and gateways

Collect information around process meta objects

Typical templates that are used:

Information Map and/or Matrix

Process Map and/or Matrix

Object Map and/or Matrix

BPM Notations Map and/or Matrix

Role Map and/or Matrix

Owner Map and/or Matrix

Requirement Map and/or Matrix

Workflow Map and/or Matrix

Application Service Map and/or Matrix

Data Service Map and/or Matrix

Application Rule Map and/or Matrix

Data Rule Map and/or Matrix

Typical BPM CoE roles involved:

Process eXperts

Process Engineers

Enterprise Architects


URL: https://www.sciencedirect.com/science/article/pii/B9780127999593000148

Concepts and Security Model

In SAP Security Configuration and Deployment, 2009

Objects

Objects are assigned to roles and use the same authorization levels as tasks and roles (that is, corporate, organization, process, sub-process, control) to set up organization and process hierarchies. When assigning a role to an object, the object authorization level must be the same or higher than the role authorization level. It is important to note that this object authorization level determines the level of data access within an organization and process hierarchy for SAP Process Control; and it is a different concept than the authorization object for SAP Web AS ABAP systems.

To illustrate this principle, let's compare two roles with different organizational levels, as shown in Table 2.2. The Internal Corporate Control Manager is a corporate-level role, so it can view all organizations within the corporation as well as create or modify the organization hierarchy. The Organization Owner is an organization-level role, so it can only view the assigned organization hierarchy and is not allowed to create or modify it. (A minimal sketch of this level check follows Table 2.2.)

Table 2.2. An Example of Object Authorization

Role | Role Authorization | Task | Task Hierarchy | Object
Internal Corporate Control Manager | Edit Hierarchy | Corporate Functions | Corporate | Data Level access
Organization Owner | Display Hierarchy | Organization Functions | Organization | Data Level access
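The level check described above can be expressed directly in code. The following sketch illustrates the stated rule only; it is not SAP code, and the enum ordering simply follows the list of authorization levels given earlier.

```python
from enum import IntEnum


class AuthLevel(IntEnum):
    """Authorization levels, ordered from narrowest to broadest data access
    (following the list quoted above: corporate, organization, process,
    sub-process, control)."""
    CONTROL = 1
    SUB_PROCESS = 2
    PROCESS = 3
    ORGANIZATION = 4
    CORPORATE = 5


def can_assign(role_level: AuthLevel, object_level: AuthLevel) -> bool:
    """An object may be assigned to a role only if the object's authorization
    level is the same as or higher (broader) than the role's level."""
    return object_level >= role_level


# Examples in the spirit of Table 2.2 (illustrative only):
print(can_assign(AuthLevel.ORGANIZATION, AuthLevel.CORPORATE))   # True: object is broader
print(can_assign(AuthLevel.CORPORATE, AuthLevel.ORGANIZATION))   # False: object is narrower
```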


URL: https://www.sciencedirect.com/science/article/pii/B9781597492843000028

Service-Oriented Architecture

Fred A. Cummins, in Building the Agile Enterprise, 2009

A Framework Example

The enhanced Telecom Operations Map (eTOM) from the TeleManagement Forum (TMF) is a widely recognized industry framework. It is a business process framework for all processes of a telecommunications service provider. eTOM defines a process hierarchy similar to that previously described for the service-oriented analysis. The TMF has also defined a companion enterprise data model called Shared Information and Data (SID) that supports the Enterprise Logical Data Model requirement discussed in Chapter 5.

Figure 2.9 illustrates the eTOM framework at the enterprise level. It is described as a process framework that defines the business processes of the enterprise in a hierarchy. There are three major process categories: (1) operations, (2) strategy, infrastructure, and product, and (3) enterprise management. These are described as level-zero processes.


Figure 2.9. eTOM Framework.

The Operations category reflects the primary business operations. The Strategy, Infrastructure, and Product segment defines processes for changes to the business; that aspect of agile enterprise architecture is addressed in Chapters 8 and 9 of this book. The processes in Enterprise Management are typically viewed as support services—those processes that are part of managing the enterprise, such as finance and human resources, but are not a direct part of the value chain.

The Operations and the Strategy, Infrastructure, and Product categories of Figure 2.9 are each divided by vertical and horizontal partitions described as level 1 processes. The vertical partitions reflect functional capabilities. The horizontal partitions reflect primary enterprise objectives that cut across the functional capabilities. For example, customer relationship management (CRM) is an enterprise objective that requires participation and support from each of the functional capabilities. These objectives are optimized operationally in the Operations segment and optimized from a business change perspective in the Strategy, Infrastructure, and Product segment.

Figure 2.10 shows more detail for the Operations process. These level 2 processes are shown at the intersections of the vertical and horizontal level 1 processes; each is in both a horizontal and a vertical level 1 process within the eTOM specification. Each of these level 2 processes is further detailed in subprocesses. Note that some level 2 processes span level 1 processes; these are effectively shared capabilities that may represent either shared work management service units or capabilities that can be further broken down to define shared operational service units. A similar breakdown is defined for the Strategy, Infrastructure, and Product level 1 process. More detailed breakdowns also exist for the Enterprise Management process. eTOM process models provide additional insights on capability requirements and the contexts in which they are used.


Figure 2.10. eTOM Operations Level 2 Processes.
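Because each level 2 process sits at the intersection of one vertical and one horizontal level 1 process, the structure can be modeled as a mapping keyed by that pair. The sketch below is illustrative; the handful of process names used are examples, not a complete or authoritative rendering of the eTOM catalog.

```python
from collections import defaultdict

# (vertical level 1, horizontal level 1) -> level 2 processes at that intersection.
# Names are examples only, not the full eTOM specification.
level2 = {
    ("Fulfillment", "Customer Relationship Management"): ["Order Handling"],
    ("Assurance", "Customer Relationship Management"): ["Problem Handling"],
    ("Fulfillment", "Service Management & Operations"): ["Service Configuration & Activation"],
}

# Group level 2 processes by their horizontal parent, mirroring how an objective
# such as CRM cuts across the functional (vertical) capabilities.
by_horizontal = defaultdict(list)
for (vertical, horizontal), processes in level2.items():
    for name in processes:
        by_horizontal[horizontal].append((vertical, name))

for horizontal, entries in by_horizontal.items():
    print(horizontal)
    for vertical, name in entries:
        print(f"  [{vertical}] {name}")
```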


URL: https://www.sciencedirect.com/science/article/pii/B9780123744456000029

Frameworks

Tim Weilkiens, ... Andrea Grass, in OCEB Certification Guide, 2011

Publisher Summary

This chapter discusses frameworks and basic terms from this area such as quality or regulation, principle, and guideline. OCEB Fundamental comprises frameworks on processes, quality, management, and metrics, as well as regulations. In the concrete project environment, there may be other frameworks that are of relevance. Process frameworks are reference models that support the description, assessment, and optimization of business processes. They usually specify process hierarchies to classify processes. Specific process frameworks addressed in the Fundamental certification include the APQC Process Classification Framework (PCF), the Supply Chain Operation Reference Model (SCOR), and the Value Reference Model (VRM). Quality frameworks support the improvement or management of the quality of a product or service. In addition to the basic aspects the chapter discusses the Business Process Maturity Model (BPMM), the Six Sigma quality methodology, the ISO-9000 standards, and the Toyota Production System (TPS). Regulations are statutory provisions. Business processes must adhere to them or new business processes must be implemented due to regulations. Regulations addressed by OCEB Fundamental refer to the financial industry only, but they are considered representative of further regulations from other industries. Management frameworks specify best practices, guidelines, and tools that support management in its work and monitoring tasks.


URL: https://www.sciencedirect.com/science/article/pii/B9780123869852000010

Process Tagging—A Process Classification and Categorization Concept

Mark von Rosing, ... Jeanne W. Ross, in The Complete Business Process Handbook, 2015

The Nature of Process Decomposition

One of the fundamental difficulties with modern process practices for understanding and representing business with structured (graphical) process models revolves around the analysis and representation of work, the representation of actions involving mental or physical effort done to achieve a purpose or result. The core of the difficulty with describing work is that there is little clarity and less agreement about what is meant by work or with respect to how to organize its descriptions in ways that are meaningful, repeatable, and reusable. Methodologies, most of which are focused on software implementation of behaviour, talk about using a variety of methods to analyze and design workflows and processes within an organization, using such terms as “process,” “sub-process,” “activity,” “task,” “procedure,” “transaction,” or “step” to describe the way work is executed but without distinguishing the nature of one from another. Indeed, most attempts at clarification simply classify process into levels of detail. By failing to offer definitions whose perspective and context are clear, they unfortunately fail before they start.

It may be that a factor with existing approaches to describing work is that they are based on the old assembly line thinking invented by automotive manufacturing, and that this approach does not directly scale to address their application in a different context.

When capturing the details of work on an assembly line, the place where the flow starts and stops is clear. Owing to the tangible nature of the work involved, whether modeling a single workstation or the entire line, both the nature of the work and its scope are clear. Anyone can point out and relate to where the work starts and where it ends. Work starts at the factory door or at the start of the line and ends with a completed product. In addition, whether building subassemblies out of individual components or doing final assembly of a vehicle, the items being manipulated are clear. This clarity does not easily transfer from the physical world of transformational work to the more conceptual world that is aimed at processing information.

We have already covered this in the chapter entitled “What Is BPM,” but the first signs of the problem appear as soon as we examine the literature, which is full of definitions for process. Although a small selection of the more commonly used definitions shows a certain level of consistency, none of them offers the clarity needed to provide testable criteria, that is, criteria an analyst can use to be clear about where the work being documented is to start, where it is to end, or what level of representation is applicable to the business problem at hand.

The core idea of a process is that any piece of definable work will always produce a specific product (or economic service); i.e., the reason for the existence of the process is the output of the product or service that it produces. In our context, a defining attribute of a process is the output it is designed to create: each time the process is executed, it creates a new instance of its product or service, and what it is designed to produce remains consistent each time the process is performed.
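A minimal sketch of this idea, with invented names and no claim to be the handbook's model: a process is identified by the kind of output it produces, and each execution yields a new instance of that same kind.

```python
from dataclasses import dataclass
from itertools import count


@dataclass(frozen=True)
class Output:
    """One instance of the product or service a process is designed to produce."""
    kind: str
    instance_id: int


class Process:
    """A process is identified by the kind of output it is designed to create;
    every execution yields a new instance of that same kind."""

    def __init__(self, name: str, output_kind: str):
        self.name = name
        self.output_kind = output_kind
        self._ids = count(1)

    def execute(self) -> Output:
        return Output(kind=self.output_kind, instance_id=next(self._ids))


invoicing = Process("Issue customer invoice", output_kind="invoice")
print(invoicing.execute())   # Output(kind='invoice', instance_id=1)
print(invoicing.execute())   # a new instance, same kind of output
```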

Consider the following:

1.

“A process is a group of related activities that together creates a result of value for clients.”7 This tells us that there is some sort of relationship between “process” and “activity” without giving any illumination as to what either is, or providing tests that we can use to any great effect.

2.

“A business process is a series of steps designed to produce a product or service.”8 This definition tells us that there is a relationship between “process” and “steps” but leaves us no wiser.

3.

“A business process is a series of logically connected business events and the logical connection is to a bigger scenario such as ‘source to pay.’”9 This view of the subject says that a process and an event are the same thing, while indicating that a series of processes is part of some larger idea.

4.

“A process describes a sequence or flow of activities in an organization with the objective of carrying out work. Processes can be defined at any level from enterprise-wide processes to processes performed by a single person. Low-level processes can be grouped together to achieve a common business goal.”10 This is just another way of saying that a process can be anything you want; it does not help if one is trying to create a testable, repeatable specification.

In the end, all of these definitions offer a view of process that is akin to a matryoshka or Russian nesting doll of decreasing size placed one inside the other with no way to distinguish where the dolls are in the hierarchy without having other dolls in the set to provide a basis for comparison.

Figure 5 shows a set of five nested matryoshka dolls and a set of five undifferentiated processes that are similarly nested. In the case of either the dolls or any of the processes, it is not possible to distinguish members of the set without additional information. This lack of information creates a challenge for making the documentation and representation of work a repeatable process, one where the start and stop of the description are meaningful and result in a complete specification, and where the level of description is suited to the problem being addressed and the result desired.


Figure 5. Nested dolls and nested processes.

At its simplest level, APQC’s Process Classification Framework (PCF) is a list that organizations use to define work processes comprehensively and without redundancies. The goal of the PCF is to create an inventory of the processes practiced by most organizations, categorize them, and align them according to a standard system.

Processes are organized into levels. Level 1 processes are a simple categorization of service. Level 2 processes capture more detail within the same process, and so on.

In counterpoint to this, the SAP ASAP (Accelerated SAP) process hierarchy makes a partially useful attempt to distinguish among levels of complexity in processes. The SAP process hierarchy arguably offers the most advanced thinking on the classification of processes within a hierarchy, and therefore offers an excellent starting point. Key members of this hierarchy are as follows:

1.

Level 1 Process Areas: A high level aggregation of deliverable processes.

2.

Level 2 Process Groups: “A bundle of processes that belong to the same area of responsibility dealing with similar tasks and activities for functional or other reasons”.11 This suggests that processes can be bundled into arbitrary classifications, i.e., “for functional or other reasons,” a definition that might have been useful except that the bundling can be for any arbitrary reason.

3.

Level 3 business processes: “The business process is the level that aggregates business-oriented functions or steps into a unit that is meaningful and comprehensive in the sense that the steps or functions incorporated are essential to fulfill a business mission-related task; i.e., a business process is defined by steps that transform an input into an enriched output.”12 Without knowing what a “business mission-related task” is and without clarity on how “steps or functions” are organized, this definition does not directly advance our understanding.

4.

Level 4 process steps: “An activity performed by a user or a piece of software together with other process steps forming a business process.”

The SAP process hierarchy specification offers specific guidance about the creation of process steps, specifying that:

a.

A process step is an activity related to exactly one object (e.g., a human, a sheet of paper, a purchase order (system), etc.).

b.

A process step is typically executed by one person and documented using an appropriate representation of the object (paper, data in an IT system, etc.).

c.

From a user interaction point of view, a process step is a single work task in a causal workflow without role change. A process step is typically identified by the fact that the task owner has all necessary responsibilities to execute the task. A process step can be performed by a single human being or by interaction between human/system and system/system.”

Although it should be evident that work can be performed by things other than “users” or “software,” such as by machines, not only is the intent of this definition with its supporting tests exactly the type of guidance we are looking for, it appears to be extremely useful with regard to understanding the nature of process. We will also choose not to be confused by the use of the term “activity” in this definition and assume what is meant is “work.”

5.

Level 5 activities: “Activities are the lowest granularity for business process modeling and reflect the single actions a user or a system performs to fulfill the process step; i.e., filling in the fields of a special mask consists of activities as each field has to be filled to end the step.” Again, other than being transactional work-centric and focused on “modeling,” this definition provides clear guidance about the nature of work at this level of granularity.

The challenge, then, is in part that these definitions need to be expanded to embrace all forms of work—transformational, transactional, and tacit—and to provide greater clarity at all levels of this hierarchy, so as to create an integrated set of testable definitions with supporting criteria that two or more analysts can apply independently and still produce consistent, repeatable descriptions of the decomposition of work. Table 1 summarizes the key schemes for classifying process detail and decomposition; a small sketch of the process-step criteria quoted above follows the table.

Table 1. Overview of Different Views of Process Levels

Level | APQC PCF | LEAD Process Levels | SAP Process Levels | SAP Solution Manager | SCOR
1 | Category | Process area | Business area | |
2 | Process Group | Process Group | Process Group | |
3 | Process | Business process | Business process | Scenario | Level 1 / Level 2
4 | Activity | Process step | Business process variant | Process | Level 3
5 | | Process activity | Process step | Process step | Level 4
6 | | | Process activity | |
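The process-step criteria quoted above (exactly one object, a single work task without role change) are the kind of testable rules an analyst can apply mechanically. The following sketch encodes them with assumed field names; it is an illustration, not part of the SAP specification.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProcessStep:
    """A candidate level 4 process step, with fields mirroring the criteria
    quoted above: exactly one object acted on, and no role change mid-step."""
    name: str
    objects: List[str] = field(default_factory=list)   # things the step acts on
    roles: List[str] = field(default_factory=list)     # performers involved


def violated_criteria(step: ProcessStep) -> List[str]:
    """Return the quoted criteria this candidate step fails (illustrative check)."""
    problems = []
    if len(step.objects) != 1:
        problems.append("a process step relates to exactly one object")
    if len(set(step.roles)) > 1:
        problems.append("a process step is a single work task without role change")
    return problems


ok = ProcessStep("Post goods receipt",
                 objects=["purchase order"], roles=["warehouse clerk"])
bad = ProcessStep("Receive and approve goods",
                  objects=["purchase order", "invoice"],
                  roles=["warehouse clerk", "accounts payable"])

print(violated_criteria(ok))    # [] -> acceptable as a single process step
print(violated_criteria(bad))   # two violations -> split into separate steps
```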


URL: https://www.sciencedirect.com/science/article/pii/B9780127999593000082

Getting Started

Tim Weilkiens, ... Kim Nena Duggen, in OCEB 2 Certification Guide (Second Edition), 2016

Business Goals

The business goals topic area entails concepts of business administration, marketing, and project management. Anyone who works in the business process environment should have basic knowledge of organizational forms of enterprises, market environment analyses, marketing, financial key figures, and business analysis methods.

Business Processes

Independent of the standard BPMN, the two topic areas, concepts and fundamentals of business processes and concepts and fundamentals of business process management, require basic knowledge of business processes. Not only the What, but also the How is important, for example, how to discover business processes or present business process hierarchies, and how to handle the various degrees of abstraction in the description. By aligning the business processes with the business goals, a link is established between the first and the third topic areas.

BPM

The business process management concepts and fundamentals topic area deals with the handling of business processes in enterprises, the impacts of process-focused structures, and the various approaches of business process management such as Business Process Reengineering (BPR) or Total Quality Management (TQM). Another topic is again an OMG standard: Business Process Maturity Model (BPMM). This is a maturity model for business processes similar to Capability Maturity Model Integration (CMMI) for software and system development.

BMM

The business motivation modeling topic area addresses another OMG specification. The Business Motivation Model (BMM) is a standard to describe business plans. It defines the basic artifacts, their characteristics, and their interrelationships. This includes vision, mission, strategy, business rules, objectives, influencers, and appraisals.

BPMN

With a total of 40%, the business process modeling concepts and skills topic area accounts for the largest part of the OCEB 2 Fundamental certification. The OMG standard, BPMN, predominates here. BPMN fundamentals and the diagram elements of the process diagram are required here. You not only need to know what a specific element represents, but must also be able to interpret a BPMN diagram. You must be able to answer contextual questions on a process diagram with real subject-matter knowledge.

Frameworks

The last topic area of the OCEB2 Fundamental certification deals with process quality, governance, and metrics frameworks.


URL: https://www.sciencedirect.com/science/article/pii/B9780128053522000017

Farm automation

K. Rupabanta Singh, ... Sourav K. Giri, in AI, Edge and IoT-based Smart Agriculture, 2022

3 Architecture of edge computing and IoT (E-IoT) platform

Edge computing includes many technologies that take advantage of computing resources that are available outside conventional data-centric cloud computing. The key components of edge computing and the IoT platform are explained below [25, 26]:

i.

Edge devices: IoT sensors that generate data.

ii.

Edge nodes: A device capable of routing network traffic to an edge gateway, edge server, or edge device where edge computing can execute.

iii.

Edge server: A general-purpose computer located at a remote site that runs farm application tasks and common services.

iv.

Edge gateway: An edge server that can perform different operations such as firewall protection, network termination, wireless connection, tunneling, or protocol translation.

v.

Cloud: Storage for applications and machine learning models. It can also host and run application workloads used to manage various edge nodes.

The requirements for sensing, identifying, and controlling the smart things of an IoT-based system using sensors and actuators make its architecture different from that of other information systems. The author [27] defines six viewpoints:

i.

Domain model viewpoint: This is used to provide a common view regarding the main IoT components and their interactions. Also, the main physical entities and key actors are concerned.

ii.

Business process hierarchy viewpoint: The outlines of business processes are shown in the business process hierarchy view. It shows interrelationships between business processes.

iii.

IoT layer viewpoint: It categorizes the functionalities of each IoT component from a technical outlook.

iv.

Deployment viewpoint: It shows the position of hardware and software components and also shows how these components are deployed.

v.

Information viewpoint: The entities of the IoT-based systems are modeled in this view.

vi.

Interoperability endpoint viewpoint: The interoperability endpoint view defines the interfaces for interacting with legacy and IoT systems.

The edge computing architectures describe how edge computing overcomes the limitations of conventional data-centric cloud computing in an IoT-based system in terms of latency, privacy, bandwidth, and cost [28, 29]. Edge computing preprocesses the data generated by different IoT sensors and then sends the result to the centralized cloud server [30]. Because of this preprocessing, less data needs to be sent to the remote cloud. Most of the computation is done at the edge of the local network, that is, near the place where the IoT devices collect the data.
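A minimal sketch of that preprocessing step, with assumed sensor fields and aggregation rule rather than any particular platform's API: a window of raw readings is summarized locally so that only a small payload travels to the remote cloud.

```python
from statistics import mean
from typing import Dict, List


def preprocess_at_edge(readings: List[Dict[str, float]]) -> Dict[str, float]:
    """Collapse a window of raw soil-moisture readings into one summary record."""
    values = [r["soil_moisture"] for r in readings]
    return {
        "count": len(values),
        "mean_soil_moisture": round(mean(values), 2),
        "min_soil_moisture": min(values),
        "max_soil_moisture": max(values),
    }


def send_to_cloud(payload: Dict[str, float]) -> None:
    """Stand-in for the upload step; a real system would use MQTT or HTTP here."""
    print("uploading to cloud:", payload)


# One minute of 1 Hz readings (60 raw records) becomes a single summary record.
raw_window = [{"soil_moisture": 30.0 + 0.1 * i} for i in range(60)]
send_to_cloud(preprocess_at_edge(raw_window))
```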

The five edge computing reference architectures are given below:

3.1 FAR-edge RA

This architecture was developed with three layers. It is a conceptual framework for developing and implementing the FAR-edge project platform [31] depicted in Fig. 5.


Fig. 5. FAR-edge reference architecture.

Farm layer: It is the bottom layer of the architecture and it includes all the IoT sensors, smart machines, and actuators presenting the edge node.

Edge node layer: All application software developed to perform tasks such as real-time data analytics are placed in this edge layer that contains edge gateways, computing devices, and blockchain technologies.

Cloud server layer: It is the remote cloud with a cloud server and various services such as monitoring and managing the resources of the architecture.

3.2 Edge computing reference architecture 2.0

The edge computing RA 2.0 was designed based on ISO/IEC/IEEE 42010:2011 [32] by the Industrial Internet Alliance (IIA) and the Edge Computing Consortium. The edge computing reference architecture 2.0 has four layers with open interfaces and an emphasis on intelligent services, as shown in Fig. 6.


Fig. 6. Edge computing reference architecture 2.0.

Smart services: This is the topmost layer of the architecture containing the development service framework and the operation service framework that ensure intelligent coordination between service development and deployment.

Service fabric: This layer defines the technological processes, control parameters, tasks, and path plans for processing and executing the fast deployment of service policies.

Connectivity and computing fabric: This layer is responsible for deploying activities and coordinating between the computational resource services.

Edge computing node: This layer provides compatibility with different types of connections in real-time processing and response capacities.

3.3 Industrial Internet Consortium reference architecture

The Industrial Internet Consortium RA was developed by the Industrial Internet Consortium and is based on ISO/IEC/IEEE 42010:2011 [32], as shown in Fig. 7.


Fig. 7. Industrial Internet Consortium reference architecture.

There are three layers in this architecture:

Edge: The edge layer collects data from the edge nodes through the proximity network, which includes the breadth of distribution, location, scope of governance, and nature of the proximity network.

Platform: The execution and transfer of control commands from the bottom layer to the top layer are performed in this part. Grouping the processes and analyzing the data flows are the main functions of this layer.

Enterprise: This top layer hosts special applications such as operation management, end user interfaces, and decision support systems. It generates the control commands for the edge layer and the platform layer.

3.4 INTELSAP reference architecture

This reference architecture, developed by INTEL and SAP, allows fast and flexible development of IoT projects [33]. Fig. 8 shows the three blocks of the architecture [34].


Fig. 8. INTEL-SAP reference architecture.

Edge endpoint: It includes the IoT sensors and IoT devices that generate the data, which is sent to the next block for analysis and processing.

Edge gateway: It performs the authentication with a device certificate and the data path to the remote cloud is established.

Cloud: It performs the verification process of the Intel LPID signature and the ownership of the IoT devices for registration. When the verification is successfully completed, it establishes a path between the IoT devices and the cloud.

3.5 Global edge computing reference architecture

This reference architecture has three layers and performs the data encryption process at the bottom layer. Blockchain technologies have been applied to improve the security in this architecture. It inherits some features from the above-mentioned reference architecture, as shown in Fig. 9.


Fig. 9. Global edge computing reference architecture.

IoT layer: It is the bottom layer of the architecture and it includes sensors, actuators, controllers, and a gateway for IoT environments.

Edge layer: The middle layer of the architecture performs the filtration and preprocessing of the data generated by the IoT sensors and devices.

Business solution layer: This is the top layer of the reference architecture and it includes analytics, cloud management, authentication, knowledge base, and APIs for performing various tasks such as data analysis, authentication, etc.

The special characteristics of the global edge computing architecture, which are not available in other architectures, are data encryption and open source. Blockchain technology is only used in the FAR-edge and global edge computing architecture. The ECC, IIC, and global edge computing architectures are international standards, but the FAR-edge and INTEL SAP architectures are not.

The Farm Automation system architecture has four layers: the Physical Layer, Edge Layer, Cloud Layer, and Network Communication Layer, as shown in Fig. 10 [26, 35, 36].


Fig. 10. Farm automation system reference architecture.

i.

Physical Layer

The physical layer includes IoT sensors and gateway devices placed on farmland. These devices collect real-time data and send it to the edge layer for processing.

ii.

Edge Layer

Edge computing is performed in this layer near the end devices for real-time operation.

iii.

Cloud Layer

It is the topmost layer providing various services commonly used for data storage. It is located far from the farmland and needs an Internet connection.

iv.

Network Communication Layer

It is the communication gateway that connects the physical layer and the edge layer.
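The four layers can be pictured as a simple data path. The sketch below uses invented class and field names to wire them together; it is a conceptual illustration, not an implementation of any of the reference architectures above.

```python
from typing import Dict, List


class PhysicalLayer:
    """IoT sensors on the farmland producing raw readings."""
    def read(self) -> List[Dict[str, float]]:
        return [{"soil_moisture": 30.0 + i} for i in range(3)]


class NetworkCommunicationLayer:
    """Gateway connecting the physical layer to the edge layer."""
    def forward(self, readings: List[Dict[str, float]]) -> List[Dict[str, float]]:
        return readings  # a real gateway would handle protocols, buffering, retries


class EdgeLayer:
    """Near-device processing for real-time operation."""
    def process(self, readings: List[Dict[str, float]]) -> Dict[str, float]:
        values = [r["soil_moisture"] for r in readings]
        return {"mean_soil_moisture": sum(values) / len(values)}


class CloudLayer:
    """Remote storage and services reached over the Internet."""
    def store(self, summary: Dict[str, float]) -> None:
        print("stored in cloud:", summary)


# End-to-end flow through the four layers.
physical, network, edge, cloud = (PhysicalLayer(), NetworkCommunicationLayer(),
                                  EdgeLayer(), CloudLayer())
cloud.store(edge.process(network.forward(physical.read())))
```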


URL: https://www.sciencedirect.com/science/article/pii/B9780128236949000244

Process management

Paul Harmon, in Business Process Change (Fourth Edition), 2019

Matrix Management

Having defined functional and process management, let’s consider how an organization might combine the strengths of the two approaches at the top of the organization. Recently, leading organizations have begun to establish some kind of process management hierarchy that, at least at the upper level, is independent of the organization’s functional hierarchy. The top position in a process hierarchy is a manager who is responsible for an entire value chain. Depending on the complexity of the organization, the value chain manager might have other process managers reporting to him or her. This approach typically results in a matrix organization like the one pictured in Figure 6.8.


Figure 6.8. Matrix organization with independent senior functional and process managers.

In Figure 6.8 we show a company like the one pictured earlier with three functional units. In this case, however, another senior manager has been added, and this individual is responsible for the success of the widget value chain. Different organizations allocate authority in different ways. For example, the widget process manager may function only in an advisory capacity. In this case he or she would convene meetings to discuss the flow of the Widget value chain. In such a situation the sales supervisor would still owe his or her primary allegiance to the VP of sales, and that individual would still be responsible for paying, evaluating, and promoting the sales supervisor. Key to making this approach work is to think of the management of the widget value chain as a team effort. In effect, each supervisor with management responsibility for a process that falls inside the widget value chain is a member of the widget value chain management team.

Other companies give the widget value chain manager more responsibility. In that case the sales supervisor might report to both the widget value chain manager and to the VP of sales. Each senior manager might contribute to the sales supervisor’s evaluations and each might contribute to the individual’s bonus, and so forth.

Figure 6.9 provides a continuum that is modified from one originally developed by the Project Management Institute (PMI). PMI proposed this continuum to contrast organizations that focused on functional structures and those that emphasized projects. We use it to compare functional and process organizations. In either case the area between the extremes describes the type of matrix organization that a given company might institute.


Figure 6.9. Types of organizational structure.

Modified from the Project Management Institute’s classification of five organization types.

The type of matrix an organization has is determined by examining the authority and the resources that senior management allocates to specific managers. For example, in a weak matrix organization functional managers might actually “own” the employees, have full control over all budgets and employee incentives, and deal with all support organizations. In this situation the process manager would be little more than the team leader who gets team members to talk about problems and tries to resolve problems by means of persuasion.

In the opposite extreme the process manager might “own” the employees and control their salaries and incentives. In the middle, which is more typical, the departmental head would “own” the employees and have a budget for them. The process manager might have control of the budget for support processes, like IT, and have money to provide incentives for employees. In this case employee evaluations would be undertaken by both the departmental and the project manager, each using their own criteria.

Most organizations seem to be trying to establish a position in the middle of the continuum. They keep the functional or departmental units to oversee professional standards within disciplines and to manage personnel matters. Thus the VP of sales is probably responsible for hiring the sales supervisor shown in Figure 6.8 and for evaluating his or her performance and assigning raises and bonuses. The VP of sales is responsible for maintaining high sales standards within the organization. On the other hand, the ultimate evaluation of the sales supervisor comes from the SVP of the widget process. The sales supervisor is responsible for achieving results from the widget sales process and that is the ultimate basis for his or her evaluation. In a sense the heads of departments meet with the SVP of the widget process and form a high-level process management team.
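In the middle of the continuum, the dual evaluation described above amounts to combining two scored assessments. The sketch below uses invented weights, scores, and criteria purely to illustrate that arrangement; it is not a method prescribed by the chapter.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Evaluation:
    evaluator: str   # e.g., the functional manager or the process manager
    criteria: str    # what this manager is judging, using their own criteria
    score: float     # 0.0 to 1.0
    weight: float    # share of the combined evaluation


def combined_score(evaluations: List[Evaluation]) -> float:
    """Weighted blend of the evaluations contributed by each manager."""
    total_weight = sum(e.weight for e in evaluations)
    return sum(e.score * e.weight for e in evaluations) / total_weight


# Illustrative review of a sales supervisor in a matrix organization.
sales_supervisor_review = [
    Evaluation("VP of Sales", "professional sales standards, personnel matters", 0.85, 0.4),
    Evaluation("SVP, Widget value chain", "results of the widget sales process", 0.70, 0.6),
]

print(round(combined_score(sales_supervisor_review), 2))   # blend of both managers' views
```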


URL: https://www.sciencedirect.com/science/article/pii/B9780128158470000066