Second in a series of blogs from the Industrial Internet Consortium Trustworthiness Task Group. For a comprehensive look at trustworthiness foundations in IIoT, download our foundational document.
Trustworthiness is the degree of confidence one has that the system performs as expected. Its characteristics include safety, security, privacy, reliability, and resilience in the face of environmental disturbances, human errors, system faults, and attacks.
Trustworthiness, and confidence in that trustworthiness, are essential aspects of all types of industrial systems. Such systems are complex systems of systems that can exhibit emergent properties due to the interconnection and interactions of their subsystems. These subsystems can include information technology (IT), which focuses on data, and operational technology (OT), which uses data, sensors, and actuators to interact with the physical environment.
Incorrect system behavior can lead to loss of human life, environmental damage, interruption of critical infrastructure, or other consequences such as disclosure of sensitive data, destruction of equipment, economic loss, or damage to reputation. In the digital world, trust and trustworthiness are achieved when the trustworthiness characteristics reach levels appropriate for the context and evidence supports that those levels are actually being reached.
Here is a breakdown of the five characteristics defined by the IIC:
- Safety ensures that a system operates without unacceptable risk of physical injury or damage to the health of people, either directly or indirectly through damage to property or the environment. Nearly any damaging environmental event (e.g., pollution of soil, air, or water) presents a risk to human health, and safety implementations should reduce those risks. This requires analyzing risks, determining ranges of safe operation, and designing the system to operate within the constraints defined for safe operation. When safety accounts for security risks, additional measures may be needed, such as ensuring a supervisory system is separated from the system being controlled.
- Security protects a system from unintended or unauthorized access, change, or destruction. It ensures the confidentiality, integrity, and availability of data at rest, in motion, or in use, and protects control systems, applications, and services from inappropriate use or behavior. The priority of concerns depends on the context. In OT security, for example, availability and then integrity may take precedence over confidentiality, which is often the primary concern in IT security. Security is achieved by protecting endpoint devices, networking equipment, machine identities, virtual machines, containers, and applications and data.
- Reliability describes the ability of a system or component to perform its required functions under stated conditions for a specified period of time. This includes accounting for physical degradation, outdated software versions, and well-known potential malfunctions that result in frequent maintenance, replacement of end-of-life components, or software updates. Reliability enables uninterrupted operation of the system: an essential element in assuring that the system will meet expectations over time.
- Resilience describes the ability of a system or component to maintain an acceptable level of service in the face of disruption. In contrast to reliability, resilience addresses unexpected and unplanned system states that can result, for example, from human errors in operation or from an environmental event (loss of power, earthquake, etc.). The main purpose of resilience is to prevent or reduce the impact of disruptions that would otherwise damage the system or interrupt its operation.
- Privacy protects the right of individuals to control what information related to them may be collected, by whom it may be stored, to whom it may be disclosed, and for how long.
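The distinction between reliability and resilience above can be made concrete with two standard, widely used measures: the exponential reliability function (the probability of failure-free operation for a given period, assuming a constant failure rate) and steady-state availability, including the effect of redundancy, which is one simple way resilience is engineered. The MTBF and MTTR figures below are hypothetical and purely illustrative, not drawn from the IIC document:

```python
import math

# Hypothetical figures for a single controller, for illustration only.
MTBF_HOURS = 8760.0   # mean time between failures (one year)
MTTR_HOURS = 4.0      # mean time to repair

def reliability(t_hours: float, mtbf: float = MTBF_HOURS) -> float:
    """Probability of operating t hours without failure,
    assuming a constant failure rate (exponential model)."""
    return math.exp(-t_hours / mtbf)

def availability(mtbf: float = MTBF_HOURS, mttr: float = MTTR_HOURS) -> float:
    """Steady-state availability: the long-run fraction of time the system is up."""
    return mtbf / (mtbf + mttr)

def availability_with_redundancy(n: int, mtbf: float = MTBF_HOURS,
                                 mttr: float = MTTR_HOURS) -> float:
    """Availability of n independent redundant units (the system is up
    if at least one unit is up) -- a simple quantitative view of resilience."""
    unavailability = 1.0 - availability(mtbf, mttr)
    return 1.0 - unavailability ** n

print(f"Reliability over 720 h:   {reliability(720):.4f}")
print(f"Availability, 1 unit:     {availability():.6f}")
print(f"Availability, 2 redundant: {availability_with_redundancy(2):.9f}")
```

Reliability improves by making a single unit fail less often (raising MTBF); resilience improves by tolerating the failure when it happens (redundancy, fast recovery), which is why adding a second unit raises availability even though each unit's MTBF is unchanged.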
Trustworthiness Relates to a System
Trustworthiness characteristics interact dynamically and can enhance or inhibit each other. System design needs to take into consideration the interactions of these characteristics and balance their composition. Evidence about the tradeoffs and the rationale for the choices can assure that the system is trustworthy for the context of a specific industrial system.
Concerns in a factory are different from those in a hospital operating room. For each system, the designer must understand the trustworthiness considerations involved in the implementation, assembly, operation, and maintenance of the system, and must maintain the balance of that composition to preserve the system's trustworthiness.
Trustworthiness relates to the operational system. This means that the system of interest must be well defined since the definition of the system will determine the relevant requirements. For example, defining the trustworthy system of interest as a set of cameras in a chemical factory is different from considering trustworthiness of the entire factory.
Risks related to the cameras may include privacy or trade secret loss, prevention of safety or security monitoring, or possible security attack pivot points. When the full factory is the system of interest, it includes all the issues for the cameras and many more, such as a potential chemical explosion that could cause loss of life. The system of interest must be defined so that the appropriate concerns and tradeoffs are considered.
Trustworthiness in the IIoT is explored in a new IIC foundational document, “The Industrial Internet of Things Trustworthiness Framework Foundations.” Co-authored by members of the IIC Trustworthiness Task Group, the document defines and motivates trustworthiness, highlights the need to consider trustworthiness throughout the system lifecycle, and raises awareness of technologies, processes, and practices. It also highlights traceability and assurance of trustworthiness based on evidence.
The definition of trustworthiness above follows [IIC-Voc2020].