1. Introduction
Cloud computing has transformed the way many companies operate
[1], particularly as the volume of data they handle has grown rapidly. At the same time, demand for private, latency-sensitive services has risen sharply. Cloud computing platforms provide a range of centralized services, but with significant drawbacks
[2][3]. Communication between clouds and their endpoints inevitably suffers long and unpredictable delays, which is problematic for time-sensitive services
[4]. The stakes are high when the information infrastructure or network connections are disrupted, and privacy concerns arise as well. In response, the fog computing
[5] concept was developed to improve computation, security, and privacy in the Cloud-Edge model that is now the industry standard.
The proliferation of Internet of Things (IoT) devices and the increasing demand for low-latency applications have driven the rise of fog computing as a solution for decentralized data processing. However, trust and security in fog computing environments pose significant challenges: malicious nodes, unauthorized access, and data breaches can compromise the integrity and reliability of data processing. There is therefore strong motivation to address these gaps in trust management and to enhance the overall security of fog computing. Optimizing network performance and reducing latency are also crucial to ensuring a seamless user experience, and improving the efficiency of data transmission and communication among fog nodes can significantly enhance the performance of fog computing networks
[6].
Gateways, routers, switches, and even professionally installed conventional servers may all be considered fog devices
[7]. In addition, fog computing is widely regarded as an innovative green platform, offering sustainability and strong security advantages at a time when large emission reductions are required. Several fog nodes (FNs) in a fog computing system may run on renewable energy, and their sites may be dispersed over a wide area. The FNs can operate autonomously yet in concert under a well-defined coordination scheme, significantly reducing the computational load on data center infrastructure. Fog computing allows processing to be separated or filtered at an intermediate layer between the endpoint and the cloud, which may improve QoS and reduce costs
[8]. As discussed in the following subsection, fog computing is widely regarded as a desirable way to address the emerging problems associated with the Internet of Things. It is a practical approach because it interconnects local devices, digital equipment, wireless access points, and the internet. This interconnection, however, exposes the system to serious security and privacy breaches, such as disclosure of client data storage locations, leakage of sensitive information, and theft of personal accounts. Cisco first investigated fog computing as a way to extend cloud functionality to the system's periphery. Fog computing has since emerged as a viable alternative to local cloud computing, with significant benefits in QoS, latency, and geographical spread
[9]. It is often regarded as a virtualized system
[1] that renders services such as networking, storage, and, most critically, computation between the client and the data center, with all the associated risks.
Edge computing relies on decentralized, self-operating nodes so that data are processed locally rather than sent to a central server. Fog computing nodes, by contrast, continuously weigh their available resources when deciding whether to analyze data from various sources locally or upload them to the cloud. Cloud services such as infrastructure as a service (IaaS), software as a service (SaaS), and platform as a service (PaaS) may be extended with fog computing in a way that is not possible with a pure edge architecture. Since communication assets and processing power are built out at the network's periphery, or "fog", fog computing can assist with this activity [10].
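As a rough illustration of the resource-weighing behavior described above, the following Python sketch shows one way a fog node might decide between local processing and offloading to the cloud. The thresholds, field names, and the `should_offload` function are assumptions made for this example only, not part of any standard or of the cited works.

```python
from dataclasses import dataclass

@dataclass
class NodeState:
    cpu_load: float         # fraction of CPU currently in use (0.0-1.0)
    free_memory_mb: int     # remaining memory on the fog node
    link_latency_ms: float  # measured round-trip time to the cloud

def should_offload(task_size_mb: int, deadline_ms: float, node: NodeState) -> bool:
    """Return True to upload the task to the cloud, False to process it locally.

    The fog node weighs its own resources against the task's latency requirement:
    tight deadlines or light local load favor local processing, while heavy load
    and relaxed deadlines favor offloading. Thresholds are illustrative only.
    """
    can_fit_locally = node.free_memory_mb >= task_size_mb and node.cpu_load < 0.8
    cloud_round_trip_ok = 2 * node.link_latency_ms < deadline_ms
    if can_fit_locally and not cloud_round_trip_ok:
        return False   # deadline too tight for a cloud round trip: process in the fog
    if not can_fit_locally and cloud_round_trip_ok:
        return True    # node is saturated and the deadline allows a round trip
    # Otherwise prefer the fog to keep traffic off the backhaul.
    return False

# Example: a 50 MB analytics task with a 40 ms deadline on a lightly loaded node.
print(should_offload(50, 40.0, NodeState(cpu_load=0.3, free_memory_mb=512, link_latency_ms=35.0)))
```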
In 2012
[11], a new paradigm termed fog computing (the fog, in short) was developed to address these issues. Bonomi et al.
[9] define the fog as a highly virtualized platform that bridges the gap between cloud data centers and end devices by providing computing, storage, and networking services between them. Data, computing, storage, and application services may all be found in both the cloud and the fog
[12]. Decentralization, local processing of vast volumes of data, software deployment on heterogeneous hardware
[13], closeness to end users, dense geographical dispersion, and mobility support are ways in which the fog differs from the cloud. We illustrate the connection between the two, and the implications of delay, using a traffic light system. Without fog, the distance between the monitoring probe and the cloud server may be as high as three or four hops, so the system is challenged by network latency and real-time decisions cannot be made instantly. With the fog, the traffic lights become actuators and the monitoring probe becomes a sensor.
Conventional compressed video can still be sent from the fog node to the cloud, even if it arrives with some lag. A flashing ambulance light, however, triggers an immediate decision at the fog node to switch the appropriate traffic signals, allowing the ambulance to pass through without delay. The fog is thus a valuable complement to the cloud, not a substitute for it.
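To make the traffic-light scenario concrete, here is a minimal Python sketch of how a fog node might dispatch sensor events: latency-critical events are actuated locally, while everything else is forwarded to the cloud. The event fields, deadline, and the `actuate_lights`/`forward_to_cloud` callbacks are hypothetical placeholders, not part of any cited system.

```python
import time

# Hypothetical latency budget (in seconds) for safety-critical actuation at the fog node.
EMERGENCY_DEADLINE_S = 0.05

def handle_event(event, actuate_lights, forward_to_cloud):
    """Decide locally for time-critical events; defer everything else to the cloud."""
    if event["type"] == "emergency_vehicle":
        # Latency-critical path: the fog node switches the lights itself,
        # avoiding a multi-hop round trip to the cloud server.
        actuate_lights(green_lane=event["lane"])
        if time.time() - event["timestamp"] > EMERGENCY_DEADLINE_S:
            print("warning: local actuation deadline missed")
    else:
        # Non-critical data (e.g., compressed video) tolerates some lag and is
        # forwarded to the cloud for storage and offline analytics.
        forward_to_cloud(event)

# Example usage with stub callbacks:
handle_event(
    {"type": "emergency_vehicle", "lane": 2, "timestamp": time.time()},
    actuate_lights=lambda green_lane: print(f"lane {green_lane} set to green"),
    forward_to_cloud=lambda e: print("forwarded to cloud:", e["type"]),
)
```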
The fog is the subject of intensive study by organizations including ARM, Cisco, Dell, Intel, Microsoft Corp., and Princeton University's Edge Laboratory, through efforts such as Cloudlet and Intel's Intelligent Edge. The OpenFog Consortium (founded in 2015) is making strides toward an open fog architecture that will allow for greater interoperability and scalability
[14]. Cisco, Huawei, and Ericsson are just a few of the companies that provide the necessary networking hardware, including switches and gateways. The immense potential of the fog is evident in current research developments.
The fog has such capabilities as proximity awareness, low latency, and edge location
[15]. It is well suited to situations in which many heterogeneous, ubiquitous, and distributed devices must coordinate their communication, share resources, and carry out data storage and processing. Users can access the fog from any internet-connected device at any time. Smart cities
[16] and health care
[17] are examples of fields where fog may be used. Moreover, it can provide higher QoS regarding reaction time and power usage
[18].
For latency-aware processing of IoT data, the fog uses network devices (called fog nodes in this research)
[19]. Fog nodes are the components of a fog system stationed at the network's periphery; they include fog servers, gateways, routers, switches, access points, base stations, and other devices. Thanks to the fog, the allocation of computing, networking, and storage resources can be managed consistently and streamlined
[20]. In the Internet of Things (IoT), fog nodes are typically the first group of processors that data encounter, and these nodes can build a complete hardware root of trust. They may act as a trusted foundation for all the apps and processes that operate on them and, eventually, the cloud
[21]. Without a hardware root of trust, the fog's software infrastructures (e.g., iFogSim) are vulnerable to a range of attack scenarios that give hackers a foothold. The security features a fog deployment must offer are therefore dictated by the needs of life-safety-critical systems.
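As a loose illustration of how a root of trust can anchor the layers above it, the following hedged Python sketch checks a node's reported software measurement against an expected value before work is dispatched to it. The measurement scheme, key handling, and function names are simplified assumptions for illustration, not a description of any specific hardware mechanism or of iFogSim.

```python
import hashlib
import hmac

# Expected (golden) measurement of the fog node's software stack, e.g., a hash of
# its firmware and application images recorded at provisioning time (assumed value).
EXPECTED_MEASUREMENT = hashlib.sha256(b"fog-node-firmware-v1.2").hexdigest()

# Shared attestation key provisioned into the node's hardware root of trust (assumed).
ATTESTATION_KEY = b"per-node-secret-key"

def verify_node(reported_measurement: str, signature: str) -> bool:
    """Accept a node only if its measurement matches the golden value and the
    report is authenticated with the key held by the hardware root of trust."""
    expected_sig = hmac.new(ATTESTATION_KEY, reported_measurement.encode(), hashlib.sha256).hexdigest()
    authentic = hmac.compare_digest(expected_sig, signature)
    untampered = hmac.compare_digest(EXPECTED_MEASUREMENT, reported_measurement)
    return authentic and untampered

# Example: a node signs its own measurement with the provisioned key.
report = EXPECTED_MEASUREMENT
sig = hmac.new(ATTESTATION_KEY, report.encode(), hashlib.sha256).hexdigest()
print(verify_node(report, sig))  # True: safe to schedule workloads on this node
```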
2. Existing State of the Art in Trusted Fog Computing
The fog computing trust management approach (COMITMENT) aims to provide a system that leverages quality-of-service and quality-of-protection measures from prior direct and indirect fog network interactions to assess the level of trust in fog computing nodes. Using the COMITMENT approach, it was possible to detect and reduce about 66% of harmful attacks and malicious interactions between fog nodes while reducing service response time by about 15%.
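As a rough sketch of the kind of direct/indirect evidence aggregation such approaches rely on, the snippet below combines a node's own QoS/QoP observations with recommendations from neighboring nodes. The weighting scheme, parameter values, and function name are illustrative assumptions, not the COMITMENT algorithm itself.

```python
def trust_score(direct_obs, recommendations, alpha=0.7):
    """Combine direct and indirect evidence into a single trust value in [0, 1].

    direct_obs:      list of this node's own ratings of past interactions with
                     the target node (QoS/QoP outcomes mapped to [0, 1]).
    recommendations: list of (recommender_trust, rating) pairs reported by
                     neighbors about the same target node.
    alpha:           weight given to first-hand evidence (assumed value).
    """
    direct = sum(direct_obs) / len(direct_obs) if direct_obs else 0.5  # neutral prior
    if recommendations:
        # Weight each neighbor's rating by how much we trust the recommender,
        # limiting the influence of bad-mouthing or ballot-stuffing nodes.
        weight_sum = sum(w for w, _ in recommendations)
        indirect = sum(w * r for w, r in recommendations) / weight_sum if weight_sum else 0.5
    else:
        indirect = 0.5
    return alpha * direct + (1 - alpha) * indirect

# Example: good first-hand history, mixed recommendations from two neighbors.
print(trust_score([0.9, 0.8, 1.0], [(0.9, 0.85), (0.4, 0.2)]))
```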
In [6], the authors proposed a secure handoff and routing scheme that protects nodes from attacks and classifies each fog node according to its behavior. The scheme also provides a trust management mechanism between the IoT and fog layers, and the article describes a new comprehensive trust management system (GDTMS) that is currently under development.
In [22], the authors suggest a two-way, subjective-logic-based trust management system that lets a resource requester verify whether a provider can be trusted to carry out a job correctly, while also allowing the service provider to verify the legitimacy of the requester. The scheme can withstand a substantial population of rogue nodes and successfully resists trust-based attacks.
The authors' research in [23] identified a comprehensive set of criteria for secure, trustworthy node selection in a fog-based computing environment. A multi-criteria decision-making technique combining fuzzy logic with the best-worst method is then used to evaluate how much each metric contributes to the trust level, taking the ambiguity of the metrics into account. With a weight of 0.470, the results indicate that quality of service has the largest effect on trustworthy selection.
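To show how such metric weights might be applied, here is a brief, hypothetical Python example of a weighted trust score. Only the QoS weight of 0.470 comes from the study above; the remaining metrics and weights are placeholders chosen for illustration.

```python
# Hypothetical metric weights; only the QoS weight (0.470) is taken from the study,
# the others are placeholders chosen so that the weights sum to 1.0.
WEIGHTS = {"qos": 0.470, "security": 0.330, "reputation": 0.200}

def weighted_trust(metrics: dict) -> float:
    """Aggregate normalized metric scores (each in [0, 1]) into one trust value."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

# Example: a node with strong QoS but a thin interaction history.
print(weighted_trust({"qos": 0.95, "security": 0.80, "reputation": 0.55}))
```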
The study in [24] presents a trust management system based on fuzzy reputation that is limited to QoS trust measures; trust is calculated from data gathered both directly and indirectly. Its primary limitation is that it does not consider the social ties between internet-connected devices. Bao et al.
[25] focused on the social connections between IoT devices when defining trust management systems for IoT applications. To determine a node's level of trust, they use a variety of trust indicators, including honesty, cooperativeness, community of interest (COI), and friendship, together with data gleaned from direct observation and the opinions of other nodes. The accuracy and convergence of the resulting trust assessments are central to their evaluation.
The work in [26] focuses on handling misbehaving nodes whose behavior may evolve over time. The authors present a trust management system that can be extended, modified, and maintained; in their method, scalability is ensured by persistently storing trust information only for the subset of nodes actually observed. The authors of
[27] suggest a context-aware trust management system for the social Internet of Things (SIoT), in which context-aware QoS determines which of three trust contexts a device operates in, so that trustworthy and untrustworthy devices can be distinguished successfully.
In [28], the authors present a trust assessment approach based on behavior graphs and service groupings that considers identity and other relationship features, as well as the evolution of interactions and service quality indicators such as availability and dependability. In addition to these traditional measures of reliability, trust in the cloud may also be computed from measures of social interaction, such as the degree to which participants are honest and sincere.
In the health-care industry, the increasing reliance on technology and the proliferation of connected devices have led to the generation of vast amounts of sensitive data, including patient health records, diagnostic images, real-time monitoring data, and other critical information. Ensuring the security, privacy, and integrity of these data is paramount to protecting patient confidentiality, maintaining trust in health-care systems, and enabling accurate decision-making. However, fog computing environments in health care face unique challenges in achieving trustworthy decision-making. Fog computing, which extends cloud computing capabilities to the edge of the network, brings computation, storage, and networking resources closer to the data sources. While it offers benefits such as reduced latency, improved scalability, and enhanced data privacy, it introduces additional complexity in managing trust and security. The existing literature on trust management in fog computing focuses primarily on general applications and lacks a specific focus on the health-care domain. A trust management framework tailored to the health-care context must therefore be developed to address the particular challenges and requirements of health-care systems. Such a framework should build on the latest research advances and provide a comprehensive approach to ensuring trustworthy decision-making in health-care environments. In summary, the existing literature on trusted fog computing has addressed various aspects of trust management and network optimization, but gaps remain to be addressed.