Future Internet, Volume 15, Issue 3 (March 2023) – 32 articles

Cover Story (view full-size image): The Internet of Things (IoT) has been shown to be valuable for business process management (BPM), for example, to better track and control process executions. While IoT actuators can automatically trigger actions, IoT sensors can monitor changes in environments and humans involved in processes. These sensors produce large amounts of data, which holds the key to understanding the quality of executed processes. To achieve this level of understanding, data from the process engine and IoT sensor data must have a uniform representation. The DataStream XES extension presented in this work enables the connection of IoT data to process events, preserving the full contexts required for data analysis. It is evaluated by creating two datasets from real-world scenarios, i.e., the logistics and manufacturing domains. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
18 pages, 1487 KiB  
Article
Artificial-Intelligence-Based Charger Deployment in Wireless Rechargeable Sensor Networks
by Hsin-Hung Cho, Wei-Che Chien, Fan-Hsun Tseng and Han-Chieh Chao
Future Internet 2023, 15(3), 117; https://doi.org/10.3390/fi15030117 - 22 Mar 2023
Cited by 2 | Viewed by 2256
Abstract
To extend a network’s lifetime, wireless rechargeable sensor networks are promising solutions. Chargers can be deployed to replenish energy for the sensors. However, deployment cost will increase when the number of chargers increases. Many metrics may affect the final policy for charger deployment, such as distance, the power requirement of the sensors and transmission radius, which makes the charger deployment problem very complex and difficult to solve. In this paper, we propose an efficient method for determining the field of interest (FoI) in which to find suitable candidate positions of chargers with lower computational costs. In addition, we designed four metaheuristic algorithms to address the local optima problem. Since we know that metaheuristic algorithms always require more computational costs for escaping local optima, we designed a new framework to reduce the searching space effectively. The simulation results show that the proposed method can achieve the best price–performance ratio. Full article
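As a rough, illustrative sketch of the metaheuristic idea behind charger placement (not the paper's four algorithms or its field-of-interest reduction), the following snippet uses simulated annealing to place a fixed budget of chargers so that as many sensors as possible fall within an assumed charging radius; the sensor layout, radius, budget, and cooling schedule are all invented for illustration.

```python
import random, math

random.seed(0)
SENSORS = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(60)]
RADIUS = 20.0          # assumed charging radius
NUM_CHARGERS = 6       # assumed charger budget

def coverage(chargers):
    """Number of sensors within RADIUS of at least one charger."""
    return sum(any(math.dist(s, c) <= RADIUS for c in chargers) for s in SENSORS)

def neighbour(chargers):
    """Perturb one randomly chosen charger position."""
    new = list(chargers)
    i = random.randrange(len(new))
    x, y = new[i]
    new[i] = (min(100, max(0, x + random.gauss(0, 5))),
              min(100, max(0, y + random.gauss(0, 5))))
    return new

# Simulated annealing: occasionally accept worse solutions so the search can
# escape local optima (the general difficulty the paper targets).
state = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(NUM_CHARGERS)]
best_cov = coverage(state)
T = 10.0
for step in range(5000):
    cand = neighbour(state)
    delta = coverage(cand) - coverage(state)
    if delta >= 0 or random.random() < math.exp(delta / T):
        state = cand
        best_cov = max(best_cov, coverage(state))
    T = max(0.1, T * 0.999)

print(f"covered {best_cov}/{len(SENSORS)} sensors with {NUM_CHARGERS} chargers")
```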

20 pages, 1673 KiB  
Review
Indoor Occupancy Sensing via Networked Nodes (2012–2022): A Review
by Muhammad Emad-Ud-Din and Ya Wang
Future Internet 2023, 15(3), 116; https://doi.org/10.3390/fi15030116 - 22 Mar 2023
Cited by 3 | Viewed by 2133
Abstract
In the past decade, different sensing mechanisms and algorithms have been developed to detect or estimate indoor occupancy. One of the most recent advancements is using networked sensor nodes to create a more comprehensive occupancy detection system where multiple sensors can identify human presence within more expansive areas while delivering enhanced accuracy compared to a system that relies on stand-alone sensor nodes. The present work reviews the studies from 2012 to 2022 that use networked sensor nodes to detect indoor occupancy, focusing on PIR-based sensors. Methods are compared based on pivotal ADPs that play a significant role in selecting an occupancy detection system for applications such as Health and Safety or occupant comfort. These parameters include accuracy, information requirement, maximum sensor failure and minimum observation rate, and feasible detection area. We briefly describe the overview of occupancy detection criteria used by each study and introduce a metric called “sensor node deployment density” through our analysis. This metric captures the strength of network-level data filtering and fusion algorithms found in the literature. It is hinged on the fact that a robust occupancy estimation algorithm requires a minimal number of nodes to estimate occupancy. This review only focuses on the occupancy estimation models for networked sensor nodes. It thus provides a standardized insight into networked nodes’ occupancy sensing pipelines, which employ data fusion strategies, network-level machine learning algorithms, and occupancy estimation algorithms. This review thus helps determine the suitability of the reviewed methods to a standard set of application areas by analyzing their gaps. Full article
(This article belongs to the Special Issue Artificial Intelligence for Smart Cities)

26 pages, 1288 KiB  
Article
A Petri Net Model for Cognitive Radio Internet of Things Networks Exploiting GSM Bands
by Salvatore Serrano and Marco Scarpa
Future Internet 2023, 15(3), 115; https://doi.org/10.3390/fi15030115 - 21 Mar 2023
Cited by 2 | Viewed by 1843
Abstract
Quality of service (QoS) is a crucial requirement in distributed applications. Internet of Things architectures have become a widely used approach in many application domains, from Industry 4.0 to smart agriculture; thus, it is crucial to develop appropriate methodologies for managing QoS in such contexts. In an overcrowded spectrum scenario, cognitive radio technology could be an effective methodology for improving QoS requirements. In order to evaluate QoS in the context of a cognitive radio Internet of Things network, we propose a Petri net-based model that evaluates the cognitive radio environment and operates in a 200 kHz GSM/EDGE transponder band. The model is quite flexible as it considers several circuit and packet switching primary user network loads and configurations and several secondary user types of services (that involve semantic transparency or time transparency); furthermore, it is able to take into account mistakes of the spectrum sensing algorithm used by secondary users. Specifically, we derive the distribution of the response time perceived by the secondary users, where it is then possible to obtain an estimation of both the maximum throughput and jitter. The proposed cognitive radio scenario considers a secondary user synchronized access to the channel when using the GSM/EDGE frame structure. Full article
(This article belongs to the Special Issue Future Communication Networks for the Internet of Things (IoT))

17 pages, 7331 KiB  
Article
Research on Spaceborne Target Detection Based on Yolov5 and Image Compression
by Qi Shi, Daheng Wang, Wen Chen, Jinpei Yu, Weiting Zhou, Jun Zou and Guangzu Liu
Future Internet 2023, 15(3), 114; https://doi.org/10.3390/fi15030114 - 19 Mar 2023
Viewed by 1732
Abstract
Satellite image compression technology plays an important role in the development of space science. As optical sensors on satellites become more sophisticated, high-resolution and high-fidelity satellite images will occupy more storage. This raises the required transmission bandwidth and transmission rate in the satellite–ground data transmission system. In order to reduce the pressure from image transmission on the data transmission system, a spaceborne target detection system based on Yolov5 and a satellite image compression transmission system is proposed in this paper. It can reduce the pressure on the data transmission system by detecting the object of interest and deciding whether to transmit. An improved Yolov5 network is proposed to detect the small target on the high-resolution satellite image. Simulation results show that the improved Yolov5 network proposed in this paper can detect specific targets in real satellite images, including aircraft, ships, etc. At the same time, image compression has little effect on target detection, so detection complexity can be effectively reduced and detection speed can be improved by detecting the compressed images. Full article
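To illustrate the compress-then-detect idea discussed above (not the paper's improved network or its spaceborne pipeline), the sketch below JPEG-compresses an image at an assumed quality factor and runs a stock pretrained YOLOv5s model from the Ultralytics hub on both versions; the file name and quality setting are placeholders.

```python
import cv2
import torch

# Stock pretrained YOLOv5s from the Ultralytics hub (the paper's improved
# network for small spaceborne targets is not reproduced here).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

img = cv2.imread("satellite_scene.jpg")          # hypothetical input image
# JPEG-compress at an assumed quality factor, then decode again, mimicking
# detection performed on the compressed rather than the original image.
ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 30])
compressed = cv2.imdecode(buf, cv2.IMREAD_COLOR)

for name, frame in [("original", img), ("compressed", compressed)]:
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    det = results.pandas().xyxy[0]
    print(name, "detections:", len(det))
    print(det[["name", "confidence"]].head())
```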

22 pages, 1110 KiB  
Article
CvAMoS—Event Abstraction Using Contextual Information
by Gemma Di Federico and Andrea Burattin
Future Internet 2023, 15(3), 113; https://doi.org/10.3390/fi15030113 - 18 Mar 2023
Cited by 5 | Viewed by 1654
Abstract
Process mining analyzes events that are logged during the execution of a process, with the aim of gathering useful information and knowledge. Process discovery algorithms derive process models that represent these processes. The level of abstraction at which the process model is represented is reflected in the granularity of the event log. When a process is captured by the usage of sensor systems, process activities are recorded at the sensor-level in the form of sensor readings, and are therefore too fine-grained and non-explanatory. To increase the understandability of the process model, events need to be abstracted into higher-level activities that provide a more meaningful representation of the process. The abstraction becomes more relevant and challenging when the process involves human behavior, as the flexible nature of human actions can make it harder to identify and abstract meaningful activities. This paper proposes CvAMoS, a trace-based approach for event abstraction, which focuses on identifying motifs while taking context into account. A motif is a recurring sequence of events that represents an activity that took place under specific circumstances depicted by the context. Context information is logged in the event log in the form of environmental sensor readings (e.g., the temperature and light sensors). The presented algorithm uses a distance function to deal with the variability in the execution of activities. The result is a set of meaningful and interpretable motifs. The algorithm has been tested on both synthetic and real datasets, and compared to the state of the art. CvAMoS is implemented as a Java application and the code is freely available. Full article
(This article belongs to the Special Issue IoT-Based BPM for Smart Environments)

13 pages, 2073 KiB  
Article
A Descriptive Study of Webpage Designs for Posting Privacy Policies for Different-Sized US Hospitals to Create an Assessment Framework
by Karen Schnell, Kaushik Roy and Madhuri Siddula
Future Internet 2023, 15(3), 112; https://doi.org/10.3390/fi15030112 - 17 Mar 2023
Viewed by 1553
Abstract
In the United States, there are laws and standards guiding how people should be informed about the use of their private data. However, the challenge of communicating these guidelines to the naïve user is still at its peak. Research has shown that the willingness to read privacy statements is influenced by attitudes toward privacy risks and privacy benefits. Many websites publish privacy policies somewhere on their web pages, and it can be difficult to navigate to them. In the healthcare field, research has found that health information websites’ key information is presented poorly and inconsistently. For the policies to be legally binding, a person must be able to find them. In the healthcare industry, where sensitive data are being collected, research on how a user navigates to privacy policies on different-sized hospitals’ websites is limited. Studies exist about privacy policies or website design, but not both. This descriptive study involved ascertaining commonalities and differences among different-sized hospitals’ website designs for supporting privacy policies. A foundation framework was created using Web Content Accessibility Guidelines (WCAG) principles and the literature review findings for evaluating practices for publishing privacy policies on websites. The results demonstrated very low variance in the website design concepts employed by hospitals to publish their privacy policies. Full article
(This article belongs to the Section Techno-Social Smart Systems)

15 pages, 1621 KiB  
Article
Co-Design, Development, and Evaluation of a Health Monitoring Tool Using Smartwatch Data: A Proof-of-Concept Study
by Ruhi Kiran Bajaj, Rebecca Mary Meiring and Fernando Beltran
Future Internet 2023, 15(3), 111; https://doi.org/10.3390/fi15030111 - 17 Mar 2023
Cited by 5 | Viewed by 4295
Abstract
Computational analysis and integration of smartwatch data with Electronic Medical Records (EMR) present potential uses in preventing, diagnosing, and managing chronic diseases. One of the key requirements for the successful clinical application of smartwatch data is understanding healthcare professional (HCP) perspectives on whether these devices can play a role in preventive care. Gaining insights from the vast amount of smartwatch data is a challenge for HCPs; thus, tools are needed to support HCPs when integrating personalized health monitoring devices with EMR. This study aimed to develop and evaluate an application prototype, co-designed with HCPs and employing design science research methodology and diffusion of innovation frameworks to identify the potential for clinical integration. A machine learning algorithm was developed to detect possible health anomalies in smartwatch data, and these were presented visually to HCPs in a web-based platform. HCPs completed a usability questionnaire to evaluate the prototype, and over 60% of HCPs scored positively on usability. This preliminary study tested the proposed approach to solving the practical challenges HCPs face in interpreting smartwatch data before smartwatches are fully integrated into the EMR. The findings provide design directions for future applications that use smartwatch data to improve clinical decision-making and reduce HCP workloads. Full article
(This article belongs to the Special Issue Challenges and Opportunities in Electronic Medical Record (EMR))

26 pages, 709 KiB  
Article
Creation, Analysis and Evaluation of AnnoMI, a Dataset of Expert-Annotated Counselling Dialogues
by Zixiu Wu, Simone Balloccu, Vivek Kumar, Rim Helaoui, Diego Reforgiato Recupero and Daniele Riboni
Future Internet 2023, 15(3), 110; https://doi.org/10.3390/fi15030110 - 14 Mar 2023
Cited by 13 | Viewed by 3327
Abstract
Research on the analysis of counselling conversations through natural language processing methods has seen remarkable growth in recent years. However, the potential of this field is still greatly limited by the lack of access to publicly available therapy dialogues, especially those with expert annotations, but it has been alleviated thanks to the recent release of AnnoMI, the first publicly and freely available conversation dataset of 133 faithfully transcribed and expert-annotated demonstrations of high- and low-quality motivational interviewing (MI)—an effective therapy strategy that evokes client motivation for positive change. In this work, we introduce new expert-annotated utterance attributes to AnnoMI and describe the entire data collection process in more detail, including dialogue source selection, transcription, annotation, and post-processing. Based on the expert annotations on key MI aspects, we carry out thorough analyses of AnnoMI with respect to counselling-related properties on the utterance, conversation, and corpus levels. Furthermore, we introduce utterance-level prediction tasks with potential real-world impacts and build baseline models. Finally, we examine the performance of the models on dialogues of different topics and probe the generalisability of the models to unseen topics. Full article
(This article belongs to the Special Issue Deep Learning and Natural Language Processing II)

21 pages, 816 KiB  
Article
DataStream XES Extension: Embedding IoT Sensor Data into Extensible Event Stream Logs
by Juergen Mangler, Joscha Grüger, Lukas Malburg, Matthias Ehrendorfer, Yannis Bertrand, Janik-Vasily Benzin, Stefanie Rinderle-Ma, Estefania Serral Asensio and Ralph Bergmann
Future Internet 2023, 15(3), 109; https://doi.org/10.3390/fi15030109 - 14 Mar 2023
Cited by 15 | Viewed by 3300
Abstract
The Internet of Things (IoT) has been shown to be very valuable for Business Process Management (BPM), for example, to better track and control process executions. While IoT actuators can automatically trigger actions, IoT sensors can monitor the changes in the environment and the humans involved in the processes. These sensors produce large amounts of discrete and continuous data streams, which hold the key to understanding the quality of the executed processes. However, to enable this understanding, it is needed to have a joint representation of the data generated by the process engine executing the process, and the data generated by the IoT sensors. In this paper, we present an extension of the event log standard format XES called DataStream. DataStream enables the connection of IoT data to process events, preserving the full context required for data analysis, even when scenarios or hardware artifacts are rapidly changing. The DataStream extension is designed based on a set of goals and evaluated by creating two datasets for real-world scenarios from the transportation/logistics and manufacturing domains. Full article
(This article belongs to the Special Issue IoT-Based BPM for Smart Environments)

22 pages, 1877 KiB  
Article
A Novel Hybrid Edge Detection and LBP Code-Based Robust Image Steganography Method
by Habiba Sultana, A. H. M. Kamal, Gahangir Hossain and Muhammad Ashad Kabir
Future Internet 2023, 15(3), 108; https://doi.org/10.3390/fi15030108 - 10 Mar 2023
Cited by 11 | Viewed by 4248
Abstract
In digital image processing and steganography, images are often described using edges and local binary pattern (LBP) codes. By combining these two properties, a novel hybrid image steganography method for secret embedding is proposed in this paper. This method employs only edge pixels, which influence how well the approach embeds data. To increase the number of detected edge pixels, several edge detectors are applied and their outputs are hybridized using a logical OR operation; a morphological dilation procedure is then applied to the hybridized edge image. The least significant bits (LSBs) and LBP codes are calculated for the edge pixels. Afterward, these LBP codes, LSBs, and the secret bits are merged using an exclusive-OR operation, and the resulting bits are written back into the edge pixels’ LSBs. The experimental results show that the suggested approach outperforms current strategies in terms of perceptual transparency measures, such as the peak signal-to-noise ratio (PSNR) and structural similarity index (SSI). The embedding capacity per tampered pixel in the proposed approach is also substantial, and its embedding rules protect the privacy of the embedded data. The entropy, correlation coefficient, cosine similarity, and pixel difference histogram data show that our proposed method is more resistant to various types of cyber-attacks. Full article
(This article belongs to the Collection Information Systems Security)
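The hybrid edge-map construction described in the abstract (several detectors combined with a logical OR, followed by dilation) can be sketched as below; the detector choices, thresholds, and the simplified LBP/XOR embedding rule are assumptions for illustration and do not reproduce the paper's exact scheme.

```python
import cv2
import numpy as np

def lbp_code(gray, y, x):
    """8-neighbour local binary pattern code of pixel (y, x)."""
    c = gray[y, x]
    nbrs = [gray[y - 1, x - 1], gray[y - 1, x], gray[y - 1, x + 1],
            gray[y, x + 1], gray[y + 1, x + 1], gray[y + 1, x],
            gray[y + 1, x - 1], gray[y, x - 1]]
    return sum((1 << i) for i, v in enumerate(nbrs) if v >= c)

cover = cv2.imread("cover.png", cv2.IMREAD_GRAYSCALE)   # hypothetical cover image

# Hybridize two edge detectors with a logical OR, then dilate the edge map
# (detector choices and thresholds are assumptions).
canny = cv2.Canny(cover, 100, 200) > 0
sobel = np.abs(cv2.Sobel(cover, cv2.CV_64F, 1, 0)) + np.abs(cv2.Sobel(cover, cv2.CV_64F, 0, 1))
hybrid = canny | (sobel > sobel.mean() * 2)
dilated = cv2.dilate(hybrid.astype(np.uint8), np.ones((3, 3), np.uint8)) > 0

# Toy embedding: XOR each secret bit with the LSB of the pixel's LBP code and
# write the result into the pixel's LSB (a simplification of the paper's rule).
secret_bits = [int(b) for b in "0100100001101001"]       # "Hi" as ASCII bits
stego = cover.copy()
ys, xs = np.nonzero(dilated[1:-1, 1:-1])
for bit, (y, x) in zip(secret_bits, zip(ys + 1, xs + 1)):
    mixed = bit ^ (lbp_code(cover, y, x) & 1)
    stego[y, x] = (stego[y, x] & 0xFE) | mixed

print("embedded", len(secret_bits), "bits into", int(dilated.sum()), "candidate edge pixels")
```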

25 pages, 1933 KiB  
Article
Dealing with Deadlocks in Industrial Multi Agent Systems
by František Čapkovič
Future Internet 2023, 15(3), 107; https://doi.org/10.3390/fi15030107 - 9 Mar 2023
Cited by 4 | Viewed by 5441
Abstract
Automated Manufacturing Systems (AMS) consisting of many cooperating devices incorporated into multiple cooperating production lines, sharing common resources, represent industrial Multi-Agent Systems (MAS). Deadlocks may occur during the operation of such MAS. It is necessary to deal with deadlocks (more precisely, to prevent them) to ensure the correct behavior of AMS. For this purpose, among other methods, methods based on Petri nets (PN) are used. Because AMS are very often described by PN models, two PN-based methods are presented here, based on (i) PN place invariants (P-invariants) and (ii) PN siphons and traps. The intended final result of using these methods is finding a supervisor that allows deadlock-free activity of the global MAS. While the former method yields results in analytical terms, the latter requires the computation of siphons and traps. Full article
(This article belongs to the Special Issue Modern Trends in Multi-Agent Systems)

20 pages, 7317 KiB  
Article
Complex Queries for Querying Linked Data
by Hasna Boumechaal and Zizette Boufaida
Future Internet 2023, 15(3), 106; https://doi.org/10.3390/fi15030106 - 9 Mar 2023
Cited by 1 | Viewed by 1464
Abstract
Querying Linked Data is one of the most important issues for the semantic web community today because it requires the user to understand the structure and vocabularies used in various data sources. Furthermore, users must be familiar with the syntax of query languages, such as SPARQL. However, because users are accustomed to natural language-based search, novice users may find it challenging to use these features. As a result, new approaches for querying Linked Data sources on the web with NL queries must be defined. In this paper, we propose a novel system for converting natural language queries into SPARQL queries to query linked and heterogeneous semantic data on the web. While most existing methods have focused on simple queries and have ignored complex queries, the method described in this work aims to handle various types of NL queries, particularly complex queries containing negation, numbers, superlatives, and comparative adjectives. Three complementary strategies are used in this context: (1) identifying the semantic relations between query terms in order to understand the user’s needs; (2) mapping the NL terms to semantic entities; and (3) constructing the query’s valid triples based on the different links used to describe the identified entities in order to generate correct SPARQL queries. The empirical evaluations show that the proposed system is effective. Full article
(This article belongs to the Special Issue Information Retrieval on the Semantic Web)
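As a hedged illustration of the paper's target output (not its translation pipeline), the snippet below shows the kind of SPARQL a complex natural-language query with a superlative could map to, executed against the public DBpedia endpoint via SPARQLWrapper; the chosen classes and properties are example assumptions.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# NL query with a superlative: "Which is the highest mountain in Italy?"
# A hand-written SPARQL translation of the kind such a system aims to generate
# automatically; the DBpedia classes/properties are example choices.
query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?mountain ?elev WHERE {
  ?mountain a dbo:Mountain ;
            dbo:locatedInArea dbr:Italy ;
            dbo:elevation ?elev .
}
ORDER BY DESC(?elev)
LIMIT 1
"""

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["mountain"]["value"], row["elev"]["value"])
```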

15 pages, 2592 KiB  
Article
Beamforming Based on a SSS Angle Estimation Algorithm for 5G NR Networks
by Daniel Andrade, Roberto Magueta, Adão Silva and Paulo Marques
Future Internet 2023, 15(3), 105; https://doi.org/10.3390/fi15030105 - 9 Mar 2023
Cited by 1 | Viewed by 1984
Abstract
The current 5G-NR standard includes the transmission of multiple synchronization signal blocks (SSBs) in different directions to be exploited in beamforming techniques. However, choosing a pair of these beams leads to performance degradation, mainly in cases where the transmit and receive beams are not aligned, because only a few fixed directions with wide beams are established. Therefore, in this article, we design a new 3GPP-standard-compliant beam pair selection algorithm based on secondary synchronization signal (SSS) angle estimation (BSAE) that makes use of multiple SSBs to maximize the reference signal received power (RSRP) value at the receiver. This optimization is performed using the SSSs present in each SSB to perform channel estimation in the digital domain. Afterwards, the combination of those estimations is used to estimate the equivalent channel propagation matrix without the analog processing effects. Finally, through the estimated channel propagation matrix, the angle that maximizes the RSRP is determined to compute the most suitable beam. The proposed algorithm was evaluated and compared with a conventional beam pair selection algorithm and shows better performance. Furthermore, the proposed algorithm achieved performance close to the optimum, where all channel state information (CSI) is available, emphasizing the interest of the proposed approach for practical 5G mmWave mMIMO implementations. Full article
(This article belongs to the Special Issue 5G Wireless Communication Networks)

14 pages, 1566 KiB  
Article
Optimizing Task Execution: The Impact of Dynamic Time Quantum and Priorities on Round Robin Scheduling
by Mansoor Iqbal, Zahid Ullah, Izaz Ahmad Khan, Sheraz Aslam, Haris Shaheer, Mujtaba Humayon, Muhammad Asjad Salahuddin and Adeel Mehmood
Future Internet 2023, 15(3), 104; https://doi.org/10.3390/fi15030104 - 8 Mar 2023
Cited by 2 | Viewed by 4706
Abstract
Task scheduling algorithms are crucial for optimizing the utilization of computing resources. This work proposes a unique approach for improving task execution in real-time systems using an enhanced Round Robin scheduling algorithm variant incorporating dynamic time quantum and priority. The proposed algorithm adjusts the time slice allocated to each task based on execution time and priority, resulting in more efficient resource utilization. We also prioritize higher-priority tasks and execute them as soon as they arrive in the ready queue, ensuring the timely completion of critical tasks. We evaluate the performance of our algorithm using a set of real-world tasks and compare it with traditional Round Robin scheduling. The results show that our proposed approach significantly improves task execution time and resource utilization compared to conventional Round Robin scheduling. Our approach offers a promising solution for optimizing task execution in real-time systems. The combination of dynamic time quantum and priorities adds a unique element to the existing literature in this field. Full article
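The abstract does not give the exact quantum-adjustment rule, so the sketch below assumes one plausible variant: the quantum is recomputed each cycle as the mean remaining burst time of the ready queue, and the queue is kept ordered by priority; the task set and priorities are invented for illustration.

```python
from collections import deque

# (name, burst_time, priority) -- lower number = higher priority (assumption);
# all tasks are assumed to arrive at time 0 in this toy example.
tasks = [("T1", 24, 2), ("T2", 3, 1), ("T3", 7, 3), ("T4", 12, 1)]

remaining = {n: b for n, b, _ in tasks}
priority = {n: p for n, _, p in tasks}
finish = {}
ready = deque(sorted(remaining, key=lambda n: priority[n]))   # priority-ordered
clock = 0

while ready:
    # Dynamic quantum: mean remaining burst of tasks currently in the ready
    # queue (one plausible rule; the paper's exact formula may differ).
    quantum = max(1, sum(remaining[n] for n in ready) // len(ready))
    task = ready.popleft()
    run = min(quantum, remaining[task])
    clock += run
    remaining[task] -= run
    if remaining[task] == 0:
        finish[task] = clock
    else:
        ready.append(task)
        # Keep higher-priority tasks near the front after requeueing.
        ready = deque(sorted(ready, key=lambda n: priority[n]))

for name, burst, _ in tasks:
    turnaround = finish[name]
    print(f"{name}: turnaround={turnaround}, waiting={turnaround - burst}")
```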

16 pages, 2777 KiB  
Article
Utilizing Random Forest with iForest-Based Outlier Detection and SMOTE to Detect Movement and Direction of RFID Tags
by Ganjar Alfian, Muhammad Syafrudin, Norma Latif Fitriyani, Sahirul Alam, Dinar Nugroho Pratomo, Lukman Subekti, Muhammad Qois Huzyan Octava, Ninis Dyah Yulianingsih, Fransiskus Tatas Dwi Atmaji and Filip Benes
Future Internet 2023, 15(3), 103; https://doi.org/10.3390/fi15030103 - 8 Mar 2023
Cited by 8 | Viewed by 3115
Abstract
In recent years, radio frequency identification (RFID) technology has been utilized to monitor product movements within a supply chain in real time, allowing products to be tracked automatically. However, RFID alone cannot detect the movement and direction of a tag. This study investigates the performance of machine learning (ML) algorithms in detecting the movement and direction of passive RFID tags. The dataset utilized in this study was created by considering a variety of conceivable tag motions and directions that may occur in actual warehouse settings, such as moving in and out through the gate, moving close to the gate, turning around, and remaining static. The statistical features are derived from the received signal strength (RSS) and the timestamps of the tags. Our proposed model, which combines Isolation Forest (iForest) outlier detection, the Synthetic Minority Oversampling Technique (SMOTE), and Random Forest (RF), achieved the highest accuracy, up to 94.251%, compared to other ML models in detecting the movement and direction of RFID tags. In addition, we demonstrated that the proposed classification model could be applied to a web-based monitoring system, so that tagged products that move in or out through a gate can be correctly identified. This study is expected to improve the ability of RFID gates to detect the status of products (being received or delivered) automatically. Full article
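A minimal sketch of the described iForest + SMOTE + Random Forest pipeline, using synthetic stand-in features in place of the paper's RSS/timestamp dataset; the contamination rate, tree count, and class layout are assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from imblearn.over_sampling import SMOTE

# X: statistical features derived from RSS and timestamps; y: movement/direction
# class labels. Synthetic stand-ins here -- the real dataset is the paper's own.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = rng.integers(0, 4, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) iForest outlier removal on the training split (contamination is an assumption)
mask = IsolationForest(contamination=0.05, random_state=0).fit_predict(X_tr) == 1
X_tr, y_tr = X_tr[mask], y_tr[mask]

# 2) SMOTE to rebalance the movement/direction classes
X_tr, y_tr = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

# 3) Random Forest classifier
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```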

18 pages, 906 KiB  
Article
IoT-Portrait: Automatically Identifying IoT Devices via Transformer with Incremental Learning
by Juan Wang, Jing Zhong and Jiangqi Li
Future Internet 2023, 15(3), 102; https://doi.org/10.3390/fi15030102 - 7 Mar 2023
Cited by 4 | Viewed by 2232
Abstract
With the development of IoT, IoT devices have proliferated. With the increasing demands of network management and security evaluation, automatic identification of IoT devices becomes necessary. However, existing works require a lot of manual effort and face the challenge of catastrophic forgetting. In this paper, we propose IoT-Portrait, an automatic IoT device identification framework based on a transformer network. IoT-Portrait automatically acquires information about IoT devices as labels and learns the traffic behavior characteristics of devices through a transformer neural network. Furthermore, for privacy protection and overhead reasons, it is not easy to save all past samples to retrain the classification model when new devices join the network. Therefore, we use a class incremental learning method to train the new model to preserve old classes’ features while learning new devices’ features. We implement a prototype of IoT-Portrait based on our lab environment and open-source database. Experimental results show that IoT-Portrait achieves a high identification rate of up to 99% and is well resistant to catastrophic forgetting with a negligible added cost both in memory and time. It indicates that IoT-Portrait can classify IoT devices effectively and continuously. Full article
(This article belongs to the Section Internet of Things)

16 pages, 1336 KiB  
Article
Relational Action Bank with Semantic–Visual Attention for Few-Shot Action Recognition
by Haoming Liang, Jinze Du, Hongchen Zhang, Bing Han and Yan Ma
Future Internet 2023, 15(3), 101; https://doi.org/10.3390/fi15030101 - 3 Mar 2023
Viewed by 1656
Abstract
Recently, few-shot learning has attracted significant attention in the field of video action recognition, owing to its data-efficient learning paradigm. Despite the encouraging progress, identifying ways to further improve the few-shot learning performance by exploring additional or auxiliary information for video action recognition remains an ongoing challenge. To address this problem, in this paper we make the first attempt to propose a relational action bank with semantic–visual attention for few-shot action recognition. Specifically, we introduce a relational action bank as the auxiliary library to assist the network in understanding the actions in novel classes. Meanwhile, the semantic–visual attention is devised to adaptively capture the connections to the foregone actions via both semantic correlation and visual similarity. We extensively evaluate our approach via two backbone models (ResNet-50 and C3D) on HMDB and Kinetics datasets, and demonstrate that the proposed model can obtain significantly better performance compared against state-of-the-art methods. Notably, our results demonstrate an average improvement of about 6.2% when compared to the second-best method on the Kinetics dataset. Full article
(This article belongs to the Section Techno-Social Smart Systems)

33 pages, 5150 KiB  
Article
Securing Critical User Information over the Internet of Medical Things Platforms Using a Hybrid Cryptography Scheme
by Oluwakemi Christiana Abikoye, Esau Taiwo Oladipupo, Agbotiname Lucky Imoize, Joseph Bamidele Awotunde, Cheng-Chi Lee and Chun-Ta Li
Future Internet 2023, 15(3), 99; https://doi.org/10.3390/fi15030099 - 28 Feb 2023
Cited by 20 | Viewed by 1992
Abstract
The application of the Internet of Medical Things (IoMT) in medical systems has made it much easier for medical practitioners to deliver healthcare services. However, the security and privacy preservation of critical user data remain the reason the technology has not yet been fully exploited. Undoubtedly, a secure IoMT model that preserves individual users’ privacy will enhance the wide acceptability of IoMT technology. However, existing works that have attempted to solve these privacy and security problems are not space-conservative, are computationally intensive, and are also vulnerable to security attacks. In this paper, an IoMT-based model that conserves the privacy of the data, is less computationally intensive, and is resistant to various cryptanalysis attacks is proposed. Specifically, an efficient privacy-preserving technique based on an efficient algorithm for searching through encrypted data, together with a hybrid cryptography algorithm that combines a modification of the Caesar cipher with the Elliptic Curve Diffie–Hellman (ECDH) and Digital Signature Algorithm (DSA), is proposed to achieve user data security and preserve patient privacy. Furthermore, the modified algorithm can secure messages during transmission, perform key exchanges between clients and healthcare centres, and guarantee user authentication by authorized healthcare centres. The proposed IoMT model, leveraging the hybrid cryptography algorithm, was analysed and compared against different security attacks. The analysis results revealed that the model is secure, preserves the privacy of critical user information, and shows robust resistance against different cryptanalysis attacks. Full article
(This article belongs to the Special Issue Machine Learning for Blockchain and IoT Systems in Smart City)
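A minimal sketch of the ECDH key-agreement step combined with a toy Caesar-style layer keyed from the shared secret; this only illustrates the hybrid idea, not the paper's modified cipher, and the DSA signing/authentication step is omitted.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric import ec

# Each party generates an EC key pair and derives the same shared secret via ECDH.
client_key = ec.generate_private_key(ec.SECP256R1())
centre_key = ec.generate_private_key(ec.SECP256R1())

shared_client = client_key.exchange(ec.ECDH(), centre_key.public_key())
shared_centre = centre_key.exchange(ec.ECDH(), client_key.public_key())
assert shared_client == shared_centre

# Toy "Caesar-style" layer: derive a byte shift from the shared secret and apply
# it to the plaintext. This stands in for the paper's (unpublished here)
# modification and is NOT a secure cipher on its own.
shift = hashlib.sha256(shared_client).digest()[0] % 255 + 1

def caesar_bytes(data: bytes, k: int) -> bytes:
    return bytes((b + k) % 256 for b in data)

ciphertext = caesar_bytes(b"patient vitals: 80 bpm", shift)
plaintext = caesar_bytes(ciphertext, -shift)
print(ciphertext.hex(), plaintext)
```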

20 pages, 4781 KiB  
Article
Towards an Integrated Methodology and Toolchain for Machine Learning-Based Intrusion Detection in Urban IoT Networks and Platforms
by Denis Rangelov, Philipp Lämmel, Lisa Brunzel, Stephan Borgert, Paul Darius, Nikolay Tcholtchev and Michell Boerger
Future Internet 2023, 15(3), 98; https://doi.org/10.3390/fi15030098 - 28 Feb 2023
Cited by 4 | Viewed by 3898
Abstract
The constant increase in volume and wide variety of available Internet of Things (IoT) devices leads to highly diverse software and hardware stacks, which opens new avenues for exploiting previously unknown vulnerabilities. The ensuing risks are amplified by the inherent IoT resource constraints both in terms of performance and energy expenditure. At the same time, IoT devices often generate or collect sensitive, real-time data used in critical application scenarios (e.g., health monitoring, transportation, smart energy, etc.). All these factors combined make IoT networks a primary target and potential victim of malicious actors. In this paper, we presented a brief overview of existing attacks and defense strategies and used this as motivation for proposing an integrated methodology for developing protection mechanisms for smart city IoT networks. The goal of this work was to lay out a theoretical plan and a corresponding pipeline of steps, i.e., a development and implementation process, for the design and application of cybersecurity solutions for urban IoT networks. The end goal of following the proposed process is the deployment and continuous improvement of appropriate IoT security measures in real-world urban IoT infrastructures. The application of the methodology was exemplified on an OMNET++-simulated scenario, which was developed in collaboration with industrial partners and a municipality. Full article
(This article belongs to the Section Cybersecurity)

21 pages, 4294 KiB  
Review
Integrating a Blockchain-Based Governance Framework for Responsible AI
by Rameez Asif, Syed Raheel Hassan and Gerard Parr
Future Internet 2023, 15(3), 97; https://doi.org/10.3390/fi15030097 - 28 Feb 2023
Cited by 5 | Viewed by 4017
Abstract
This research paper reviews the potential of smart contracts for responsible AI with a focus on frameworks, hardware, energy efficiency, and cyberattacks. Smart contracts are digital agreements that are executed by a blockchain, and they have the potential to revolutionize the way we conduct business by increasing transparency and trust. When it comes to responsible AI systems, smart contracts can play a crucial role in ensuring that the terms and conditions of the contract are fair and transparent as well as that any automated decision-making is explainable and auditable. Furthermore, the energy consumption of blockchain networks has been a matter of concern; this article explores the energy efficiency element of smart contracts. Energy efficiency in smart contracts may be enhanced by the use of techniques such as off-chain processing and sharding. The study emphasises the need for careful auditing and testing of smart contract code in order to protect against cyberattacks along with the use of secure libraries and frameworks to lessen the likelihood of smart contract vulnerabilities. Full article
(This article belongs to the Special Issue Blockchain Security and Privacy II)

15 pages, 6298 KiB  
Article
Blockchain-Enabled Chebyshev Polynomial-Based Group Authentication for Secure Communication in an Internet of Things Network
by Raman Singh, Sean Sturley and Hitesh Tewari
Future Internet 2023, 15(3), 96; https://doi.org/10.3390/fi15030096 - 28 Feb 2023
Cited by 2 | Viewed by 1946
Abstract
The utilization of Internet of Things (IoT) devices in various smart city and industrial applications is growing rapidly. Within a trusted authority (TA), such as an industry or smart city, all IoT devices are closely monitored in a controlled infrastructure. However, in cases where an IoT device from one TA needs to communicate with another IoT device from a different TA, the trust establishment between these devices becomes extremely important. Obtaining a digital certificate from a certificate authority for each IoT device can be expensive. To solve this issue, a group authentication framework is proposed that can establish trust between groups of IoT devices owned by different entities. The Chebyshev polynomial has many important properties; the semigroup property is one of the most important. These properties make the Chebyshev polynomial a good candidate for the proposed group authentication mechanism. The secure exchange of information between trusted authorities is supported by Blockchain technology. The proposed framework was implemented and tested using Python and deployed on a Blockchain using Ethereum’s Goerli testnet. The results show that the proposed framework can reasonably use Chebyshev polynomials with degrees up to four digits in length. The values of various parameters related to Blockchain are also discussed to understand the usability of the proposed framework. Full article
(This article belongs to the Special Issue Blockchain Security and Privacy II)
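The semigroup (commutativity) property mentioned above is what makes Chebyshev polynomials suitable for key agreement: T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x). Below is a small numerical check over the real interval [-1, 1]; the paper's protocol details and any extended or modular variant are not reproduced here.

```python
import numpy as np

def T(n, x):
    """Chebyshev polynomial of the first kind: T_n(x) = cos(n * arccos(x)) for |x| <= 1."""
    return np.cos(n * np.arccos(x))

x = 0.618              # public value
a, b = 1234, 5678      # each party's secret degree (four-digit degrees, as reported)

left = T(a, T(b, x))   # what one party computes from the other's half-key T_b(x)
right = T(b, T(a, x))  # and vice versa
print(left, right, T(a * b, x))
assert np.isclose(left, right, atol=1e-6)
```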

23 pages, 1343 KiB  
Article
A Vulnerability Assessment Approach for Transportation Networks Subjected to Cyber–Physical Attacks
by Konstantinos Ntafloukas, Liliana Pasquale, Beatriz Martinez-Pastor and Daniel P. McCrum
Future Internet 2023, 15(3), 100; https://doi.org/10.3390/fi15030100 - 28 Feb 2023
Cited by 2 | Viewed by 2475
Abstract
Transportation networks are fundamental to the efficient and safe functioning of modern societies. In the past, physical and cyber space were treated as isolated environments, resulting in transportation networks being considered vulnerable only to threats from the physical space (e.g., natural hazards). The integration of Internet of Things-based wireless sensor networks into the sensing layer of critical transportation infrastructure has resulted in transportation networks becoming susceptible to cyber–physical attacks due to the inherent vulnerabilities of IoT devices. However, current vulnerability assessment methods lack details related to the integration of the cyber and physical space in transportation networks. In this paper, we propose a new vulnerability assessment approach for transportation networks subjected to cyber–physical attacks at the sensing layer. The novelty of the approach relies on the combination of the physical and cyber space, using a Bayesian network attack graph that enables the probabilistic modelling of vulnerability states in both spaces. A new probability indicator is proposed to enable the assignment of probability scores to vulnerability states, considering different attacker profile characteristics and control barriers. A probability-based ranking table is developed that details the most vulnerable nodes of the graph. The vulnerability of the transportation network is measured as a drop in network efficiency after the removal of the highest probability-based ranked nodes. We demonstrate the application of the approach by studying the vulnerability of a transportation network case study to a cyber–physical attack at the sensing layer. Monte Carlo simulations and sensitivity analysis are performed as methods to evaluate the results. The results indicate that the vulnerability of the transportation network depends to a large extent on the successful exploitation of vulnerabilities, both in the cyber and physical space. Additionally, we demonstrate the usefulness of the proposed approach by comparing the results with other currently available methods. The approach is of interest to stakeholders who are attempting to incorporate the cyber domain into the vulnerability assessment procedures of their system. Full article

16 pages, 693 KiB  
Article
When Operation Technology Meets Information Technology: Challenges and Opportunities
by Davide Berardi, Franco Callegati, Andrea Giovine, Andrea Melis, Marco Prandini and Lorenzo Rinieri
Future Internet 2023, 15(3), 95; https://doi.org/10.3390/fi15030095 - 27 Feb 2023
Cited by 14 | Viewed by 2788
Abstract
Industry 4.0 has revolutionized process innovation while facilitating and encouraging many new possibilities. The objective of Industry 4.0 is the radical enhancement of productivity, a goal that presupposes the integration of Operational Technology (OT) networks with Information Technology (IT) networks, which were hitherto isolated. This disruptive approach is enabled by adopting several emerging technologies in Enterprise processes. In this manuscript, we discuss what we believe to be one of the main challenges preventing the full employment of Industry 4.0, namely, the integration of Operation Technology networking and Information Technology networking. We discuss the technical challenges alongside the potential tools while providing a state-of-the-art use case scenario. We showcase a possible solution based on the Asset Administration Shell approach, referring to the use case of camera synchronization for collaborative tasks. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in Italy 2022–2023)

36 pages, 5618 KiB  
Review
Quantum Computing for Healthcare: A Review
by Raihan Ur Rasool, Hafiz Farooq Ahmad, Wajid Rafique, Adnan Qayyum, Junaid Qadir and Zahid Anwar
Future Internet 2023, 15(3), 94; https://doi.org/10.3390/fi15030094 - 27 Feb 2023
Cited by 62 | Viewed by 27909
Abstract
In recent years, the interdisciplinary field of quantum computing has rapidly developed and garnered substantial interest from both academia and industry due to its ability to process information in fundamentally different ways, leading to hitherto unattainable computational capabilities. However, despite its potential, the full extent of quantum computing’s impact on healthcare remains largely unexplored. This survey paper presents the first systematic analysis of the various capabilities of quantum computing in enhancing healthcare systems, with a focus on its potential to revolutionize compute-intensive healthcare tasks such as drug discovery, personalized medicine, DNA sequencing, medical imaging, and operational optimization. Through a comprehensive analysis of existing literature, we have developed taxonomies across different dimensions, including background and enabling technologies, applications, requirements, architectures, security, open issues, and future research directions, providing a panoramic view of the quantum computing paradigm for healthcare. Our survey aims to aid both new and experienced researchers in quantum computing and healthcare by helping them understand the current research landscape, identifying potential opportunities and challenges, and making informed decisions when designing new architectures and applications for quantum computing in healthcare. Full article
(This article belongs to the Special Issue Internet of Things (IoT) for Smart Living and Public Health)

13 pages, 565 KiB  
Article
Data Protection and Multi-Database Data-Driven Models
by Lili Jiang and Vicenç Torra
Future Internet 2023, 15(3), 93; https://doi.org/10.3390/fi15030093 - 27 Feb 2023
Cited by 2 | Viewed by 1735
Abstract
Anonymization and data masking have effects on data-driven models. Different anonymization methods have been developed to provide a good trade-off between privacy guarantees and data utility. Nevertheless, the effects of data protection (e.g., data microaggregation and noise addition) on data integration and on data-driven models (e.g., machine learning models) built from these data are not known. In this paper, we study how data protection affects data integration, and the corresponding effects on the results of machine learning models built from the outcome of the data integration process. The experimental results show that the levels of protection that prevent proper database integration do not affect machine learning models that learn from the integrated database to the same degree. Concretely, our preliminary analysis and experiments show that data protection techniques have a lower level of impact on data integration than on machine learning models. Full article
(This article belongs to the Special Issue Big Data Analytics, Privacy and Visualization)
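A toy sketch of the studied effect, under assumed settings: two "databases" share noise-protected numerical linkage keys, are integrated by nearest-neighbour matching, and a model is trained on the integrated result; the dataset, noise level, and linkage rule are all illustrative, not the paper's experimental setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=800, n_features=8, random_state=0)

# Split the attributes into two "databases" that share the first 3 columns as
# numerical linkage keys; protect both copies with additive Gaussian noise.
keys_a = X[:, :3] + rng.normal(scale=0.5, size=(800, 3))   # noise level is an assumption
keys_b = X[:, :3] + rng.normal(scale=0.5, size=(800, 3))
db_a = np.hstack([keys_a, X[:, 3:6]])
db_b = np.hstack([keys_b, X[:, 6:]])

# Integrate the databases by nearest-neighbour matching on the protected keys.
nn = NearestNeighbors(n_neighbors=1).fit(keys_b)
match = nn.kneighbors(keys_a, return_distance=False).ravel()
linkage_accuracy = np.mean(match == np.arange(800))

# Train a model on the (possibly mis-linked) integrated data.
X_int = np.hstack([db_a, db_b[match][:, 3:]])
model_acc = cross_val_score(RandomForestClassifier(random_state=0), X_int, y, cv=5).mean()

print(f"record linkage accuracy: {linkage_accuracy:.2f}")
print(f"ML accuracy on integrated data: {model_acc:.2f}")
```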

17 pages, 2932 KiB  
Article
A Game Theory-Based Model for the Dissemination of Privacy Information in Online Social Networks
by Jingsha He, Yue Li and Nafei Zhu
Future Internet 2023, 15(3), 92; https://doi.org/10.3390/fi15030092 - 27 Feb 2023
Cited by 4 | Viewed by 2009
Abstract
Online social networks (OSNs) have experienced rapid growth in recent years, and an increasing number of people now use OSNs, such as Facebook and Twitter, to share and spread information on a daily basis. As a special type of information, user personal information is also widely disseminated in such networks, posing threats to user privacy. The study on privacy information dissemination is thus useful for the development of mechanisms and tools for the effective protection of privacy information in OSNs. In this paper, we propose to apply the game theory to establish a sender–receiver game model and the Nash equilibrium to describe the behavioral strategies of users in disseminating privacy information. Factors that affect the dissemination of privacy information are also analyzed with two important aspects: intimacy and popularity of the privacy-concerning subject. Simulation experiments were conducted based on real data sets from scale-free networks and real social networks to compare and analyze the effectiveness of the model. Results show that the proposed game theory is applicable to the privacy information dissemination model, which implements intimacy and popularity in the modeling of the dissemination of privacy information in OSNs. Both the impact of the macro-level OSNs and the micro-relationships between users are evaluated on the dissemination of privacy information, which provides a new perspective for exploring the dissemination of privacy information and facilitates the development of effective mechanisms for privacy protection in OSNs. Full article
(This article belongs to the Special Issue Cryptography in Digital Networks)

17 pages, 4436 KiB  
Article
Scope and Accuracy of Analytic and Approximate Results for FIFO, Clock-Based and LRU Caching Performance
by Gerhard Hasslinger, Konstantinos Ntougias, Frank Hasslinger and Oliver Hohlfeld
Future Internet 2023, 15(3), 91; https://doi.org/10.3390/fi15030091 - 24 Feb 2023
Cited by 3 | Viewed by 1780
Abstract
We evaluate analysis results and approximations for the performance of basic caching methods, assuming independent requests. Compared with simulative evaluations, the analysis results are accurate, but their computation is tractable only within a limited scope. We compare the scalability of analytical FIFO and LRU solutions including extensions for multisegment caches and for caches with data of varying sizes. On the other hand, approximations have been proposed for the FIFO and LRU hit ratio. They are simple and scalable, but their accuracy is confirmed mainly through asymptotic behaviour only for large caches. We derive bounds on the approximation errors in a detailed worst-case study with a focus on small caches. The approximations are extended to data of different sizes. Then a fraction of unused cache space can add to the deviations, which is estimated in order to improve the solution. Full article
(This article belongs to the Special Issue Future Communication Networks for the Internet of Things (IoT))
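To make the setting concrete, the sketch below simulates an LRU cache under independent (IRM) Zipf-distributed requests and compares the measured hit ratio with the well-known characteristic-time (Che) approximation; catalogue size, cache size, and the Zipf exponent are assumed values, and this is not the paper's own evaluation code.

```python
import numpy as np
from collections import OrderedDict
from scipy.optimize import brentq

N, C = 1000, 50                        # catalogue size, cache size (assumed)
ranks = np.arange(1, N + 1)
q = ranks ** -0.8                      # Zipf(0.8) popularities, an assumed workload
q /= q.sum()

# --- Simulation: LRU under independent (IRM) requests ---
rng = np.random.default_rng(1)
requests = rng.choice(N, size=200_000, p=q)
cache, hits = OrderedDict(), 0
for item in requests:
    if item in cache:
        hits += 1
        cache.move_to_end(item)
    else:
        cache[item] = None
        if len(cache) > C:
            cache.popitem(last=False)  # evict the least recently used item
print("simulated LRU hit ratio:", hits / len(requests))

# --- Characteristic-time (Che) approximation for LRU under IRM ---
tc = brentq(lambda t: np.sum(1 - np.exp(-q * t)) - C, 1e-6, 1e9)
hit_approx = np.sum(q * (1 - np.exp(-q * tc)))
print("Che approximation:", hit_approx)
```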

34 pages, 2792 KiB  
Article
BPMNE4IoT: A Framework for Modeling, Executing and Monitoring IoT-Driven Processes
by Yusuf Kirikkayis, Florian Gallik, Michael Winter and Manfred Reichert
Future Internet 2023, 15(3), 90; https://doi.org/10.3390/fi15030090 - 22 Feb 2023
Cited by 15 | Viewed by 5502
Abstract
The Internet of Things (IoT) enables a variety of smart applications, including smart home, smart manufacturing, and smart city. By enhancing Business Process Management Systems with IoT capabilities, the execution and monitoring of business processes can be significantly improved. Providing a holistic support for modeling, executing and monitoring IoT-driven processes, however, constitutes a challenge. Existing process modeling and process execution languages, such as BPMN 2.0, are unable to fully meet the IoT characteristics (e.g., asynchronicity and parallelism) of IoT-driven processes. In this article, we present BPMNE4IoT—A holistic framework for modeling, executing and monitoring IoT-driven processes. We introduce various artifacts and events based on the BPMN 2.0 metamodel that allow realizing the desired IoT awareness of business processes. The framework is evaluated along two real-world scenarios from two different domains. Moreover, we present a user study for comparing BPMNE4IoT and BPMN 2.0. In particular, this study has confirmed that the BPMNE4IoT framework facilitates the support of IoT-driven processes. Full article
(This article belongs to the Special Issue IoT-Based BPM for Smart Environments)

22 pages, 6279 KiB  
Article
Fast Way to Predict Parking Lots Availability: For Shared Parking Lots Based on Dynamic Parking Fee System
by Sheng-Ming Wang and Wei-Min Cheng
Future Internet 2023, 15(3), 89; https://doi.org/10.3390/fi15030089 - 22 Feb 2023
Cited by 1 | Viewed by 2190
Abstract
This study focuses mainly on estimating urban parking space availability. Urban parking has always been a problem that plagues governments worldwide. Because parking space is limited, if it is not managed correctly, a city will, as it develops, eventually face a situation in which there is nowhere to park. A dynamic parking fee pricing mechanism combined with the concept of shared parking is an effective way to alleviate the urban parking problem, but quickly estimating the total number of available parking spaces in an area is a major challenge. This study provides a fast parking space estimation method and verifies its feasibility using actual data from various types of parking sites. It also comprehensively discusses the changing characteristics of parking space data in multiple areas, examines possible data anomalies, and explains their causes. The study concludes with a description of potential applications of the predictive model in conjunction with subsequent dynamic parking pricing mechanisms and self-driving systems. Full article

17 pages, 2570 KiB  
Article
Machine Learning for Data Center Optimizations: Feature Selection Using Shapley Additive exPlanation (SHAP)
by Yibrah Gebreyesus, Damian Dalton, Sebastian Nixon, Davide De Chiara and Marta Chinnici
Future Internet 2023, 15(3), 88; https://doi.org/10.3390/fi15030088 - 21 Feb 2023
Cited by 23 | Viewed by 7229
Abstract
The need for artificial intelligence (AI) and machine learning (ML) models to optimize data center (DC) operations increases as the volume of operations management data upsurges tremendously. These strategies can assist operators in better understanding their DC operations and help them make informed decisions upfront to maintain service reliability and availability. The strategies include developing models that optimize energy efficiency, identifying inefficient resource utilization and scheduling policies, and predicting outages. In addition to model hyperparameter tuning, feature subset selection (FSS) is critical for identifying relevant features for effectively modeling DC operations to provide insight into the data, optimize model performance, and reduce computational expenses. Hence, this paper introduces the Shapley Additive exPlanation (SHAP) values method, a class of additive feature attribution values for identifying relevant features that is rarely discussed in the literature. We compared its effectiveness with several commonly used, importance-based feature selection methods. The methods were tested on real DC operations data streams obtained from the ENEA CRESCO6 cluster with 20,832 cores. To demonstrate the effectiveness of SHAP compared to other methods, we selected the top ten most important features from each method, retrained the predictive models, and evaluated their performance using the MAE, RMSE, and MPAE evaluation criteria. The results presented in this paper demonstrate that the predictive models trained using features selected with the SHAP-assisted method performed well, with a lower error and a reasonable execution time compared to other methods. Full article
(This article belongs to the Special Issue Machine Learning Perspective in the Convolutional Neural Network Era)
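A minimal sketch of SHAP-assisted feature subset selection as described above, on synthetic stand-in telemetry rather than the CRESCO6 data: rank features by mean absolute SHAP value, keep the top ten, retrain, and compare the error; the model choice and data sizes are assumptions.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for DC telemetry: 30 features, a continuous target
# (e.g., energy use). The real CRESCO6 data streams are not public here.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 30))
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Rank features by mean |SHAP value| and keep the top ten.
shap_values = shap.TreeExplainer(model).shap_values(X_tr)
top10 = np.argsort(np.abs(shap_values).mean(axis=0))[::-1][:10]

# Retrain on the selected subset and compare the error.
reduced = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr[:, top10], y_tr)
print("MAE, all features :", mean_absolute_error(y_te, model.predict(X_te)))
print("MAE, top-10 (SHAP):", mean_absolute_error(y_te, reduced.predict(X_te[:, top10])))
```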
