Future Internet, Volume 16, Issue 11 (November 2024) – 47 articles

Cover Story: This research develops different strategies for implementing collaborative virtual and augmented reality (VR/AR) applications within the cloud continuum. Three distinct architectures that distribute the computational load differently were designed (using sequence and component diagrams), implemented, and tested: one fully cloud-based, one hybrid between cloud and edge computing, and one primarily edge-based. A different application was built for each architecture to evaluate its functionality, and the scenarios were stress-tested with numerous users using tools such as Cloud Analyst to simulate realistic performance requirements. The results provide well-defined reference architectures that significantly improve the deployment and scalability of collaborative VR/AR applications.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
24 pages, 1273 KiB  
Article
Flexible Hyper-Distributed IoT–Edge–Cloud Platform for Real-Time Digital Twin Applications on 6G-Intended Testbeds for Logistics and Industry
by Maria Crespo-Aguado, Raul Lozano, Fernando Hernandez-Gobertti, Nuria Molner and David Gomez-Barquero
Future Internet 2024, 16(11), 431; https://doi.org/10.3390/fi16110431 - 20 Nov 2024
Abstract
This paper presents the design and development of a flexible hyper-distributed IoT–Edge–Cloud computing platform for real-time Digital Twins in real logistics and industrial environments, intended as a novel living lab and testbed for future 6G applications. It expands the limited capabilities of IoT devices with extended Cloud and Edge computing functionalities, creating an IoT–Edge–Cloud continuum platform composed of multiple stakeholder solutions, in which vertical application developers can take full advantage of the computing resources of the infrastructure. The platform is built together with a private 5G network to connect machines and sensors on a large scale. Artificial intelligence and machine learning are used to allocate computing resources for real-time services by an end-to-end intelligent orchestrator, and real-time distributed analytic tools leverage Edge computing platforms to support different types of Digital Twin applications for logistics and industry, such as immersive remote driving, with specific characteristics and features. Performance evaluations demonstrated the platform’s capability to support the high-throughput communications required for Digital Twins, achieving user-experienced rates close to the maximum theoretical values, up to 552 Mb/s for the downlink and 87.3 Mb/s for the uplink in the n78 frequency band. Moreover, the platform’s support for Digital Twins was validated via QoE assessments conducted on an immersive remote driving prototype, which demonstrated high levels of user satisfaction in key dimensions such as presence, engagement, control, sensory integration, and cognitive load.
(This article belongs to the Special Issue Convergence of Edge Computing and Next Generation Networking)

22 pages, 945 KiB  
Review
Resilience in the Internet of Medical Things: A Review and Case Study
by Vikas Tomer, Sachin Sharma and Mark Davis
Future Internet 2024, 16(11), 430; https://doi.org/10.3390/fi16110430 - 20 Nov 2024
Abstract
The Internet of Medical Things (IoMT), an extension of the Internet of Things (IoT), is still in its early stages of development. Challenges inherent to IoT persist in IoMT as well. The major focus is on data transmission within the healthcare domain due to its profound impact on health and public well-being. Issues such as latency, bandwidth constraints, and concerns regarding security and privacy are critical in IoMT owing to the sensitive nature of patient data, including patient identity and health status. Numerous forms of cyber-attacks pose threats to IoMT networks, making the reliable and secure transmission of critical medical data a challenging task. Other situations, such as natural disasters, war, or construction work, can also make IoMT networks unavailable and unable to transmit data. The first step in these situations is to recover from failure as quickly as possible, resume the data transfer, and detect the cause of faults, failures, and errors. Several solutions exist in the literature to make the IoMT resilient to failure. However, no single approach proposed in the literature can simultaneously protect IoMT networks from various attacks, failures, and faults. This paper begins with a detailed description of IoMT and its applications. It considers the underlying requirements of resilience for IoMT networks, such as monitoring, control, diagnosis, and recovery, and comprehensively analyzes existing research efforts to provide IoMT network resilience against diverse causes. After investigating several research proposals, we identify that the combination of software-defined networks (SDNs), machine learning (ML), and microservices architecture (MSA) has the capability to fulfill the requirements for achieving resilience in IoMT networks: SDN can be used for monitoring and control, ML for anomaly detection and diagnosis, and MSA for bringing distributed functionality and recovery into IoMT networks. The paper then provides a case study of the remote patient monitoring (RPM) of a heart patient in IoMT networks, covering different failure scenarios in the IoMT infrastructure. Finally, we propose a methodology that elaborates how distributed functionality can be achieved during these failures using machine learning, software-defined networking, and microservices technologies.
(This article belongs to the Special Issue The Future Internet of Medical Things II)

19 pages, 3800 KiB  
Article
Fully Open-Source Meeting Minutes Generation Tool
by Amma Liesvarastranta Haz, Yohanes Yohanie Fridelin Panduman, Nobuo Funabiki, Evianita Dewi Fajrianti and Sritrusta Sukaridhoto
Future Internet 2024, 16(11), 429; https://doi.org/10.3390/fi16110429 - 20 Nov 2024
Abstract
With the increasing use of online meetings, there is a growing need for efficient tools that can automatically generate meeting minutes from recorded sessions. Current solutions often rely on proprietary systems, limiting adaptability and flexibility. This paper investigates whether various open-source models and methods, such as audio-to-text conversion, summarization, keyword extraction, and optical character recognition (OCR), can be integrated to create a meeting minutes generation tool for recorded video presentations. For this purpose, a series of evaluations is conducted to identify suitable models, which are then integrated into a system that is modular yet accurate. The open-source approach ensures that the tool remains accessible and adaptable to the latest innovations, enabling continuous improvement over time, and benefits organizations and individuals by providing a cost-effective and flexible alternative. This work contributes a modular and easily extensible open-source framework that integrates several advanced technologies, and future models, into a cohesive system. The system was evaluated on ten videos created under controlled conditions, which may not fully represent typical online presentation recordings. It showed strong performance in audio-to-text conversion with a low word-error rate. Summarization and keyword extraction were functional but, according to user feedback, showed room for improvement in precision and relevance. These results confirm the system’s effectiveness and efficiency in generating usable meeting minutes from recorded presentation videos, with room for improvement in future work.
(This article belongs to the Special Issue Deep Learning and Natural Language Processing II)
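The pipeline described above can be approximated with off-the-shelf open-source components. Below is a minimal sketch assuming the openai-whisper and Hugging Face transformers packages; the model choices and the naive keyword extractor are illustrative assumptions, not the authors' configuration:

```python
# Sketch of an open-source meeting-minutes pipeline: transcribe, summarize,
# extract keywords. Requires ffmpeg for Whisper to read video files.
import whisper
from transformers import pipeline

def generate_minutes(video_path: str) -> dict:
    # 1. Audio-to-text conversion with an open-source speech model.
    transcript = whisper.load_model("base").transcribe(video_path)["text"]

    # 2. Summarization with a generic open-source model (assumed choice).
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    summary = summarizer(transcript[:3000], max_length=150, min_length=40)[0]["summary_text"]

    # 3. Crude keyword extraction: most frequent non-trivial tokens.
    words = [w.lower().strip(".,") for w in transcript.split() if len(w) > 5]
    keywords = sorted(set(words), key=words.count, reverse=True)[:10]

    return {"summary": summary, "keywords": keywords}
```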

26 pages, 1394 KiB  
Article
Fault Prediction and Reconfiguration Optimization in Smart Grids: AI-Driven Approach
by David Carrascal, Paula Bartolomé, Elisa Rojas, Diego Lopez-Pajares, Nicolas Manso and Javier Diaz-Fuentes
Future Internet 2024, 16(11), 428; https://doi.org/10.3390/fi16110428 - 20 Nov 2024
Abstract
Smart grids (SGs) are essential for the efficient and distributed management of electrical distribution networks. A key task in SG management is fault detection followed by network reconfiguration, which should minimize power losses while optimizing distribution by balancing loads across the grid. However, the current literature lacks methods for efficient fault prediction and fast reconfiguration. To achieve this goal, this paper builds on DEN2DE, an adaptable routing and reconfiguration solution potentially applicable to SGs, and investigates its extension with AI-based fault prediction using real-world datasets and randomly generated topologies based on the IEEE 123 Node Test Feeder. The study applies models based on Machine Learning (ML) and Deep Learning (DL) techniques, specifically evaluating Random Forest (RF) and Support Vector Machine (SVM) as ML methods and an Artificial Neural Network (ANN) as a DL method, assessing each for accuracy, precision, and recall. Results indicate that the RF model with recursive feature elimination and cross-validation (RFECV) achieves 94.28% precision and 81.05% recall, surpassing SVM (precision 89.32%, recall 6.95%) and ANN (precision 72.17%, recall 13.49%) in fault detection accuracy and reliability.
(This article belongs to the Section Smart System Infrastructure and Applications)
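For orientation, the best-performing configuration named above (random forest with RFECV) maps directly onto scikit-learn. A minimal sketch, with synthetic data standing in for the IEEE 123-node fault datasets:

```python
# RF classifier wrapped in recursive feature elimination with CV (RFECV).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RFECV prunes features by cross-validated importance before the final fit.
selector = RFECV(RandomForestClassifier(n_estimators=100, random_state=0), cv=5)
selector.fit(X_tr, y_tr)

y_pred = selector.predict(X_te)
print("precision:", precision_score(y_te, y_pred),
      "recall:", recall_score(y_te, y_pred))
```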

16 pages, 725 KiB  
Article
Virtualization vs. Containerization, a Comparative Approach for Application Deployment in the Computing Continuum Focused on the Edge
by Hamish Sturley, Augustin Fournier, Andoni Salcedo-Navarro, Miguel Garcia-Pineda and Jaume Segura-Garcia
Future Internet 2024, 16(11), 427; https://doi.org/10.3390/fi16110427 - 19 Nov 2024
Abstract
The emergence of containerization around a decade ago provided a compact, convenient, and portable way of running applications alongside virtualization. The major difference lies in the architecture: containers share the host's kernel and therefore do not virtualize low-layer components like the Central Processing Unit (CPU). On the one hand, containers are lighter and more flexible than virtual machines (VMs); on the other hand, VMs can more precisely meet low-layer needs and are completely autonomous systems. Which architecture, then, is best for deploying an application today? In this paper, we study these two main deployment methods and compare them on several criteria: compatibility, based on user experience and ease of installation/deployment; scalability, based on automatic elasticity under workload; and efficiency, in terms of energy and computing resources. After the tests, we conclude that containerization is the more advantageous option in terms of energy consumption.

19 pages, 2662 KiB  
Article
Identifying Persons of Interest in Digital Forensics Using NLP-Based AI
by Jonathan Adkins, Ali Al Bataineh and Majd Khalaf
Future Internet 2024, 16(11), 426; https://doi.org/10.3390/fi16110426 - 18 Nov 2024
Abstract
The field of digital forensics relies on expertise from multiple domains, including computer science, criminology, and law. It also relies on different toolsets and an analyst’s expertise to parse enormous amounts of user-generated data to find clues that help crack a case. This process of investigative analysis is often done manually. Artificial Intelligence (AI) can provide practical solutions to efficiently mine enormous amounts of data to find useful patterns that can be leveraged to investigate crimes. Natural Language Processing (NLP) is a subdomain of AI research that deals with problems involving unstructured data, specifically language. The domain of NLP includes several tools to parse text, including topic modeling, pairwise correlation, word vector cosine distance measurement, and sentiment analysis. In this research, we propose a digital forensic investigative technique that uses an ensemble of NLP tools to identify a persons-of-interest list based on a corpus of text. Our proposed method serves as a type of human feature reduction, where a total pool of suspects is filtered down to a short list of candidates who show a higher correlation with the crime being investigated.
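One component of such an ensemble, cosine distance over word vectors, can be illustrated in a few lines. This toy sketch ranks hypothetical suspects by TF-IDF cosine similarity to a case description; the corpus, names, and query are invented placeholders, not the paper's full method:

```python
# Rank candidate persons of interest by textual similarity to a case query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

messages = {
    "alice": "transferred the funds offshore before the audit",
    "bob": "weekend hiking photos and recipes",
    "carol": "shred the audit records tonight",
}
case_query = "hiding financial records from an audit"

vec = TfidfVectorizer()
matrix = vec.fit_transform(list(messages.values()) + [case_query])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
ranking = sorted(zip(messages, scores), key=lambda kv: -kv[1])
print(ranking)  # higher score = higher correlation with the case
```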

24 pages, 2020 KiB  
Article
Enhanced Long-Range Network Performance of an Oil Pipeline Monitoring System Using a Hybrid Deep Extreme Learning Machine Model
by Abbas Kubba, Hafedh Trabelsi and Faouzi Derbel
Future Internet 2024, 16(11), 425; https://doi.org/10.3390/fi16110425 - 17 Nov 2024
Abstract
Leak detection in oil and gas pipeline networks is a critical and frequent issue in the oil and gas field. Many establishments have long depended on stationary hardware or traditional assessments to monitor and detect abnormalities. Rapid technological progress, innovation in engineering, and advanced technologies offering cost-effective, rapidly executed, and easy-to-implement solutions now make it possible to build an efficient oil pipeline leak detection and real-time monitoring system. In this area, wireless sensor networks (WSNs) are increasingly required to enhance the reliability of checkups and improve the accuracy of real-time oil pipeline monitoring systems with limited hardware resources. This study proposes the real-time transient model (RTTM), a leak detection method integrated with LoRaWAN technology, to implement a wireless oil pipeline network over long distances. The study focuses on enhancing LoRa network parameters, e.g., node power consumption, average packet loss, and delay, by applying several machine learning techniques in order to extend individual nodes’ lifetimes and enhance total system performance. The proposed system is implemented in the OMNeT++ network simulator with several frameworks, such as FLoRa and INET, to cover the LoRa network used as the system’s network infrastructure. To implement artificial intelligence over the FLoRa network, the LoRa network was integrated with several programming tools and libraries, such as Python scripts and the TensorFlow libraries. Several machine learning algorithms were applied, such as the random forest (RF) algorithm and the deep extreme learning machine (DELM) technique, to develop the proposed model and improve the LoRa network’s performance. They improved the LoRa network’s output performance, e.g., its power consumption, packet loss, and packet delay, with different enhancement ratios. Finally, a hybrid deep extreme learning machine model was built and selected as the proposed model due to its ability to improve the LoRa network’s performance, with high prediction accuracy, a mean square error of 0.75, and an exceptional enhancement ratio of 39% for LoRa node power consumption.
(This article belongs to the Topic Advances in Wireless and Mobile Networking)

21 pages, 1716 KiB  
Article
AI-Driven Neuro-Monitoring: Advancing Schizophrenia Detection and Management Through Deep Learning and EEG Analysis
by Elena-Anca Paraschiv, Lidia Băjenaru, Cristian Petrache, Ovidiu Bica and Dragoș-Nicolae Nicolau
Future Internet 2024, 16(11), 424; https://doi.org/10.3390/fi16110424 - 16 Nov 2024
Abstract
Schizophrenia is a complex neuropsychiatric disorder characterized by disruptions in brain connectivity and cognitive functioning. Continuous monitoring of neural activity is essential, as it allows for the detection of subtle changes in brain connectivity patterns, which could provide early warnings of cognitive decline or symptom exacerbation, ultimately facilitating timely therapeutic interventions. This paper proposes a novel approach for detecting schizophrenia-related abnormalities using deep learning (DL) techniques applied to electroencephalogram (EEG) data. Using an openly available EEG dataset on schizophrenia, the study focuses on preprocessed event-related potentials (ERPs) from key electrode sites and applies transfer entropy (TE) analysis to quantify the directional flow of information between brain regions. TE matrices were generated to capture neural connectivity patterns, which were then used as input for a hybrid DL model combining convolutional neural networks (CNNs) and Bidirectional Long Short-Term Memory (BiLSTM) networks. The model achieved an accuracy of 99.94% in classifying schizophrenia-related abnormalities, demonstrating its potential for real-time mental health monitoring. The generated TE matrices revealed significant differences in connectivity between the two groups, particularly in frontal and central brain regions, which are critical for cognitive processing. These findings were further validated by correlating the results with EEG data obtained from the Muse 2 headband, emphasizing the potential for portable, non-invasive monitoring of schizophrenia in real-world settings. The final model, integrated into the NeuroPredict platform, offers a scalable solution for continuous mental health monitoring. By incorporating EEG data, heart rate, sleep patterns, and environmental metrics, NeuroPredict facilitates early detection and personalized interventions for schizophrenia patients.
(This article belongs to the Special Issue eHealth and mHealth)
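A minimal sketch of a hybrid CNN plus BiLSTM classifier over transfer-entropy matrices, in Keras. The electrode count, layer sizes, and reshaping strategy are assumptions for illustration, not the authors' architecture:

```python
# CNN front-end extracts spatial patterns from the TE matrix; a BiLSTM then
# reads the pooled feature maps as a sequence before binary classification.
from tensorflow.keras import layers, models

n_channels = 19  # assumed EEG electrode count -> 19x19 TE matrix
model = models.Sequential([
    layers.Input(shape=(n_channels, n_channels, 1)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    # Collapse pooled spatial maps into a sequence for the recurrent stage.
    layers.Reshape((n_channels // 2, -1)),
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(1, activation="sigmoid"),  # schizophrenia vs. control
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```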

26 pages, 4934 KiB  
Article
Capacity and Coverage Dimensioning for 5G Standalone Mixed-Cell Architecture: An Impact of Using Existing 4G Infrastructure
by Naba Raj Khatiwoda, Babu Ram Dawadi and Sashidhar Ram Joshi
Future Internet 2024, 16(11), 423; https://doi.org/10.3390/fi16110423 - 14 Nov 2024
Abstract
With daily data volumes growing, current telecommunications infrastructure cannot meet requirements without the enhanced technologies adopted by 5G and beyond networks. Due to their diverse features, 5G technologies and services are expected to be transformative in the coming years, and proper planning procedures must be adopted to provide cost-effective, high-quality telecommunication services. In this paper, we plan 5G network deployment in two frequency ranges, 3.5 GHz and 28 GHz, using a mixed-cell structure. We use metaheuristic approaches such as Grey Wolf Optimization (GWO), the Sparrow Search Algorithm (SSA), the Whale Optimization Algorithm (WOA), the Marine Predator Algorithm (MPA), Particle Swarm Optimization (PSO), and Ant Lion Optimization (ALO) to optimize the locations of remote radio units. A comparative analysis of the metaheuristic algorithms shows that the proposed network is efficient, providing an average data rate of 50 Mbps, meeting coverage requirements of at least 98%, and satisfying quality-of-service requirements. We carried out case studies for an urban area and a suburban area of Kathmandu Valley, Nepal, analyzing the outcomes of 5G greenfield deployment and of 5G deployment using existing 4G infrastructure. By deploying 5G networks over existing 4G infrastructure, resources can be saved by up to 33.7% and 54.2% in urban and suburban areas, respectively.
(This article belongs to the Topic Advances in Wireless and Mobile Networking)
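Among the metaheuristics compared above, particle swarm optimization is the simplest to sketch. The loop below minimizes a stand-in objective; in the paper's setting, the decision variables would be remote-radio-unit coordinates and the objective a coverage/capacity cost, which is not reproduced here:

```python
# Minimal PSO loop: particles track personal and global bests.
import numpy as np

def objective(x):          # placeholder cost, e.g. an uncovered-area proxy
    return np.sum(x**2, axis=1)

rng = np.random.default_rng(0)
n, dim, iters = 30, 2, 100
pos = rng.uniform(-10, 10, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[pbest_val.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((2, n, dim))
    # Inertia + cognitive pull toward pbest + social pull toward gbest.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    val = objective(pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    gbest = pbest[pbest_val.argmin()]

print(gbest)  # best coordinates found
```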

20 pages, 552 KiB  
Article
SBNNR: Small-Size Bat-Optimized KNN Regression
by Rasool Seyghaly, Jordi Garcia, Xavi Masip-Bruin and Jovana Kuljanin
Future Internet 2024, 16(11), 422; https://doi.org/10.3390/fi16110422 - 14 Nov 2024
Abstract
Small datasets are frequent in some scientific fields. Such datasets are usually created due to the difficulty or cost of producing laboratory and experimental data. On the other hand, researchers are interested in using machine learning methods to analyze this scale of data. For this reason, in some cases, low-performance, overfitting models are developed for small-scale data. As a result, it appears necessary to develop methods for dealing with this type of data. In this research, we provide a new and innovative framework for regression problems with a small sample size. The base of our proposed method is the K-nearest neighbors (KNN) algorithm. For feature selection, instance selection, and hyperparameter tuning, we use the bat optimization algorithm (BA). Generative Adversarial Networks (GANs) are employed to generate synthetic data, effectively addressing the challenges associated with data sparsity. Concurrently, Deep Neural Networks (DNNs), as a deep learning approach, are utilized for feature extraction from both synthetic and real datasets. This hybrid framework integrates KNN, DNN, and GAN as foundational components and is optimized in multiple aspects (features, instances, and hyperparameters) using BA. The outcomes exhibit an enhancement of up to 5% in the coefficient of determination (R² score) using the proposed method compared to the standard KNN method optimized through grid search.
(This article belongs to the Special Issue Deep Learning Techniques Addressing Data Scarcity)
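For reference, the baseline the authors compare against (KNN regression tuned by grid search) looks like this in scikit-learn; the synthetic small-sample data is a placeholder, and the BA, GAN, and DNN stages of SBNNR are not reproduced:

```python
# Grid-searched KNN regression baseline on a small synthetic dataset.
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=60, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(
    KNeighborsRegressor(),
    {"n_neighbors": list(range(1, 11)), "weights": ["uniform", "distance"]},
    cv=5,
)
grid.fit(X_tr, y_tr)
print(grid.best_params_, grid.score(X_te, y_te))  # test-set R² score
```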

13 pages, 420 KiB  
Article
Towards a Decentralized Collaborative Framework for Scalable Edge AI
by Ahmed M. Abdelmoniem, Mona Jaber, Ali Anwar, Yuchao Zhang and Mingliang Gao
Future Internet 2024, 16(11), 421; https://doi.org/10.3390/fi16110421 - 14 Nov 2024
Abstract
Nowadays, Edge Intelligence has seen unprecedented growth in most of our daily life applications. Traditionally, most applications required significant effort in data collection for data-driven analytics, raising privacy concerns. The proliferation of specialized hardware in sensors and wearable, mobile, and IoT devices has led to the growth of Edge Intelligence, which has become an integral part of the development cycle of most modern applications. However, scalability issues hinder its wide-scale adoption. We focus on these challenges and propose a scalable decentralized edge intelligence framework: we analyze and empirically evaluate the challenges of existing methods and design an architecture that overcomes them. The proposed approach is client-driven and model-centric, allowing models to be shared between entities in a scalable fashion. We conduct experiments over various benchmarks to show that the proposed approach presents an efficient alternative to the existing baseline method and can be a viable solution for scaling edge intelligence.
(This article belongs to the Special Issue IoT, Edge, and Cloud Computing in Smart Cities)

16 pages, 25350 KiB  
Article
Eye Tracking and Human Influence Factors’ Impact on Quality of Experience of Mobile Gaming
by Omer Nawaz, Siamak Khatibi, Muhammad Nauman Sheikh and Markus Fiedler
Future Internet 2024, 16(11), 420; https://doi.org/10.3390/fi16110420 - 13 Nov 2024
Abstract
Mobile gaming accounts for more than 50% of global online gaming revenue, surpassing console and browser-based gaming. The success of mobile gaming titles depends on optimizing applications for the specific hardware constraints of mobile devices, such as smaller displays and lower computational power, to maximize battery life. Additionally, these applications must dynamically adapt to the variations in network speed inherent in mobile environments. Ultimately, user engagement and satisfaction are critical, necessitating a favorable comparison to browser- and console-based gaming experiences. While Quality of Experience (QoE) subjective evaluations through user surveys are the most reliable method for assessing user perception, various factors, termed influence factors (IFs), can affect user ratings of stimulus quality. This study examines human influence factors in mobile gaming, specifically analyzing the impact of user delight towards displayed content and the effect of gaze tracking. Using Pupil Core eye-tracking hardware, we captured user interactions with mobile devices and measured visual attention. Video stimuli from eight popular games were selected, with resolutions of 720p and 1080p and frame rates of 30 and 60 fps. Our results indicate a statistically significant impact of user delight on the mean opinion score (MOS) for most video stimuli across all games. Additionally, a trend favoring higher frame rates over screen resolution emerged in user ratings. These findings underscore the significance of optimizing mobile gaming experiences by incorporating models that estimate human influence factors to enhance user satisfaction and engagement.

30 pages, 1179 KiB  
Review
Advancing Additive Manufacturing Through Machine Learning Techniques: A State-of-the-Art Review
by Shaoping Xiao, Junchao Li, Zhaoan Wang, Yingbin Chen and Soheyla Tofighi
Future Internet 2024, 16(11), 419; https://doi.org/10.3390/fi16110419 - 13 Nov 2024
Abstract
In the fourth industrial revolution, artificial intelligence and machine learning (ML) have increasingly been applied to manufacturing, particularly additive manufacturing (AM), to enhance processes and production. This study provides a comprehensive review of the state-of-the-art achievements in this domain, highlighting not only the widely discussed supervised learning but also the emerging applications of semi-supervised learning and reinforcement learning. These advanced ML techniques have recently gained significant attention for their potential to further optimize and automate AM processes. The review aims to offer insights into various ML technologies employed in current research projects and to promote the diverse applications of ML in AM. By exploring the latest advancements and trends, this study seeks to foster a deeper understanding of ML’s transformative role in AM, paving the way for future innovations and improvements in manufacturing practices.

19 pages, 654 KiB  
Article
A Methodological Approach to Securing Cyber-Physical Systems for Critical Infrastructures
by Antonello Calabrò, Enrico Cambiaso, Manuel Cheminod, Ivan Cibrario Bertolotti, Luca Durante, Agostino Forestiero, Flavio Lombardi, Giuseppe Manco, Eda Marchetti, Albina Orlando and Giuseppe Papuzzo
Future Internet 2024, 16(11), 418; https://doi.org/10.3390/fi16110418 - 12 Nov 2024
Abstract
Modern ICT infrastructures, i.e., cyber-physical systems and critical infrastructures relying on interconnected IT (Information Technology)- and OT (Operational Technology)-based components and (sub-)systems, raise complex challenges in tackling security and safety issues. Many security controls and mechanisms are now available to address specific security needs, but when dealing with very complex and multifaceted heterogeneous systems, a methodology is needed on top of the selection of individual security controls: one that allows the designer/maintainer to drive his or her choices to build and keep the system secure as a whole, leaving the choice of specific security controls to the last step of system design and development. This paper provides a comprehensive methodological approach to design and preliminarily implement an Open Platform Architecture (OPA) to secure the cyber-physical systems of critical infrastructures. The OPA shows how an existing or under-design target system (TS) can be equipped with modern or emerging technologies to monitor and promptly detect potentially dangerous situations and to react automatically by putting suitable countermeasures in place. A multifaceted use case (UC), developed step by step from the security and safety requirements to the fully designed system, demonstrates the feasibility and effectiveness of the proposed methodology.
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in Italy 2024–2025)

27 pages, 2928 KiB  
Article
Approaches to Identifying Emotions and Affections During the Museum Learning Experience in the Context of the Future Internet
by Iana Fominska, Stefano Di Tore, Michele Nappi, Gerardo Iovane, Maurizio Sibilio and Angela Gelo
Future Internet 2024, 16(11), 417; https://doi.org/10.3390/fi16110417 - 10 Nov 2024
Abstract
The Future Internet aims to revolutionize digital interaction by integrating advanced technologies like AI and IoT, enabling a dynamic and resilient network. It envisions emotionally intelligent systems that can interpret and respond to human feelings, creating immersive, empathy-driven learning experiences. This evolution aspires to form a responsive digital ecosystem that seamlessly connects technology and human emotion. This paper presents a computational model aimed at enhancing the emotional aspect of learning experiences within museum environments. The model is designed to represent and manage affective and emotional feedback, with a focus on how emotions can significantly impact the learning process in a museum context. The proposed model seeks to identify and quantify emotions during a visitor’s engagement with museum exhibits. To achieve this goal, we primarily explored (i) methods and techniques for assessing and recognizing emotional responses in museum visitors and (ii) feedback management strategies based on the detection of visitors’ emotional states. The methodology was then tested on 1000 cases via questionnaire forms accompanied by images and short videos, and the results of the data analysis are reported. The findings contribute toward establishing a comprehensive methodology for identifying and quantifying the emotional state of museum visitors.
(This article belongs to the Section Internet of Things)

16 pages, 747 KiB  
Article
Automatically Injecting Robustness Statements into Distributed Applications
by Daniele Marletta, Alessandro Midolo and Emiliano Tramontana
Future Internet 2024, 16(11), 416; https://doi.org/10.3390/fi16110416 - 10 Nov 2024
Abstract
When developing a distributed application, several issues need to be handled, and software components should include mechanisms that make their execution resilient when network faults, delays, or tampering occur. For example, synchronous calls represent a too-tight connection between a client requesting a service and the service itself, whereby potential network delays or temporary server overloads would keep the client side hanging, exposing it to a domino effect. The proposed approach assists developers in dealing with such issues by providing an automatic tool that takes a distributed application using simple blocking calls and makes it robust in the face of adverse events. The devised solution automatically identifies the parts of the application that connect to remote services using simple synchronous calls and substitutes them with a generated, customized snippet of code that handles potential network delays or faults. To perform the transformation accurately, the tool finds application code statements that are data-dependent on the results of the original synchronous calls; then, for the dependent statements, a solution involving guarding code, proper synchronization, and timeouts is injected. We experimented with the analysis and transformation of several applications and report a meaningful example, together with an analysis of the results achieved.
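The effect of such a transformation can be sketched as follows: a blocking remote call is detached onto an executor, and the statements that are data-dependent on its result are guarded by a bounded wait. The fetch_remote function and the 5-second timeout below are hypothetical stand-ins, not the tool's generated code:

```python
# Guarded replacement for a plain synchronous remote call.
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def fetch_remote():
    import urllib.request
    return urllib.request.urlopen("https://example.org", timeout=5).read()

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(fetch_remote)   # original blocking call, detached
try:
    data = future.result(timeout=5)      # injected guard: bounded wait
    print(len(data))                     # data-dependent statement, guarded
except TimeoutError:
    print("remote service slow or down; degrading gracefully")
```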

41 pages, 438 KiB  
Review
Recent Advancements in Federated Learning: State of the Art, Fundamentals, Principles, IoT Applications and Future Trends
by Christos Papadopoulos, Konstantinos-Filippos Kollias and George F. Fragulis
Future Internet 2024, 16(11), 415; https://doi.org/10.3390/fi16110415 - 9 Nov 2024
Abstract
Federated learning (FL) is creating a paradigm shift in machine learning by directing the focus of model training to where the data actually exist. Instead of drawing all data into a central location, which raises concerns about privacy, costs, and delays, FL allows learning to take place directly on the device, keeping the data safe and minimizing the need for transfer. This approach is especially important in areas like healthcare, where protecting patient privacy is critical, and in industrial IoT settings, where moving large amounts of data is not practical. What makes FL even more compelling is its ability to reduce the bias that can occur when all data are centralized, leading to fairer and more inclusive machine learning outcomes. However, it is not without its challenges, particularly with regard to keeping the models secure from attacks. Nonetheless, the potential benefits are clear: FL can lower the costs associated with data storage and processing, while also helping organizations to meet strict privacy regulations like GDPR. As edge computing continues to grow, FL’s decentralized approach could play a key role in shaping how we handle data in the future, moving toward a more privacy-conscious world. This study identifies ongoing challenges in ensuring model security against adversarial attacks, pointing to the need for further research in this area.
(This article belongs to the Special Issue IoT Security: Threat Detection, Analysis and Defense)
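The aggregation step at the core of most FL schemes, federated averaging (FedAvg), fits in a few lines. A minimal sketch with toy parameter arrays; real systems add client sampling, secure aggregation, and defenses against the attacks discussed above:

```python
# FedAvg: size-weighted average of per-client model parameters.
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of each parameter array across clients."""
    total = sum(client_sizes)
    return [
        sum(w[k] * s / total for w, s in zip(client_weights, client_sizes))
        for k in range(len(client_weights[0]))
    ]

# Three clients, each holding two parameter arrays (toy example).
clients = [[np.ones(3) * i, np.ones(2) * i] for i in (1.0, 2.0, 3.0)]
sizes = [100, 200, 700]  # local dataset sizes
print(fedavg(clients, sizes))  # average is biased toward the largest client
```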

24 pages, 453 KiB  
Article
An Effective Ensemble Approach for Preventing and Detecting Phishing Attacks in Textual Form
by Zaher Salah, Hamza Abu Owida, Esraa Abu Elsoud, Esraa Alhenawi, Suhaila Abuowaida and Nawaf Alshdaifat
Future Internet 2024, 16(11), 414; https://doi.org/10.3390/fi16110414 - 8 Nov 2024
Abstract
Phishing email assaults have been a prevalent cybercriminal tactic for decades. Various detectors that rely on textual information have been suggested over time. However, to address the growing prevalence of phishing emails, more sophisticated techniques are required that use all aspects of emails to improve the detection capabilities of machine learning classifiers. This paper presents a novel approach to detecting phishing emails. The proposed methodology combines ensemble learning techniques with various features, such as word frequency, the presence of specific keywords or phrases, and email length, to improve detection accuracy. We propose two approaches for this task: the first employs soft-voting ensemble learning, while the second employs weighted ensemble learning. Both strategies use distinct machine learning algorithms to process the features concurrently, reducing complexity and enhancing model performance. An extensive assessment and analysis are conducted, considering criteria designed to minimize biased and inaccurate findings. Our empirical experiments demonstrate that merging these attributes via ensemble learning yields competitive performance over individual machine learning algorithms, achieving an F1-score of 0.90 with the weighted ensemble method and 0.85 with the soft-voting method.
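Both strategies map directly onto scikit-learn's VotingClassifier. The sketch below contrasts plain soft voting with a weighted variant; the member classifiers, weights, and synthetic features are assumptions, since the paper's exact feature set (word frequency, keywords, email length) is not reproduced:

```python
# Soft voting averages predicted probabilities; weights skew the average.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
estimators = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=100)),
    ("nb", GaussianNB()),
]
soft = VotingClassifier(estimators, voting="soft")
weighted = VotingClassifier(estimators, voting="soft", weights=[1, 2, 1])
for name, clf in [("soft", soft), ("weighted", weighted)]:
    print(name, cross_val_score(clf, X, y, scoring="f1", cv=5).mean())
```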

28 pages, 462 KiB  
Review
A Joint Survey in Decentralized Federated Learning and TinyML: A Brief Introduction to Swarm Learning
by Evangelia Fragkou and Dimitrios Katsaros
Future Internet 2024, 16(11), 413; https://doi.org/10.3390/fi16110413 - 8 Nov 2024
Abstract
TinyML/DL is a new subfield of ML that allows ML algorithms to be deployed on low-power devices to process their own data. The lack of resources restricts these devices to running only inference tasks (static TinyML), while training is handled by a more computationally capable system, such as the cloud. In recent literature, the focus has been on conducting real-time, on-device training tasks (Reformable TinyML) while being wirelessly connected. With data processing shifting to edge devices, the development of decentralized federated learning (DFL) schemes becomes justified. Within these setups, nodes work together to train a neural network model, eliminating the need for a central coordinator. Ensuring secure communication among nodes is of utmost importance for protecting data privacy during edge device training. Swarm Learning (SL) emerges as a DFL paradigm that promotes collaborative learning through peer-to-peer interaction, utilizing edge computing and blockchain technology. While SL provides a robust defense against adversarial attacks, it comes at a high computational expense. In this survey, we review the current literature in both the DFL and TinyML/DL fields, explore the obstacles encountered by resource-starved devices in this collaboration, and provide a brief overview of the potential of transitioning to Swarm Learning.

33 pages, 1638 KiB  
Article
Enhancing Communication Security in Drones Using QRNG in Frequency Hopping Spread Spectrum
by J. de Curtò, I. de Zarzà, Juan-Carlos Cano and Carlos T. Calafate
Future Internet 2024, 16(11), 412; https://doi.org/10.3390/fi16110412 - 8 Nov 2024
Abstract
This paper presents a novel approach to enhancing the security and reliability of drone communications through the integration of Quantum Random Number Generators (QRNG) in Frequency Hopping Spread Spectrum (FHSS) systems. We propose a multi-drone framework that leverages QRNG technology to generate truly random frequency hopping sequences, significantly improving resistance against jamming and interception attempts. Our method introduces a concurrent access protocol for multiple drones to share a QRNG device efficiently, incorporating robust error handling and a shared memory system for random number distribution. The implementation includes secure communication protocols, ensuring data integrity and confidentiality through encryption and Hash-based Message Authentication Code (HMAC) verification. We demonstrate the system’s effectiveness through comprehensive simulations and statistical analyses, including spectral density, frequency distribution, and autocorrelation studies of the generated frequency sequences. The results show a significant enhancement in the unpredictability and uniformity of frequency distributions compared to traditional pseudo-random number generator-based approaches. Specifically, the frequency distributions of the drones exhibited a relatively uniform spread across the available spectrum, with minimal discernible patterns in the frequency sequences, indicating high unpredictability. Autocorrelation analyses revealed a sharp peak at zero lag and a decrease to near-zero values at other lags, confirming a general absence of periodicity or predictability in the sequences, which enhances resistance to predictive attacks. Spectral analysis confirmed a relatively flat power spectral density across frequencies, characteristic of truly random sequences, thereby minimizing vulnerabilities to spectral-based jamming. Statistical tests, including Chi-squared and Kolmogorov-Smirnov, further confirm the unpredictability of the frequency sequences generated by QRNG, supporting enhanced security measures against predictive attacks. While some short-term correlations were observed, suggesting areas for improvement in QRNG technology, the overall findings confirm the potential of QRNG-based FHSS systems to significantly improve the security and reliability of drone communications. This work contributes to the growing field of quantum-enhanced wireless communications, offering substantial advancements in security and reliability for drone operations. The proposed system has potential applications in military, emergency response, and secure commercial drone operations, where enhanced communication security is paramount.
(This article belongs to the Section Internet of Things)
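The statistical checks named above (uniformity and autocorrelation) can be reproduced on any hopping sequence. In this sketch, os.urandom stands in for a QRNG as the entropy source; the channel count and sequence length are arbitrary assumptions:

```python
# Uniformity and autocorrelation checks on a frequency-hopping sequence.
import os
import numpy as np
from scipy import stats

n_channels, n_hops = 64, 10000
# Derive channel indices from raw entropy bytes (64 divides 256: no bias).
seq = np.frombuffer(os.urandom(n_hops), dtype=np.uint8) % n_channels

# Chi-squared test for uniformity of channel occupancy.
counts = np.bincount(seq, minlength=n_channels)
print(stats.chisquare(counts))

# Autocorrelation: should spike at lag 0 and hover near 0 elsewhere.
x = seq - seq.mean()
acf = np.correlate(x, x, mode="full")[n_hops - 1:] / (x.var() * n_hops)
print(acf[:5])  # acf[0] == 1.0 by construction
```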

23 pages, 1624 KiB  
Article
An Explainable Deep Learning-Enhanced IoMT Model for Effective Monitoring and Reduction of Maternal Mortality Risks
by Sherine Nagy Saleh, Mazen Nabil Elagamy, Yasmine N. M. Saleh and Radwa Ahmed Osman
Future Internet 2024, 16(11), 411; https://doi.org/10.3390/fi16110411 - 8 Nov 2024
Abstract
Maternal mortality (MM) is considered one of the major worldwide concerns. Despite the advances of artificial intelligence (AI) in healthcare, the lack of transparency in AI models leads to reluctance to adopt them. Employing explainable artificial intelligence (XAI) thus helps improve the transparency and effectiveness of AI-driven healthcare solutions. Accordingly, this article proposes a complete framework integrating an Internet of Medical Things (IoMT) architecture with an XAI-based deep learning model. The IoMT system continuously monitors pregnant women’s vital signs, while the XAI model analyzes the collected data to identify risk factors and generate actionable insights. Additionally, an efficient IoMT transmission model is developed to ensure reliable data transfer with the required system quality of service (QoS). Further analytics are performed on data collected from different regions of a country to identify high-risk cities. The experiments demonstrate the effectiveness of the proposed framework, achieving an accuracy of 80% for patient-level and 92.6% for regional risk prediction while providing interpretable explanations. The XAI-generated insights empower healthcare providers to make informed decisions and implement timely interventions. Furthermore, the IoMT transmission model ensures efficient and secure data transfer.
(This article belongs to the Special Issue eHealth and mHealth)

16 pages, 1111 KiB  
Article
Design and Evaluation of Steganographic Channels in Fifth-Generation New Radio
by Markus Walter and Jörg Keller
Future Internet 2024, 16(11), 410; https://doi.org/10.3390/fi16110410 - 6 Nov 2024
Abstract
Mobile communication is ubiquitous in everyday life. The fifth generation of mobile networks (5G) introduced 5G New Radio as a radio access technology that meets current bandwidth, quality, and application requirements. Network steganographic channels that hide secret message transfers in an innocent carrier communication are a particular threat in mobile communications, as these channels are often used for malware, ransomware, and data leakage. We systematically analyze the protocol stack of the 5G air interface for its susceptibility to network steganography, addressing both storage and timing channels. To ensure large coverage, we apply hiding patterns that collect the essential ideas used to create steganographic channels. Based on the results of this analysis, we design and implement a network covert storage channel that exploits reserved bits in the header of the Packet Data Convergence Protocol (PDCP); the covert sender and receiver are located in a 5G base station and a mobile device, respectively. Furthermore, we sketch a timing channel based on a recent overshadowing attack. We evaluate our steganographic storage channel both in simulation and in real-world experiments with respect to steganographic bandwidth, robustness, and stealthiness, and we discuss countermeasures. Our implementation demonstrates the feasibility of a covert channel in 5G New Radio and the possibility of achieving large steganographic bandwidth for broadband transmissions. We also demonstrate that detection of the channel by a network analyzer is possible, limiting its scope to application scenarios where operators are unaware or ignorant of this threat.
(This article belongs to the Special Issue 5G Security: Challenges, Opportunities, and the Road Ahead)
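The storage-channel idea, writing covert bits into header bits that receivers ignore, can be shown with plain bit arithmetic. The 8-bit header layout and the choice of bit positions below are invented for illustration and do not match the actual PDCP header format:

```python
# Toy covert storage channel: 2 covert bits per carrier header.
def embed(header: int, covert_bits: int) -> int:
    """Place 2 covert bits into bit positions 5-6, assumed reserved."""
    return (header & 0b10011111) | ((covert_bits & 0b11) << 5)

def extract(header: int) -> int:
    return (header >> 5) & 0b11

message = 0b1001_1100_0110  # 12 covert bits -> six 2-bit chunks
chunks = [(message >> s) & 0b11 for s in range(10, -1, -2)]
carriers = [embed(0b0000_0001, c) for c in chunks]  # innocent-looking headers

recovered = 0
for h in carriers:
    recovered = (recovered << 2) | extract(h)
assert recovered == message  # receiver reassembles the covert payload
```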

30 pages, 7823 KiB  
Article
Real-Time Evaluation of the Improved Eagle Strategy Model in the Internet of Things
by Venushini Rajendran and R Kanesaraj Ramasamy
Future Internet 2024, 16(11), 409; https://doi.org/10.3390/fi16110409 - 6 Nov 2024
Abstract
With the rapid expansion of cloud computing and the pervasive growth of IoT across industries and educational sectors, the need for efficient remote data management and service orchestration has become paramount. Web services, facilitated by APIs, offer a modular approach to integrating and streamlining complex business processes. However, real-time monitoring and optimal service selection within large-scale, cloud-based repositories remain significant challenges. This study introduces the novel Improved Eagle Strategy (IES) hybrid model, which uniquely integrates bio-inspired optimization with clustering techniques to drastically reduce computation time while ensuring highly accurate service selection tailored to specific user requirements. Through comprehensive NetLogo simulations, the IES model demonstrates superior efficiency in service selection compared to existing methodologies. Additionally, the IES model’s application through a web dashboard system highlights its capability to manage both functional and non-functional service attributes effectively. When deployed on real-time IoT devices, the IES model not only enhances computation speed but also ensures a more responsive and user-centric service environment. This research underscores the transformative potential of the IES model, marking a significant advancement in optimizing cloud computing processes, particularly within the IoT ecosystem.

26 pages, 1452 KiB  
Article
Machine Learning-Based Resource Allocation Algorithm to Mitigate Interference in D2D-Enabled Cellular Networks
by Md Kamruzzaman, Nurul I. Sarkar and Jairo Gutierrez
Future Internet 2024, 16(11), 408; https://doi.org/10.3390/fi16110408 - 6 Nov 2024
Abstract
Mobile communications have experienced exponential growth in both connectivity and multimedia traffic in recent years. To support this tremendous growth, device-to-device (D2D) communications play a significant role in 5G and beyond-5G networks. However, enabling D2D communications in an underlay, heterogeneous cellular network poses two major challenges. First, interference management between D2D and cellular users directly affects a system’s performance. Second, achieving an acceptable level of link quality for both D2D and cellular networks is necessary. Optimal resource allocation is required to mitigate the interference and improve a system’s performance. In this paper, we provide a solution to interference management with an acceptable quality of service (QoS). To this end, we propose a machine learning-based resource allocation method to maximize throughput and achieve minimum QoS requirements for all active D2D pairs and cellular users. We first formulate a resource optimization problem that allocates spectrum resources and controls power transmission on demand. As this is an integer nonlinear programming problem, we address it by proposing a deep Q-network-based reinforcement learning (DRL) algorithm to optimize resource allocation. The proposed DRL algorithm is trained with a decision-making policy to obtain the best solution in terms of spectrum efficiency, computational time, and throughput. The system performance is validated by simulation, and the results show that the proposed method outperforms existing ones.
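Underlying such a DRL method is the Q-learning update, which a deep Q-network approximates with a neural network instead of a table. A tabular sketch with placeholder dynamics; the states, actions, and rewards here merely stand in for interference levels, spectrum/power choices, and throughput:

```python
# Tabular Q-learning with epsilon-greedy exploration on a toy environment.
import numpy as np

n_states, n_actions = 10, 4        # e.g. interference levels x RB/power choices
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def step(s, a):                    # stand-in environment dynamics
    return rng.integers(n_states), rng.normal(loc=a)  # next state, reward

s = 0
for _ in range(5000):
    a = rng.integers(n_actions) if rng.random() < eps else Q[s].argmax()
    s2, r = step(s, a)
    # Bellman update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
    Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
    s = s2

print(Q.argmax(axis=1))  # greedy allocation policy per state
```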

30 pages, 3027 KiB  
Article
Privacy-Preserving Data Analytics in Internet of Medical Things
by Bakhtawar Mudassar, Shahzaib Tahir, Fawad Khan, Syed Aziz Shah, Syed Ikram Shah and Qammer Hussain Abbasi
Future Internet 2024, 16(11), 407; https://doi.org/10.3390/fi16110407 - 5 Nov 2024
Viewed by 1497
Abstract
The healthcare sector has changed dramatically in recent years as it depends more and more on big data to improve patient care, enhance operational effectiveness, and advance medical research. Protecting patient privacy in the era of digital health records is a major challenge, as privacy can leak during the collection of patient data. To overcome this issue, we propose a secure, privacy-preserving scheme for healthcare data that ensures maximum privacy for individuals while maintaining data utility and allowing queries over sensitive attributes under differential privacy. We implemented differential privacy on two publicly available healthcare datasets, the Breast Cancer Prediction Dataset and the Nursing Home COVID-19 Dataset. Moreover, we examined the impact of varying the privacy parameter (ε) on both the privacy and utility of the data; a significant part of this study involved selecting ε, which determines the degree of privacy protection. We also compared computational time by running multiple complex queries on these datasets to analyse the overhead introduced by differential privacy. The outcomes demonstrate that, despite a slight increase in query processing time, the overhead remains within reasonable bounds, ensuring the practicality of differential privacy for real-time applications. Full article
(This article belongs to the Special Issue Privacy and Security Issues with Edge Learning in IoT Systems)
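The Laplace mechanism is the standard way to answer count queries under ε-differential privacy, and it illustrates the privacy/utility trade-off the study explores by varying ε. The paper does not publish its implementation; this is a minimal, generic sketch with an invented count.

```python
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Release a count query under epsilon-differential privacy.

    A counting query changes by at most 1 when one record is added or
    removed, so its sensitivity is 1; the noise scale is sensitivity / epsilon.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Smaller epsilon -> stronger privacy, noisier answers: the trade-off
# the study examines when selecting epsilon.
true_count = 4212  # e.g., records matching a sensitive attribute (invented)
for eps in (0.1, 0.5, 1.0):
    print(eps, round(laplace_count(true_count, eps), 1))
```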
30 pages, 10493 KiB  
Article
Visualisation Design Ideation with AI: A New Framework, Vocabulary, and Tool
by Aron E. Owen and Jonathan C. Roberts
Future Internet 2024, 16(11), 406; https://doi.org/10.3390/fi16110406 - 5 Nov 2024
Viewed by 1748
Abstract
This paper introduces an innovative framework for visualisation design ideation, which includes a collection of terms for creative visualisation design, a five-step process, and an implementation called VisAlchemy. Throughout the visualisation ideation process, individuals explore various concepts, brainstorm, sketch ideas, prototype, and experiment with different methods to visually represent data or information. Sometimes designers feel incapable of sketching, and the ideation process can be quite lengthy; in such cases, generative AI can assist. However, even with AI, it can be difficult to know which vocabulary to use and how to approach the design process strategically. Our strategy prompts imaginative and structured narratives for generative AI use, facilitating the generation and refinement of visualisation design ideas. We aim to inspire fresh and innovative ideas, encouraging creativity and the exploration of unconventional concepts. VisAlchemy is a five-step framework: a methodical approach to defining, exploring, and refining prompts to enhance the generative AI process. The framework blends design elements and aesthetics with context and application. In addition, we present a vocabulary set of 300 words, drawn from a corpus of visualisation design and art papers, along with a demonstration tool called VisAlchemy. The tool's interactive interface allows users to follow the framework and generate innovative visualisation design concepts; it is built on the SDXL Turbo text-to-image model. Finally, we demonstrate its use through case studies and examples, showing the transformative power of the framework to create inspired and exciting design ideas through refinement, re-ordering, weighting, and rephrasing of words. Full article
(This article belongs to the Special Issue Human-Centered Artificial Intelligence)
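The weighted-vocabulary prompting idea can be sketched generically. The term weights and the (term:weight) syntax below are illustrative assumptions (that weight syntax is used by several Stable Diffusion front ends), not VisAlchemy's actual interface, and the vocabulary terms are invented.

```python
# Hypothetical sketch of prompt refinement with a weighted design vocabulary.
vocabulary = {"organic": 1.2, "radial": 1.0, "layered": 0.8, "monochrome": 1.1}

def build_prompt(subject, terms):
    """Compose a weighted prompt string in the (term:weight) style used by
    several Stable Diffusion front ends; actual syntax varies by tool."""
    weighted = ", ".join(
        f"({t}:{w})" for t, w in sorted(terms.items(), key=lambda kv: -kv[1])
    )
    return f"data visualisation of {subject}, {weighted}"

print(build_prompt("city air quality over a year", vocabulary))

# Refinement loop: re-weight or re-order terms and regenerate, echoing the
# framework's emphasis on weighting and rephrasing words.
vocabulary["radial"] = 1.4
print(build_prompt("city air quality over a year", vocabulary))
```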
18 pages, 3552 KiB  
Article
A Secure Auditable Remote Registry Pattern for IoT Systems
by Antonio Maña, Francisco J. Jaime and Lucía Gutiérrez
Future Internet 2024, 16(11), 405; https://doi.org/10.3390/fi16110405 - 4 Nov 2024
Viewed by 383
Abstract
In software engineering, pattern papers describe a generalized, reusable solution to recurring design problems, based on practical experience and established best practices. This paper presents an architectural pattern for a Secure Auditable Registry service based on Message-Oriented Middleware, intended for large-scale IoT systems that must provide auditing capabilities to external entities. Direct experience in applying the pattern's solution in an industry-funded R&D project was a key aspect of preparing it: this experience gave us a deep understanding of the problem and the solution, and it contributed to the correctness and real-world applicability of the pattern as described. To further improve the quality of the paper, we followed commonly accepted practices in pattern development (including peer reviews) to ensure that the core aspects of the solution are correctly represented and that the description makes it applicable to similar problems in other domains, such as healthcare, autonomous devices, banking, food tracing, or manufacturing, to name a few. The work done in applying this pattern confirms that it solves a recurring problem for IoT systems and that it can be adopted in other domains, providing an effective way to enhance the auditability of target systems. This pattern will be part of a pattern language (i.e., a family of related patterns) that we are developing for transitioning from legacy systems to IoT with an emphasis on security. Full article
(This article belongs to the Special Issue Cybersecurity in the IoT)
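One common way to make a remote registry auditable is a hash-chained, append-only log that external auditors can verify independently. The sketch below shows only that core idea under our own assumptions; the pattern itself also covers Message-Oriented Middleware delivery, which is omitted here.

```python
import hashlib
import json
import time

class AuditableRegistry:
    """Tamper-evident registry: each entry's hash chains to the previous one,
    so an external auditor can detect any alteration of the log."""

    def __init__(self):
        self.entries = []
        self.last_hash = "0" * 64  # genesis value

    def append(self, event: dict):
        record = {"ts": time.time(), "event": event, "prev": self.last_hash}
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append((record, digest))
        self.last_hash = digest

    def verify(self) -> bool:
        """Recompute the chain; any edited, dropped, or reordered entry fails."""
        prev = "0" * 64
        for record, digest in self.entries:
            recomputed = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()
            ).hexdigest()
            if record["prev"] != prev or recomputed != digest:
                return False
            prev = digest
        return True

reg = AuditableRegistry()
reg.append({"device": "sensor-17", "action": "reading", "value": 21.4})
reg.append({"device": "sensor-17", "action": "calibrated"})
print(reg.verify())  # True while the log is intact
```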
35 pages, 4745 KiB  
Article
6G Use Cases and Scenarios: A Comparison Analysis Between ITU and Other Initiatives
by Alessandro Vizzarri and Franco Mazzenga
Future Internet 2024, 16(11), 404; https://doi.org/10.3390/fi16110404 - 1 Nov 2024
Viewed by 1651
Abstract
In the next decade, the amount of network traffic is estimated to reach zettabytes. The future International Mobile Telecommunications-2030 (IMT-2030) standard for mobile networks, known as 6G, introduces an important paradigm shift in wireless communication systems thanks to capabilities such as low latency and high data rates. Official documents on 6G standardization have been released by the International Telecommunication Union (ITU). However, other visions and use cases for 6G have been proposed by industrial stakeholders and research institutions, generating a multitude of use cases and usage scenarios that are only apparently different from each other. This paper contributes a holistic vision of 6G-enabled use cases and the vertical market sectors they may impact. The differences and similarities between what ITU and other initiatives have proposed are identified through a comparison based on the technological characterization of the use cases and the vertical market sectors considered. The main findings demonstrate that many of the use cases proposed by ITU and by the other initiatives are almost identical. Full article
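A simple set-overlap measure conveys the flavour of such a comparison. In the sketch below, the ITU labels follow the six IMT-2030 usage scenarios, while the "other initiative" set and the use of Jaccard similarity are our own illustrative choices, not the paper's methodology.

```python
# ITU IMT-2030 usage scenarios vs. an invented initiative's use-case labels.
itu = {"immersive_comm", "massive_comm", "hrllc",
       "ai_and_comm", "isac", "ubiquitous_connectivity"}
other = {"immersive_comm", "digital_twin", "ai_and_comm",
         "isac", "holographic_telepresence"}

def jaccard(a: set, b: set) -> float:
    """Share of capabilities that two use-case catalogues have in common."""
    return len(a & b) / len(a | b)

print(f"overlap: {jaccard(itu, other):.2f}")
print("common:", sorted(itu & other))
print("unique to ITU:", sorted(itu - other))
```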
23 pages, 1060 KiB  
Article
Uncertainty-Aware Time Series Anomaly Detection
by Paul Wiessner, Grigor Bezirganyan, Sana Sellami, Richard Chbeir and Hans-Joachim Bungartz
Future Internet 2024, 16(11), 403; https://doi.org/10.3390/fi16110403 - 31 Oct 2024
Viewed by 1796
Abstract
Traditional anomaly detection methods for time series data often struggle with inherent uncertainties such as noise and missing values. Indeed, current approaches mostly focus on quantifying epistemic uncertainty and ignore data-dependent uncertainty. However, accounting for noise in the data is important, as it can lead to more robust detection of anomalies and a better ability to distinguish real anomalies from anomalous patterns provoked by noise. In this paper, we propose LSTMAE-UQ (Long Short-Term Memory Autoencoder with Aleatoric and Epistemic Uncertainty Quantification), a novel approach that incorporates both aleatoric (data noise) and epistemic (model) uncertainties for more robust anomaly detection. The model combines the strengths of LSTM networks for capturing complex time series relationships with autoencoders for unsupervised anomaly detection, and it quantifies uncertainties using Monte Carlo (MC) Dropout, a Bayesian posterior approximation method, enabling a deeper understanding of noise recognition. Our experimental results across different real-world datasets show that considering uncertainty effectively increases robustness to noise and point outliers, making predictions more reliable for longer periodic sequential data. Full article
(This article belongs to the Special Issue Industrial Internet of Things (IIoT): Trends and Technologies)
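MC Dropout is the part of LSTMAE-UQ that is easiest to sketch: keeping dropout active at inference time turns repeated forward passes into samples from an approximate posterior. The PyTorch sketch below is our own minimal rendering of that idea with an invented architecture; it covers only the epistemic side (the paper additionally models aleatoric noise).

```python
import torch
import torch.nn as nn

class LSTMAE(nn.Module):
    """Toy LSTM autoencoder (not the paper's exact architecture)."""

    def __init__(self, n_features=1, hidden=32, p=0.2):
        super().__init__()
        self.enc = nn.LSTM(n_features, hidden, batch_first=True)
        self.drop = nn.Dropout(p)
        self.dec = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):
        _, (h, _) = self.enc(x)                      # summarize the window
        z = self.drop(h[-1]).unsqueeze(1).repeat(1, x.size(1), 1)
        y, _ = self.dec(z)
        return self.out(y)                           # reconstructed window

def mc_dropout_score(model, x, n_samples=30):
    """Mean reconstruction error and its spread across stochastic passes."""
    model.train()                                    # keep dropout active
    with torch.no_grad():
        errs = torch.stack(
            [((model(x) - x) ** 2).mean() for _ in range(n_samples)]
        )
    # High mean -> poor reconstruction; high std -> the model is uncertain.
    return errs.mean().item(), errs.std().item()

model = LSTMAE()
window = torch.randn(1, 50, 1)                       # one 50-step series
print(mc_dropout_score(model, window))
```

Scoring on both the mean and the spread is what lets such a detector separate genuine anomalies (high error, low uncertainty) from noise-induced ones (high uncertainty).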
24 pages, 9406 KiB  
Article
Lightweight Digit Recognition in Smart Metering System Using Narrowband Internet of Things and Federated Learning
by Vladimir Nikić, Dušan Bortnik, Milan Lukić, Dejan Vukobratović and Ivan Mezei
Future Internet 2024, 16(11), 402; https://doi.org/10.3390/fi16110402 - 31 Oct 2024
Viewed by 1932
Abstract
Replacing mechanical utility meters with digital ones is crucial due to the numerous benefits they offer: increased time resolution in measuring consumption, remote monitoring for operational efficiency, real-time data for informed decision-making, support for time-of-use billing, and integration with smart grids. Together, these lead to enhanced customer service, reduced energy waste, and progress towards environmental sustainability goals. However, the cost of replacing mechanical meters with their digital counterparts is a key factor in the relatively slow roll-out of such devices. In this paper, we present a low-cost, power-efficient solution for retrofitting the existing metering infrastructure, based on state-of-the-art communication and artificial intelligence technologies. The edge device we developed contains a camera for capturing images of a dial meter, a 32-bit microcontroller capable of running the digit recognition algorithm, and an NB-IoT module with (E)GPRS fallback, which enables nearly ubiquitous connectivity even in difficult radio conditions. Our digit recognition methodology, based on on-device training and inference augmented with federated learning, achieves high accuracy (97.01%) while minimizing energy consumption and the associated communication overhead (87 μWh per day on average). Full article
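Federated averaging (FedAvg) is the usual aggregation rule in such a setup: meters train locally on their own dial images and upload only model weights over NB-IoT. The sketch below shows a size-weighted average of per-client parameters; the parameter shapes and the weighting by local sample counts are our assumptions, not necessarily the paper's exact protocol.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Size-weighted average of per-client parameter lists (FedAvg)."""
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Three simulated meters, each holding two parameter tensors
# (shapes invented for the example, e.g., a conv kernel and a dense layer).
rng = np.random.default_rng(42)
clients = [[rng.normal(size=(3, 3)), rng.normal(size=(10,))] for _ in range(3)]
sizes = [120, 80, 200]                   # local training images per meter

global_model = fedavg(clients, sizes)
print(global_model[0].shape, global_model[1].shape)  # (3, 3) (10,)
```

Shipping a few kilobytes of weights per round instead of raw images is what keeps the reported communication overhead so low on a narrowband link.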