1. Introduction
Over the last few years, the number of interconnected devices within the context of the Internet of Things (IoT) has grown rapidly; some statistics indicate that the total number of IoT-connected devices reached the remarkable figure of 17 billion in 2023. While this number may appear large, it is still modest when compared with the total number of people on Earth and is thus expected to rise significantly in the next few years. Several new application sectors are expected to emerge alongside the rapid increase in the performance of all the technological segments of IoT infrastructures.
Embedded systems in particular have experienced rapid growth in their computation capabilities: the clock frequency of microcontrollers and microprocessors is almost comparable with that of common desktop computers, while their cost remains low and their power consumption continues to decrease. This goes hand in hand with the availability of IoT-oriented data transmission technologies, which are being extensively exploited to move several data processing tasks from the cloud towards the edge of the network. Alongside the technologies traditionally employed for data transmission, like WiFi and Bluetooth, new ones have been developed that focus specifically on the requirements of the IoT: the transmission of small quantities of data (a few kB or even less) at reduced power consumption and negligible cost.
While the availability of low-cost hardware platforms enables the implementation of artificial intelligence (AI) algorithms at the edge, on the cloud, and spread throughout the network, these data transmission technologies allow a capillary data exchange among IoT nodes, as well as the distribution of the computation load according to the fog computing paradigm. Such architectural flexibility is opening the way to a plethora of applications that fully exploit the computing power available at all levels. The definition of the ideal computing model is no longer linked only to the computational performance of individual processing units, but also to the best trade-off between this performance and other requirements, such as reduced cost and power consumption, easy system deployment, the time required for computation results to enable prompt actuation, and so on.
2. Contributions
The four contributions presented in this Special Issue tackle different aspects of distributed computing architectures within the IoT domain. While some of them focus on specific technical aspects, others are devoted to describing novel applications and innovative ways of addressing specific problems.
This is the case of the first paper [1], which focuses specifically on the field of hydraulic power units. The scope of this paper is to describe a complete architecture, from the sensors used to acquire a large set of parameters, through the edge computing tools, up to the cloud infrastructure. Following the description of the system architecture, anomaly detection algorithms are implemented, in this case on the cloud: the acquired data are processed using two different techniques, with the aim of detecting anomalies in the working parameters of the monitored hydraulic power unit.
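By way of illustration only, the short sketch below shows what cloud-side anomaly detection on such working parameters can look like. The estimator (scikit-learn's IsolationForest) and the simulated pressure, temperature, and current values are assumptions made for this example; they are not the techniques or the data used in the paper.

```python
# Illustrative sketch of cloud-side anomaly detection on acquired sensor data.
# The detector (IsolationForest) and all numbers are assumptions for this
# example only; they are not taken from the paper summarized above.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated working parameters of a hydraulic power unit:
# columns = [pressure (bar), oil temperature (degC), motor current (A)]
normal = rng.normal(loc=[180.0, 55.0, 12.0], scale=[5.0, 2.0, 0.5], size=(500, 3))
faulty = rng.normal(loc=[150.0, 75.0, 18.0], scale=[5.0, 2.0, 0.5], size=(10, 3))
samples = np.vstack([normal, faulty])

# Fit an unsupervised detector on the aggregated data and flag outliers (-1).
detector = IsolationForest(contamination=0.02, random_state=0).fit(samples)
labels = detector.predict(samples)
print(f"flagged {np.sum(labels == -1)} anomalous samples out of {len(samples)}")
```

Any other unsupervised detector could be dropped in at the same point of the pipeline; the essential idea is that the cloud receives the aggregated sensor streams and flags samples that deviate from the learned normal operating region.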
The second paper [2] does not focus on a specific application, but discusses ways to improve the Quality of Experience (QoE) of video streaming in the presence of a large number of users. In particular, the paper analyses Content Delivery Networks (CDNs), whose main purpose is to provide users with services like audio or video streaming in real time. In order to improve the efficiency of CDNs, the authors propose the use of multiagent reinforcement learning, with the aim of solving the problem of multiple edge servers cooperatively serving a large number of users.
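To give a flavour of the multiagent setting, the toy sketch below lets independent learners at three hypothetical edge servers pick bitrate levels under a shared backhaul capacity, using a simple bandit-style update. It is not the attention-based method proposed in the paper; all numbers and names are invented for illustration.

```python
# Toy multiagent sketch: independent learners choosing bitrates at the edge.
# This is NOT the attention-based method of the paper; it only illustrates the
# general idea of edge servers adapting to a shared QoE reward.
import random

random.seed(0)

N_AGENTS = 3          # hypothetical edge servers
ACTIONS = [1, 2, 3]   # bitrate levels (Mbps) each server can offer
CAPACITY = 6          # shared backhaul capacity (Mbps)
ALPHA, EPS = 0.1, 0.1

# One value table per agent; the state is kept trivial, so the update is
# bandit-style rather than a full Q-learning rule.
q_tables = [{a: 0.0 for a in ACTIONS} for _ in range(N_AGENTS)]

def reward(actions):
    """QoE proxy: higher total bitrate is good, exceeding capacity is penalized."""
    total = sum(actions)
    over = max(0, total - CAPACITY)
    return total - 5 * over

for episode in range(5000):
    # Epsilon-greedy action selection for every agent.
    actions = [
        random.choice(ACTIONS) if random.random() < EPS else max(q, key=q.get)
        for q in q_tables
    ]
    r = reward(actions)
    # Independent learners: each agent updates only its own chosen action.
    for q, a in zip(q_tables, actions):
        q[a] += ALPHA * (r - q[a])

print("learned bitrate per edge server:", [max(q, key=q.get) for q in q_tables])
```

The actual solution in the paper is attention-based and explicitly cooperative, rather than the purely independent learners shown here.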
The third paper [3] focuses on an interesting aspect of distributed computing, namely network forensics in the context of edge and fog computing. While the theme of network forensics, i.e., the collection, analysis, and interpretation of network-related data from a cybersecurity perspective, has been extensively studied, the context of fog and edge environments poses a number of entirely new challenges. The paper analyses these challenges, providing valuable suggestions on open problems as well as promising future research directions.
Finally, the fourth paper [4] focuses on crucial architectural aspects like resource allocation, exploitation, and management in the Cloud Continuum, a term that indicates the capillary network interconnecting the edge and the cloud and enabling dynamic and adaptive services. In this context, the paper analyses a number of techniques in the domains of reinforcement learning and computational intelligence, with the aim of solving a service management problem, namely the optimization of the services allocated in the Cloud Continuum itself.
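As a minimal intuition for this kind of service management problem, the sketch below frames service placement across hypothetical edge and cloud nodes as a cost minimization and solves it with hill climbing with random restarts. This heuristic is not one of the approaches compared in the paper; node names, capacities, and latencies are invented for illustration.

```python
# Illustrative service-placement sketch for a tiny edge-cloud continuum.
# The heuristic (random-restart hill climbing) and all parameters are
# assumptions for this example, not the paper's algorithms or data.
import random

random.seed(0)

NODES = ["edge-0", "edge-1", "cloud"]
CAPACITY = {"edge-0": 2, "edge-1": 2, "cloud": 10}   # max services per node
LATENCY = {"edge-0": 5, "edge-1": 5, "cloud": 40}    # ms to reach end users
N_SERVICES = 6

def cost(assignment):
    """Total user-facing latency plus a heavy penalty for overloaded nodes."""
    load = {n: 0 for n in NODES}
    for node in assignment:
        load[node] += 1
    penalty = sum(max(0, load[n] - CAPACITY[n]) * 1000 for n in NODES)
    return sum(LATENCY[n] for n in assignment) + penalty

best = None
for _ in range(20):                            # random restarts
    current = [random.choice(NODES) for _ in range(N_SERVICES)]
    for _ in range(200):                       # local search: mutate one service
        candidate = list(current)
        candidate[random.randrange(N_SERVICES)] = random.choice(NODES)
        if cost(candidate) <= cost(current):
            current = candidate
    if best is None or cost(current) < cost(best):
        best = current

print("placement:", best, "| cost:", cost(best))
```

The reinforcement learning and computational intelligence techniques analysed in the paper address the same kind of optimization, but under dynamic workloads rather than the static snapshot used here.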
Funding
This research received no external funding.
Acknowledgments
The Editor wishes to thank all the authors who, with their knowledge and research outcomes, contributed to this Special Issue. Additional thanks to the Reviewers who, through their crucial work, allowed the improvement and publication of high-profile contributions.
Conflicts of Interest
The author declares no conflicts of interest.
References
1. Fic, P.; Czornik, A.; Rosikowski, P. Anomaly Detection for Hydraulic Power Units—A Case Study. Future Internet 2023, 15, 206.
2. Tang, X.; Chen, F.; He, Y. Intelligent Video Streaming at Network Edge: An Attention-Based Multiagent Reinforcement Learning Solution. Future Internet 2023, 15, 234.
3. Spiekermann, D.; Keller, J. Challenges of Network Forensic Investigation in Fog and Edge Computing. Future Internet 2023, 15, 342.
4. Poltronieri, F.; Stefanelli, C.; Tortonesi, M.; Zaccarini, M. Reinforcement Learning vs. Computational Intelligence: Comparing Service Management Approaches for the Cloud Continuum. Future Internet 2023, 15, 359.