Telecom, Volume 4, Issue 2 (June 2023) – 8 articles

Cover Story: A multi-objective optimization of a machine learning algorithm has been developed and used to reduce the carbon footprint of wireless telecommunication. Various hyperparameter combinations, including layers, neurons, optimizer algorithm, batch size, and dropout, were assessed to evaluate energy consumption and performance. In a practical case study of the training phase of a Recurrent Neural Network (RNN) algorithm, our analysis showed the importance of selecting optimal hyperparameters to achieve a balance between minimizing energy consumption and maximizing machine learning performance.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF form. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
35 pages, 3392 KiB  
Article
AI-Assisted Multi-Operator RAN Sharing for Energy-Efficient Networks
by Saivenkata Krishna Gowtam Peesapati, Magnus Olsson, Sören Andersson, Christer Qvarfordt and Anders Dahlen
Telecom 2023, 4(2), 334-368; https://doi.org/10.3390/telecom4020020 - 19 Jun 2023
Cited by 1 | Viewed by 2707
Abstract
Recent times have seen a significant rise in interest from mobile operators, vendors, and research projects toward achieving more energy-efficient and sustainable networks. Not surprisingly, this comes at a time when higher traffic demand and more stringent and diverse network requirements result in diminishing benefits for operators using complex AI-driven network optimization solutions. In this paper, we propose the idea of tower companies that facilitate radio access network (RAN) infrastructure sharing between operators and evaluate the additional energy savings obtained in this process. In particular, we focus on the RAN-as-a-Service (RANaaS) implementation, wherein each operator leases and controls an independent logical RAN instance running on the shared infrastructure. We show how an AI system can assist operators in optimizing their share of resources under multiple constraints. This paper aims to provide a vision, a quantitative and qualitative analysis of the RANaaS paradigm, and its benefits in terms of energy efficiency. Through simulations, we show the possibility of achieving up to 75 percent energy savings per operator over 24 h compared to the scenario where none of the energy-saving features are activated. This is an additional 55 percent energy savings from sharing the RAN infrastructure compared to the baseline scenario in which the operators use independent hardware.

21 pages, 9149 KiB  
Article
Modeling and Evaluation of a Dynamic Channel Selection Framework for Multi-Channel Operation in ITS-G5
by Ottó Váczi and László Bokor
Telecom 2023, 4(2), 313-333; https://doi.org/10.3390/telecom4020019 - 19 Jun 2023
Viewed by 1722
Abstract
Many years ago, it seemed inconceivable that our cars could drive autonomously or communicate with each other to form a self-organizing convoy or platoon. By 2023, however, technological advances have taken us to the point where most of these goals are within reach. In the initial deployment phase known as Day 1, single-channel Intelligent Transport System (ITS) devices fully met the requirements for safe communication. The trends show that with the rapid development and emergence of new, more robust Vehicle-to-Everything (V2X) applications that require higher bandwidth (collectively called Day 2), the current single-channel medium access method in the available ITS bands will no longer achieve the desired capacities. The main reason is that Day 2 and beyond V2X information dissemination protocols introduce increasing packet sizes and sending frequencies. To achieve resource-friendly and more efficient operation with Day 2 or other advanced V2X services, ITS standards present the Multi-Channel Operation (MCO) constellation as a potential solution. In the case of MCO, two or more channels are used simultaneously, thus preventing the radio medium from saturating its capacity. However, there are still several pending questions about MCO applicability, practical usage, configuration, and deployment, to name a few. The primary purpose of this article is to present the design and implementation of a dynamic channel selection framework capable of modeling and simulating advanced multi-channel communication use cases. We used this framework to investigate Channel Busy Ratio (CBR)-based dynamic channel switching within the Artery/OMNeT++ simulation environment.
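The CBR-based switching described in the abstract can be illustrated as a simple hysteresis rule: leave the current channel only when it is saturated and a clearly less busy alternative exists. This is a hypothetical sketch, not the authors' framework; the threshold, margin, and channel names are assumptions.

```python
# Toy sketch of CBR-based dynamic channel selection (illustrative;
# thresholds and the channel set are assumptions, not from the paper).

def select_channel(current, cbr_by_channel, switch_threshold=0.6, margin=0.1):
    """Switch away from `current` only if its Channel Busy Ratio exceeds
    the threshold AND another channel is less busy by at least `margin`
    (the hysteresis margin avoids flip-flopping between similar channels)."""
    if cbr_by_channel[current] <= switch_threshold:
        return current  # current channel is not saturated
    best = min(cbr_by_channel, key=cbr_by_channel.get)
    if cbr_by_channel[current] - cbr_by_channel[best] >= margin:
        return best
    return current

# Example: control channel saturated, SCH1 lightly loaded -> move to SCH1.
cbr = {"CCH": 0.72, "SCH1": 0.25, "SCH2": 0.55}
chosen = select_channel("CCH", cbr)
```

In a simulator such as Artery/OMNeT++, a rule like this would be re-evaluated periodically from measured per-channel CBR samples rather than a static dictionary.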

15 pages, 829 KiB  
Article
Benchmarking Message Queues
by Rokin Maharjan, Md Showkat Hossain Chy, Muhammad Ashfakur Arju and Tomas Cerny
Telecom 2023, 4(2), 298-312; https://doi.org/10.3390/telecom4020018 - 13 Jun 2023
Cited by 2 | Viewed by 4999
Abstract
Message queues are a way for different software components or applications to communicate with each other asynchronously by passing messages through a shared buffer. This allows a sender to send a message without needing to wait for an immediate response from the receiver, which can help to improve the system’s performance, reduce latency, and allow components to operate independently. In this paper, we compared and evaluated the performance of four popular message queues: Redis, ActiveMQ Artemis, RabbitMQ, and Apache Kafka. The aim of this study was to provide insights into the strengths and weaknesses of each technology and to help practitioners choose the most appropriate solution for their use case. We primarily evaluated each technology in terms of latency and throughput. Our experiments were conducted using a diverse array of workloads to test the message queues under various scenarios. This enables practitioners to evaluate the performance of the systems and choose the one that best meets their needs. The results show that each technology has its own pros and cons. Specifically, Redis performed the best in terms of latency, whereas Kafka significantly outperformed the other three technologies in terms of throughput. The optimal choice depends on the specific requirements of the use case. This paper presents valuable insights for practitioners and researchers working with message queues. Furthermore, the results of our experiments are provided in JSON format as a supplement to this paper.
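The core idea — a sender enqueues into a shared buffer and continues without waiting for the receiver — can be sketched with Python's standard-library queue. This is an in-process illustration only; the paper benchmarks the networked brokers Redis, ActiveMQ Artemis, RabbitMQ, and Apache Kafka.

```python
# Minimal sketch of asynchronous producer/consumer messaging through a
# shared buffer (illustrative; not one of the benchmarked systems).
import queue
import threading

buf = queue.Queue(maxsize=100)  # the shared buffer
results = []

def producer(n):
    # The sender enqueues and moves on; it never waits for the
    # receiver to process any individual message.
    for i in range(n):
        buf.put(f"msg-{i}")
    buf.put(None)  # sentinel: no more messages

def consumer():
    while True:
        msg = buf.get()
        if msg is None:
            break
        results.append(msg)

t = threading.Thread(target=consumer)
t.start()
producer(5)
t.join()
```

A latency benchmark like the paper's would timestamp each message at `put` and at `get` and report the distribution of the differences; a throughput benchmark would count messages delivered per unit time.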

19 pages, 1490 KiB  
Article
Phishing Detection in Blockchain Transaction Networks Using Ensemble Learning
by Roseline Oluwaseun Ogundokun, Micheal Olaolu Arowolo, Robertas Damaševičius and Sanjay Misra
Telecom 2023, 4(2), 279-297; https://doi.org/10.3390/telecom4020017 - 31 May 2023
Cited by 7 | Viewed by 4324
Abstract
The recent progress in blockchain and wireless communication infrastructures has paved the way for creating blockchain-based systems that protect data integrity and enable secure information sharing. Despite these advancements, concerns regarding security and privacy continue to impede the widespread adoption of blockchain technology, especially when sharing sensitive data. Specific security attacks against blockchains, such as data poisoning attacks, privacy leaks, and a single point of failure, must be addressed to develop efficient blockchain-supported IT infrastructures. This study proposes the use of deep learning methods, including Long Short-Term Memory (LSTM), Bi-directional LSTM (Bi-LSTM), and convolutional neural network LSTM (CNN-LSTM), to detect phishing attacks in a blockchain transaction network. These methods were evaluated on a dataset comprising malicious and benign addresses from the Ethereum blockchain dark list and whitelist dataset, and the results showed an accuracy of 99.72%.
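One common way to combine several detectors like those named in the title's ensemble setting is a majority vote over per-model verdicts. This is a generic illustration, not the paper's method; the fixed prediction lists stand in for the LSTM, Bi-LSTM, and CNN-LSTM outputs.

```python
# Sketch of majority-vote ensembling of per-model phishing verdicts
# (illustrative; the base models here are hypothetical stand-ins).

def majority_vote(*prediction_lists):
    """Label each address phishing (1) or benign (0) by the majority
    verdict across models for that address."""
    return [1 if sum(votes) * 2 > len(votes) else 0
            for votes in zip(*prediction_lists)]

# Hypothetical per-address verdicts from three models:
lstm     = [1, 0, 1, 1, 0]
bi_lstm  = [1, 0, 0, 1, 0]
cnn_lstm = [1, 1, 1, 1, 0]
combined = majority_vote(lstm, bi_lstm, cnn_lstm)  # [1, 0, 1, 1, 0]
```

With an odd number of models the vote is never tied, which is one practical reason ensembles of three base learners are common.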

14 pages, 2877 KiB  
Article
Digital Twins: Enabling Interoperability in Smart Manufacturing Networks
by Eoin O’Connell, William O’Brien, Mangolika Bhattacharya, Denis Moore and Mihai Penica
Telecom 2023, 4(2), 265-278; https://doi.org/10.3390/telecom4020016 - 11 May 2023
Cited by 25 | Viewed by 5470
Abstract
As Industry 4.0 networks continue to evolve at a rapid pace, they are becoming increasingly complex and distributed. These networks incorporate a range of technologies that are integrated into smart manufacturing systems, requiring adaptability, security, and resilience. However, managing the complexity of Industry 4.0 networks presents significant challenges, particularly in terms of security and the integration of diverse technologies into a functioning and efficient infrastructure. To address these challenges, emerging digital twin standards are enabling the connection of various systems by linking individual digital twins, creating a system of systems. The objective is to develop a “universal translator” that can interpret inputs from both the real and digital worlds, merging them into a seamless cyber-physical reality. It will be demonstrated how the myriad technologies and systems in Industry 4.0 networks can be connected through the use of digital twins to create a seamless “system of systems”. This will improve interoperability, resilience, and security in smart manufacturing systems. The paper also outlines the potential benefits and limitations of digital twins in addressing the challenges of Industry 4.0 networks.

16 pages, 343 KiB  
Article
Deep Learning Optimisation of Static Malware Detection with Grid Search and Covering Arrays
by Fahad T. ALGorain and Abdulrahman S. Alnaeem
Telecom 2023, 4(2), 249-264; https://doi.org/10.3390/telecom4020015 - 4 May 2023
Cited by 2 | Viewed by 2450
Abstract
This paper investigates the impact of several hyperparameters on static malware detection using deep learning, including the number of epochs, batch size, number of layers and neurons, optimisation method, dropout rate, type of activation function, and learning rate. We employed the cAgen tool and grid search optimisation from the scikit-learn Python library to identify the best hyperparameters for our Keras deep learning model. Our experiments reveal that cAgen is more efficient than grid search in finding the optimal parameters, and we find that the selection of hyperparameter values has a significant impact on the model’s accuracy. Specifically, our approach leads to significant improvements in the neural network model’s accuracy for static malware detection on the Ember dataset (from 81.2% to 95.7%) and the Kaggle dataset (from 94% to 98.6%). These results demonstrate the effectiveness of our proposed approach and have important implications for the field of static malware detection.
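The grid-search side of the comparison exhaustively scores every combination of hyperparameter values, which is what makes covering arrays (which test only a small subset covering all pairwise value combinations) attractive. A minimal sketch, assuming a made-up scoring function in place of the paper's trained Keras model:

```python
# Sketch of exhaustive grid search over a small hyperparameter grid
# (illustrative; the paper uses scikit-learn grid search and the
# cAgen covering-array tool with a real Keras model).
from itertools import product

grid = {
    "batch_size": [32, 64, 128],
    "dropout": [0.2, 0.5],
    "layers": [1, 2, 3],
}

def evaluate(params):
    # Hypothetical stand-in score; in the paper this would be the
    # validation accuracy of the network trained with `params`.
    return 0.9 - 0.01 * params["layers"] + 0.001 * params["dropout"]

keys = list(grid)
combos = [dict(zip(keys, values)) for values in product(*grid.values())]
best = max(combos, key=evaluate)

# Grid search trains and scores all 3 * 2 * 3 = 18 combinations; a
# pairwise covering array would cover every value pair in fewer runs,
# which is the efficiency gain the paper attributes to cAgen.
```

The cost of exhaustive search grows multiplicatively with each added hyperparameter, so the gap between the two strategies widens quickly on spaces like the seven-hyperparameter one studied here.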

13 pages, 1575 KiB  
Article
Ecological Dynamics and Evolution of Cooperation in Vehicular Ad Hoc Networks
by Javad Salimi Sartakhti and Fatemeh Stodt
Telecom 2023, 4(2), 236-248; https://doi.org/10.3390/telecom4020014 - 25 Apr 2023
Viewed by 1424
Abstract
In Vehicular Ad Hoc Networks (VANETs), promoting cooperative behavior is a challenging problem for mechanism designers. Cooperative actions, such as disseminating data, can seem at odds with rationality and may benefit other vehicles at a cost to oneself. Without additional mechanisms, it is expected that cooperative behavior in the population will decrease and eventually disappear. Classical game theoretical models for cooperation, such as the public goods game, predict this outcome, but they assume fixed population sizes and overlook the ecological dynamics of the interacting vehicles. In this paper, we propose an evolutionary public goods game that incorporates VANET ecological dynamics and offers new insights for promoting cooperation. Our model considers free spaces, population density, departure rates of vehicles, and randomly composed groups for each data sender. Theoretical analysis and simulation results show that higher population densities and departure rates promote cooperative behavior by minimizing the payoff differences between vehicles. This feedback between ecological dynamics and evolutionary game dynamics leads to interesting results. Our proposed model demonstrates a new extension of evolutionary dynamics to vehicles of varying densities. We show that it is possible to promote cooperation in VANETs without the need for any supporting mechanisms. Future research can investigate the potential for using this model in practical settings.
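The classical prediction the abstract refers to — cooperation decaying when only payoffs drive the dynamics — can be seen in a stripped-down replicator-style public goods game. This toy is not the authors' model: the update rule and all parameter values are assumptions, and it deliberately omits the ecological free-space feedback that, in the paper, reverses the decay.

```python
# Toy public goods game under replicator-style dynamics (illustrative;
# parameters and the update rule are assumptions, not the paper's model).

def step(x_coop, density, r=3.0, cost=1.0, departure=0.1):
    """One update of the cooperator fraction x_coop. Contributions are
    multiplied by r and shared by the group; cooperators additionally
    pay `cost`, softened here by the departure rate. Because the shared
    benefit accrues to everyone, only the residual cost gap between
    cooperators and defectors drives the dynamics."""
    benefit = r * x_coop * density          # shared by all group members
    payoff_c = benefit - cost * (1 - departure)
    payoff_d = benefit
    avg = x_coop * payoff_c + (1 - x_coop) * payoff_d
    return x_coop + x_coop * (payoff_c - avg)

x = 0.5
for _ in range(50):
    x = step(x, density=0.8)
# Without ecological feedback, cooperation collapses toward zero.
```

Raising `departure` toward 1 shrinks the residual cost gap and slows the collapse, which loosely mirrors the abstract's finding that higher departure rates promote cooperation by minimizing payoff differences.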

17 pages, 2334 KiB  
Article
Analysis and Multiobjective Optimization of a Machine Learning Algorithm for Wireless Telecommunication
by Samah Temim, Larbi Talbi and Farid Bensebaa
Telecom 2023, 4(2), 219-235; https://doi.org/10.3390/telecom4020013 - 17 Apr 2023
Cited by 1 | Viewed by 2834
Abstract
There has been a fast deployment of wireless networks in recent years, which has been accompanied by significant impacts on the environment. Among the solutions that have proven effective in reducing the energy consumption of wireless networks is the use of machine learning algorithms in cell traffic management. However, despite promising results, it should be noted that the computations required by machine learning algorithms have increased at an exponential rate. Massive computing has a surprisingly large carbon footprint, which could affect its real-world deployment. Thus, additional attention needs to be paid to the design and parameterization of the algorithms applied to reduce the energy consumption of wireless networks. In this article, we analyze the impact of hyperparameters on the energy consumption and performance of machine learning algorithms used for cell traffic prediction. For each hyperparameter (number of layers, number of neurons per layer, optimizer algorithm, batch size, and dropout) we identified a set of feasible values. Then, for each combination of hyperparameters, we trained our model and analyzed energy consumption and the resulting performance. The results from this study reveal a strong correlation between hyperparameters and energy consumption, confirming the paramount importance of selecting optimal hyperparameters. A tradeoff between the minimization of energy consumption and the maximization of machine learning performance is suggested.
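The tradeoff described above is a two-objective problem: minimize energy while maximizing performance. A standard way to present such results is the set of Pareto-optimal hyperparameter settings — those not beaten on both objectives by any other run. A minimal sketch with hypothetical measurements (the data points below are invented, not the paper's):

```python
# Sketch: extracting Pareto-optimal (energy, accuracy) settings.
# Energy is minimized, accuracy is maximized; data is hypothetical.

def pareto_front(points):
    """Keep points not dominated by any other: a point is dominated if
    some other point uses <= energy with >= accuracy, and is strictly
    better in at least one of the two objectives."""
    front = []
    for e, a in points:
        dominated = any(
            (e2 <= e and a2 >= a) and (e2 < e or a2 > a)
            for e2, a2 in points
        )
        if not dominated:
            front.append((e, a))
    return sorted(front)

# Hypothetical (energy in Wh, accuracy) results of five trainings:
runs = [(120, 0.95), (80, 0.93), (60, 0.90), (90, 0.90), (150, 0.96)]
front = pareto_front(runs)  # (90, 0.90) is dominated by (60, 0.90)
```

Any point on the front is a defensible hyperparameter choice; which one to deploy depends on how much accuracy the operator is willing to trade for energy.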
