Artificial Intelligence and Machine Learning in Cellular Networks

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: closed (20 December 2021) | Viewed by 11659

Special Issue Editors


Guest Editor
Department of Communications Engineering, Universidad de Málaga, 29071 Málaga, Spain
Interests: mobile communications; satellite communications; 5G; artificial intelligence; machine learning

Guest Editor
Department of Communications Engineering, Universidad de Málaga, 29071 Málaga, Spain
Interests: cellular networks; wireless industrial communications; machine learning; IoT; big data analytics; wireless network testbeds; wireless network simulation; robotics

Special Issue Information

Dear Colleagues,

In the last few decades, cellular networks have progressed in many directions. On the one hand, their adoption has grown both in the volume of users and in the variety of applications. In the age of 5G, cellular networks are no longer centered only on end-users; machine-type communications, such as the Internet of Things (IoT), Industrial IoT (IIoT), sensor networks, etc., are now first-class citizens in the design, planning, and deployment of novel connectivity solutions. On the other hand, the number of different technologies coexisting in the same ecosystem has also increased, adding layers of complexity to their management and thus increasing the costs of operation and maintenance (O&M). The automation of network management has therefore become a necessity for maintaining the competitiveness of operators. This topic has been explored in the past, with concepts such as cognitive networks and self-organizing networks. With new applications and 5G/6G technologies, new challenges arise, calling for novel automation solutions.

This Special Issue will gather novel research regarding the application of artificial intelligence and machine learning for the automation of O&M and the improvement of performance in all the diverse scenarios where cellular networks are currently being used, including, but not limited to, IoT, IIoT, sensor networks, and device-to-device communications.

Dr. Raquel Barco
Dr. Emil J. Khatib
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • network management
  • AI
  • ML
  • 5G
  • cellular networks
  • sensor networks
  • IoT
  • IIoT
  • device-to-device communication (D2D)
  • heterogeneous networks

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)


Research

20 pages, 1032 KiB  
Article
A Dynamic Algorithm for Interference Management in D2D-Enabled Heterogeneous Cellular Networks: Modeling and Analysis
by Md Kamruzzaman, Nurul I. Sarkar and Jairo Gutierrez
Sensors 2022, 22(3), 1063; https://doi.org/10.3390/s22031063 - 29 Jan 2022
Cited by 11 | Viewed by 3083
Abstract
To support a wider and more diverse range of applications, device-to-device (D2D) communication is a key enabler in heterogeneous cellular networks (HetCNets). It plays an important role in fulfilling the performance and quality of service (QoS) requirements of 5G networks and beyond. D2D-enabled cellular networks allow user equipment (UE) to communicate directly, with only partial or no association with base stations (eNBs). Interference management is one of the most critical and complex issues in D2D-enabled HetCNets. Despite the wide adoption of D2D communications, few studies address the problems of mode selection (MS) and resource allocation for mutual interference in three-tier cellular networks. In this paper, we first identify and analyze three key factors that influence the performance of D2D-enabled HetCNets: outage probability, signal-to-interference-plus-noise ratio (SINR), and cell density. We then propose a dynamic, distance-based algorithm to minimize interference and guarantee QoS for both cellular and D2D communication links. The results show that outage probability is improved by 35% and 49% on eNB and SCeNB links, respectively, compared with traditional neighbor-based methods. The findings reported in this paper provide insights into interference management in D2D communications that can help network researchers and engineers further develop next-generation cellular networks.
(This article belongs to the Special Issue Artificial Intelligence and Machine Learning in Cellular Networks)
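
As a purely illustrative companion to the abstract above (not the authors' algorithm), the sketch below shows what a distance-based mode-selection rule of the kind described might look like; the thresholds, function names, and two-tier victim list are assumptions introduced for the example.

```python
import math

# Hypothetical thresholds, not taken from the paper.
D2D_MAX_DISTANCE_M = 50.0            # max Tx-Rx separation for a direct D2D link
MIN_INTERFERENCE_DISTANCE_M = 100.0  # min distance from the D2D Tx to any victim receiver

def distance(a, b):
    """Euclidean distance between two (x, y) positions in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_mode(tx, rx, victim_receivers):
    """Choose 'd2d' when the pair is close enough for a direct link and far
    enough from every victim receiver (eNB / SCeNB) to limit cross-tier
    interference; otherwise fall back to 'cellular' mode."""
    if distance(tx, rx) > D2D_MAX_DISTANCE_M:
        return "cellular"
    if any(distance(tx, v) < MIN_INTERFERENCE_DISTANCE_M for v in victim_receivers):
        return "cellular"
    return "d2d"

# Example: one candidate D2D pair, with one macro eNB and one small-cell eNB as victims.
print(select_mode((0, 0), (30, 10), [(400, 0), (150, 200)]))  # -> 'd2d'
```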

18 pages, 844 KiB  
Article
Dynamic Packet Duplication for Industrial URLLC
by David Segura, Emil J. Khatib and Raquel Barco
Sensors 2022, 22(2), 587; https://doi.org/10.3390/s22020587 - 13 Jan 2022
Cited by 7 | Viewed by 2647
Abstract
The fifth-generation (5G) network is presented as one of the main options for Industry 4.0 connectivity. For critical messages, 5G offers the Ultra-Reliable Low-Latency Communications (URLLC) service category, with millisecond end-to-end delay and a reduced probability of failure. There are several approaches to achieving these requirements; however, they come at a cost in terms of redundancy, particularly solutions based on multi-connectivity, such as Packet Duplication (PD). Specifically, this paper proposes a Machine Learning (ML) method to predict whether PD is required for a specific data transmission to successfully deliver a URLLC message, with the aim of reducing resource usage with respect to purely static PD. The concept was evaluated in a 5G simulator, comparing single connectivity, static PD, and PD with the proposed prediction model. The evaluation results show that the prediction model reduced the number of packets sent with PD by 81% while maintaining the same level of latency as the static PD technique, which derives from a more efficient use of network resources.
(This article belongs to the Special Issue Artificial Intelligence and Machine Learning in Cellular Networks)
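
As a rough, non-authoritative sketch of the idea in the abstract above, the snippet below trains a simple classifier to decide, per transmission, whether packet duplication is worth triggering; the features (SINR, RSRP), the toy training data, the risk threshold, and the choice of logistic regression are all assumptions for illustration, not the paper's model or simulator setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy, hypothetical training data: per-transmission radio measurements
# (primary-link SINR in dB, RSRP in dBm) and a label that is 1 when the
# primary link alone failed to deliver the packet within the URLLC budget.
X_train = np.array([[18.0, -85.0], [3.0, -110.0], [12.0, -95.0], [-1.0, -118.0]])
y_train = np.array([0, 1, 0, 1])

clf = LogisticRegression().fit(X_train, y_train)

def should_duplicate(sinr_db, rsrp_dbm, risk_threshold=0.2):
    """Duplicate the packet over the secondary leg only when the predicted
    probability of a primary-link failure exceeds the risk threshold."""
    p_fail = clf.predict_proba([[sinr_db, rsrp_dbm]])[0, 1]
    return p_fail > risk_threshold

print(should_duplicate(15.0, -90.0))   # good primary link -> likely False
print(should_duplicate(1.0, -115.0))   # poor primary link -> likely True
```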

23 pages, 53530 KiB  
Article
A Novel Interference Avoidance Based on a Distributed Deep Learning Model for 5G-Enabled IoT
by Radwa Ahmed Osman, Sherine Nagy Saleh and Yasmine N. M. Saleh
Sensors 2021, 21(19), 6555; https://doi.org/10.3390/s21196555 - 30 Sep 2021
Cited by 16 | Viewed by 2332
Abstract
The co-existence of fifth-generation (5G) and Internet-of-Things (IoT) networks has become inevitable in many applications, since 5G networks create steadier connections and operate more reliably, which is extremely important for IoT communication. During transmission, IoT devices (IoTDs) communicate with an IoT gateway (IoTG), whereas in 5G networks, cellular user equipment (CUE) may communicate with any destination (D), whether a base station (BS) or another CUE, which is known as device-to-device (D2D) communication. One of the challenges facing 5G and IoT is interference: interference may arise at BSs, CUE receivers, and IoTGs because they share the same spectrum. This paper proposes a distributed deep learning model for interference avoidance in IoT and device-to-any-destination communication. The model learns from data generated by a Lagrange optimization technique to predict the optimum IoTD-D, CUE-IoTG, BS-IoTD, and IoTG-CUE distances for uplink and downlink data communication, thus achieving higher overall system throughput and energy efficiency. The proposed model was compared with state-of-the-art regression benchmarks and provided a large improvement in terms of mean absolute error and root mean squared error. Both the analytical and deep learning models reached the optimal throughput and energy efficiency while suppressing interference to any destination and to the IoTG.
(This article belongs to the Special Issue Artificial Intelligence and Machine Learning in Cellular Networks)
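
To make the data-driven part of the abstract above more concrete, here is a minimal, hypothetical sketch: a small feed-forward regressor trained on synthetic data that stands in for the Lagrange-optimization output, predicting two of the optimum distances. The input features, the synthetic targets, and the network size are assumptions; the paper's actual distributed deep learning model and dataset are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the dataset the paper derives from Lagrange
# optimization: inputs are made-up operating parameters (e.g. load fraction,
# target SINR in dB); outputs are made-up "optimal" separation distances (m),
# such as IoTD-D and CUE-IoTG.
X = rng.uniform(low=[0.0, 5.0], high=[1.0, 30.0], size=(500, 2))
y = np.column_stack([50.0 + 100.0 * X[:, 0], 200.0 / (1.0 + X[:, 1])])

# Small feed-forward regressor standing in for the deep model.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X, y)

# Predict the distances to use at a new operating point.
print(model.predict([[0.4, 15.0]]))  # -> [IoTD-D distance, CUE-IoTG distance]
```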

18 pages, 1447 KiB  
Article
RRH Clustering Using Affinity Propagation Algorithm with Adaptive Thresholding and Greedy Merging in Cloud Radio Access Network
by Seju Park, Han-Shin Jo, Cheol Mun and Jong-Gwan Yook
Sensors 2021, 21(2), 480; https://doi.org/10.3390/s21020480 - 12 Jan 2021
Cited by 3 | Viewed by 2572
Abstract
Affinity propagation (AP) clustering, with its low complexity and high performance, is suitable for remote radio head (RRH) clustering for real-time joint transmission in the cloud radio access network. Existing AP algorithms for joint transmission suffer from high computational complexity owing to the re-sweeping of preferences (the diagonal components of the similarity matrix) to determine the optimal number of clusters whenever system parameters, such as the network topology, change. To overcome this limitation, we propose a new approach in which the preferences are fixed and the threshold changes in response to variations in the system parameters. In AP clustering, each diagonal value of the final converged matrix is mapped to the position (x, y coordinates) of the corresponding RRH to form a two-dimensional image. An environment-adaptive threshold value is then determined by adopting Otsu's method, which uses the gray-scale histogram of the image to make a statistical decision. Additionally, a simple greedy merging algorithm is proposed to resolve the problem of inter-cluster interference caused by adjacent RRHs being selected as exemplars (cluster centers). For a realistic performance assessment, both grid and uniform network topologies are considered, including exterior interference and various RRH transmit power levels. It is demonstrated that, with similar normalized execution times, the proposed algorithm provides better spectral and energy efficiencies than the existing algorithms.
(This article belongs to the Special Issue Artificial Intelligence and Machine Learning in Cellular Networks)
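
The sketch below illustrates only the two generic building blocks named in the abstract above: affinity propagation with a fixed preference, and an Otsu threshold driving a greedy merge. It is not the paper's image-based pipeline; the synthetic RRH layout, the preference value, and the choice to threshold inter-exemplar distances are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from skimage.filters import threshold_otsu

rng = np.random.default_rng(1)
rrh_xy = rng.uniform(0, 1000, size=(30, 2))  # synthetic RRH positions (m)

# Affinity propagation with a fixed preference (no re-sweeping); the value
# below is an arbitrary placeholder, not a tuned system parameter.
ap = AffinityPropagation(preference=-5e4, random_state=0).fit(rrh_xy)
exemplars = rrh_xy[ap.cluster_centers_indices_]

# Otsu's method picks a data-driven threshold from a one-dimensional
# histogram; here it is applied to inter-exemplar distances purely to
# illustrate an environment-adaptive threshold, unlike the paper's
# image-histogram formulation.
d = np.linalg.norm(exemplars[:, None, :] - exemplars[None, :, :], axis=-1)
pairwise = d[np.triu_indices_from(d, k=1)]
merge_threshold = threshold_otsu(pairwise)

# Greedy merging via union-find: clusters whose exemplars are closer than
# the threshold are fused to reduce inter-cluster interference.
parent = list(range(len(exemplars)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(len(exemplars)):
    for j in range(i + 1, len(exemplars)):
        if d[i, j] < merge_threshold:
            parent[find(j)] = find(i)

labels = np.array([find(l) for l in ap.labels_])
print(len(np.unique(labels)), "clusters after merging")
```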
