Internet of Things and Sensors Networks in 5G Wireless Communications

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: closed (30 September 2019) | Viewed by 65184

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Guest Editor: Dr. Lei Zhang
James Watt School of Engineering, University of Glasgow, Glasgow G12 8QF, UK
Interests: distributed consensus; distributed systems; blockchain/distributed ledger technology (DLT); connected autonomous systems; Internet of Things (IoT)

Guest Editor: Dr. Guodong Zhao
School of Engineering, University of Glasgow, Glasgow G12 8QQ, UK
Interests: wireless communications; real-time control systems

Guest Editor: Prof. Dr. Muhammad Ali Imran
James Watt School of Engineering, University of Glasgow, Glasgow, UK
Interests: 5G and Beyond networks

Special Issue Information

Dear Colleagues,

The Internet of Things (IoT) has attracted the attention of society, industry, and academia as a promising technology that can enhance day-to-day activities, enable the creation of new business models, products, and services, and serve as a broad source of research topics and ideas. It is envisioned that the future digital society will be composed of numerous wirelessly connected sensors and devices. Reflecting this demand, massive IoT (mIoT), or massive machine-type communication (mMTC), has been identified as one of the three main communication scenarios for 5G. In addition to connectivity, computing, storage, and data management are also long-standing issues for low-cost devices and sensors. This Special Issue is conceived as a collection of outstanding technical research and industrial papers covering new results across the 5G-and-beyond framework. Its main aim is to provide a platform for discussing the major research challenges and achievements on this topic. Theoretical investigations as well as experiments/demos are welcome.

Potential topics include but are not limited to the following:

  • Communication algorithms, protocols, standards, and architectures for IoT and sensor networks;
  • Massive IoT/massive machine-type communication;
  • Low-power wide-area (LPWA) technologies such as NB-IoT, LoRa, etc.;
  • Sensor networks and their applications to, e.g., healthcare/wearable systems and smart X;
  • Security and privacy for IoT and sensor networks;
  • IoT and sensor network (SN) hardware and new devices;
  • Software for IoT and sensor networks;
  • Wireless ad hoc sensor networks;
  • Routing and data transfer in IoT and SNs;
  • Computing, storage, and data management for IoT and sensor networks;
  • Ultra-reliable low-latency communication (URLLC);
  • Relay, D2D, Cloud-RAN, and cooperative networks;
  • Resource management and network slicing;
  • Autonomous systems and UAVs;
  • Other related issues.

Dr. Lei Zhang
Dr. Guodong Zhao
Prof. Dr. Muhammad Ali Imran
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • IoT
  • Sensor networks
  • 5G
  • mMTC
  • URLLC
  • Security

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (10 papers)


Research


23 pages, 5610 KiB  
Article
A Distributed Testbed for 5G Scenarios: An Experimental Study
by Mohammad Kazem Chamran, Kok-Lim Alvin Yau, Rafidah M. D. Noor and Richard Wong
Sensors 2020, 20(1), 18; https://doi.org/10.3390/s20010018 - 19 Dec 2019
Cited by 10 | Viewed by 5064
Abstract
This paper demonstrates the use of the Universal Software Radio Peripheral (USRP), together with the Raspberry Pi 3 B+ (RP3) as the brain (or decision-making engine), to develop a distributed wireless network in which nodes can communicate with other nodes independently and make decisions autonomously. In other words, each USRP node (i.e., sensor) is embedded with a separate processing unit (i.e., RP3), which has not been investigated in the literature, so that each node can make independent decisions in a distributed manner. The proposed testbed is compared with the traditional distributed testbed, which has been widely used in the literature. In the traditional distributed testbed, there is a single processing unit (i.e., a personal computer) that makes decisions in a centralized manner, and each node (i.e., USRP) is connected to the processing unit via a switch. The single processing unit exchanges control messages with nodes via the switch, while the nodes exchange data packets among themselves over a wireless medium in a distributed manner. The main disadvantage of the traditional testbed is that, despite the network being distributed in nature, decisions are made in a centralized manner; hence, the response delay of the control message exchange is always neglected. The use of such a testbed is mainly due to the limited hardware and the monetary cost of acquiring a separate processing unit for each node. Experiments on our testbed show an increase in end-to-end delay and a decrease in packet delivery ratio due to software and hardware delays. The observed multihop transmission is performed using device-to-device (D2D) communication, which has been enabled in 5G. Therefore, nodes can communicate with other nodes either via (a) direct communication with the base station at the macrocell, which helps to improve network performance, or (b) D2D communication, which improves spectrum efficiency by offloading traffic from the macrocell to small cells. Our testbed is the first of its kind at this scale, and it uses the RP3 as the distributed decision-making engine incorporated into the USRP/GNU Radio platform. This work provides insight into the development of 5G networks. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)

18 pages, 1035 KiB  
Article
User Association and Power Control for Energy Efficiency Maximization in M2M-Enabled Uplink Heterogeneous Networks with NOMA
by Shuang Zhang and Guixia Kang
Sensors 2019, 19(23), 5307; https://doi.org/10.3390/s19235307 - 2 Dec 2019
Cited by 5 | Viewed by 3032
Abstract
To support a vast number of devices with less energy consumption, we propose a new user association and power control scheme for machine-to-machine enabled heterogeneous networks with non-orthogonal multiple access (NOMA), where a mobile user (MU) acting as a machine-type communication gateway can decode and forward both the information of machine-type communication devices and its own data to the base station (BS) directly. MU association and power control are jointly considered in the formulated optimization problem for energy efficiency (EE) maximization under the constraints of minimum data rate requirements of MUs. A many-to-one MU association matching algorithm is first proposed based on the theory of matching games. By taking swap matching operations among MUs, BSs, and sub-channels, the original problem can be solved by dealing with the EE maximization for each sub-channel. Then, two power control algorithms are proposed, employing the tools of sequential optimization, fractional programming, and exhaustive search. Simulation results are provided to demonstrate the optimality properties of our algorithms under different parameter settings. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
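
The fractional-programming step mentioned in the abstract can be illustrated in isolation. The sketch below is only a minimal, single-link Dinkelbach-style iteration for energy-efficiency maximization; the link parameters (B, g, N0, Pc, p_max) and the grid-search helper best_power are illustrative assumptions, not the paper's joint association and power-control algorithm.

```python
# Minimal sketch, assuming a single link: Dinkelbach-style fractional
# programming for energy-efficiency maximization, EE(p) = R(p) / (p + Pc)
# with R(p) = B*log2(1 + g*p/N0). The paper's joint problem is far richer;
# this only illustrates the fractional-programming tool.
import math

B, g, N0, Pc, p_max = 1e6, 1e-7, 1e-13, 0.1, 1.0   # assumed link parameters

def rate(p):
    return B * math.log2(1 + g * p / N0)

def best_power(lmbda, grid=2000):
    # Inner problem: maximize R(p) - lambda*(p + Pc) over a simple power grid
    return max((i * p_max / grid for i in range(1, grid + 1)),
               key=lambda p: rate(p) - lmbda * (p + Pc))

lmbda = 0.0
for _ in range(30):                                 # Dinkelbach iterations
    p = best_power(lmbda)
    new_lmbda = rate(p) / (p + Pc)
    if abs(new_lmbda - lmbda) < 1e-3 * max(new_lmbda, 1.0):
        break
    lmbda = new_lmbda
print(f"EE-optimal power ~ {p:.3f} W, EE ~ {lmbda:.3e} bit/J")
```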

19 pages, 1321 KiB  
Article
K-Means Spreading Factor Allocation for Large-Scale LoRa Networks
by Muhammad Asad Ullah, Junnaid Iqbal, Arliones Hoeller, Richard Demo Souza and Hirley Alves
Sensors 2019, 19(21), 4723; https://doi.org/10.3390/s19214723 - 30 Oct 2019
Cited by 28 | Viewed by 5694
Abstract
Low-power wide-area networks (LPWANs) are emerging rapidly as a fundamental Internet of Things (IoT) technology because of their low power consumption, long-range connectivity, and ability to support massive numbers of users. With its high growth rate, Long Range (LoRa) is becoming the most widely adopted LPWAN technology. This research work contributes to the problem of LoRa spreading factor (SF) allocation by proposing an algorithm based on K-means clustering. We assess the network performance considering the outage probabilities of a large-scale unconfirmed-mode class-A LoRa Wide Area Network (LoRaWAN) model, without retransmissions. The proposed algorithm allows for different user distributions over SFs, thus rendering SF allocation flexible. Such distributions translate into network parameters that are application dependent. Simulation results consider different network scenarios and realistic parameters to illustrate how the distance from the gateway and the number of nodes in each SF affect transmission reliability. Theoretical and simulation results show that our SF allocation approach improves the network's average coverage probability by up to 5 percentage points when compared to the baseline model. Moreover, our results show a fairer network operation in which the performance difference between the best- and worst-case nodes is significantly reduced. This happens because our method seeks to equalize the usage of each SF. We show that the worst-case performance in one deployment scenario can be enhanced by 1.53 times. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
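
As a rough illustration of the idea (not the authors' exact scheme), the sketch below clusters node-to-gateway distances with K-means and maps nearer clusters to smaller spreading factors; the node count, deployment radius, and number of clusters are assumed values.

```python
# Minimal sketch (not the paper's algorithm): assign LoRa spreading factors
# by K-means clustering of node-to-gateway distances. Cluster count, field
# radius, and node count are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_nodes, radius_m = 500, 6000            # assumed deployment
# Uniform node placement on a disc around the gateway
r = radius_m * np.sqrt(rng.uniform(size=n_nodes))

kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(r.reshape(-1, 1))
# Order clusters by mean distance; nearer clusters get smaller SFs
order = np.argsort([r[kmeans.labels_ == c].mean() for c in range(6)])
sf_of_cluster = {c: 7 + rank for rank, c in enumerate(order)}
sf = np.array([sf_of_cluster[c] for c in kmeans.labels_])

for s in range(7, 13):
    print(f"SF{s}: {np.sum(sf == s)} nodes, "
          f"mean distance {r[sf == s].mean():.0f} m")
```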

13 pages, 472 KiB  
Article
Interference-Aware Subcarrier Allocation for Massive Machine-Type Communication in 5G-Enabled Internet of Things
by Wenjun Hou, Song Li, Yanjing Sun, Jiasi Zhou, Xiao Yun and Nannan Lu
Sensors 2019, 19(20), 4530; https://doi.org/10.3390/s19204530 - 18 Oct 2019
Cited by 4 | Viewed by 2898
Abstract
Massive machine-type communication (mMTC) is one of the three typical scenarios of the 5th-generation (5G) network. In this paper, we propose a 5G-enabled Internet of Things (IoT) in which some enhanced mobile broadband (eMBB) devices transmit video streams to a centralized controller and some mMTC devices exchange short packet data with adjacent devices via device-to-device (D2D) communication to promote inter-device cooperation. Since massive MTC devices have data transmission requirements in 5G-enabled IoT with limited spectrum resources, the subcarrier allocation problem is investigated to maximize the connectivity of mMTC devices subject to the quality-of-service (QoS) requirements of eMBB and mMTC devices. To solve the formulated mixed-integer non-linear programming (MINLP) problem, which is NP-hard, an interference-aware subcarrier allocation algorithm for mMTC communication (IASA) is developed to maximize the number of active mMTC devices. Finally, the performance of the proposed algorithm is evaluated by simulation. Numerical results demonstrate that the proposed algorithm outperforms three traditional benchmark methods and significantly improves the utilization of the uplink spectrum. This indicates that the proposed IASA algorithm provides a better solution for IoT applications. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
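
The flavor of interference-aware allocation can be caricatured with a simple greedy heuristic. The sketch below is not the IASA algorithm itself: it admits hypothetical mMTC D2D pairs onto subcarriers while keeping the added interference under a per-subcarrier budget that stands in for the eMBB QoS constraint, with all interference values synthetic.

```python
# Hedged illustration only (not the IASA algorithm): greedily admit mMTC D2D
# pairs onto subcarriers as long as the cross interference each pair adds
# stays within an assumed per-subcarrier budget.
import random

random.seed(2)
n_subcarriers, n_pairs = 8, 40
budget = 1.0                                    # assumed interference budget per subcarrier
# interference[p][s]: interference the p-th D2D pair would add on subcarrier s
interference = [[random.uniform(0.05, 0.6) for _ in range(n_subcarriers)]
                for _ in range(n_pairs)]

load = [0.0] * n_subcarriers
admitted = 0
# Consider pairs in order of their least-damaging subcarrier (cheapest first)
for p in sorted(range(n_pairs), key=lambda q: min(interference[q])):
    s = min(range(n_subcarriers), key=lambda t: interference[p][t] + load[t])
    if load[s] + interference[p][s] <= budget:
        load[s] += interference[p][s]
        admitted += 1
print(f"admitted {admitted} of {n_pairs} mMTC D2D pairs")
```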

18 pages, 428 KiB  
Article
A Dynamic Access Probability Adjustment Strategy for Coded Random Access Schemes
by Jingyun Sun, Rongke Liu and Enrico Paolini
Sensors 2019, 19(19), 4206; https://doi.org/10.3390/s19194206 - 27 Sep 2019
Cited by 3 | Viewed by 2269
Abstract
In this paper, a dynamic access probability adjustment strategy for coded random access schemes based on successive interference cancellation (SIC) is proposed. The developed protocol consists of judiciously tuning the access probability, thereby controlling the number of transmitting users, in order to resolve medium access control (MAC) layer congestion states in high-load conditions. The protocol comprises two steps: estimation of the number of transmitting users during the current MAC frame, and adjustment of the access probability for the subsequent MAC frame based on that estimate. The estimation algorithm exploits a posteriori information, i.e., information available at the end of the SIC process; in particular, it relies on both the frame configuration (residual number of collision slots) and the recovered-users configuration (vector of recovered users) to effectively reduce the mean-square error (MSE). During the access probability adjustment phase, a target load threshold is employed, tailored to the packet loss rate in the finite frame length case. Simulation results reveal that the developed estimator achieves remarkable performance owing to the information gathered from the SIC procedure, and they illustrate how the proposed dynamic access probability strategy can resolve congestion states efficiently. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
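
A minimal sketch of the adjustment step, under stated assumptions: given an externally supplied estimate of the backlogged users (the paper derives this from the SIC outcome), the next frame's access probability is set so the expected offered load matches a target. The function name, the target load g_target, and the frame size are hypothetical.

```python
# Hedged sketch (not the authors' estimator or threshold design): set the
# access probability for the next MAC frame so that the expected offered load
# per slot matches a target value g_target.
def next_access_probability(n_estimated, n_slots, g_target=0.8, p_max=1.0):
    if n_estimated <= 0:
        return p_max
    # Expected load per slot is n_estimated * p / n_slots; solve for p
    p = g_target * n_slots / n_estimated
    return min(p_max, max(0.0, p))

# Example: 500 estimated contenders, 100 slots per frame
print(next_access_probability(n_estimated=500, n_slots=100))  # -> 0.16
```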

11 pages, 2481 KiB  
Article
Noninvasive Suspicious Liquid Detection Using Wireless Signals
by Jiewen Deng, Wanrong Sun, Lei Guan, Nan Zhao, Muhammad Bilal Khan, Aifeng Ren, Jianxun Zhao, Xiaodong Yang and Qammer H. Abbasi
Sensors 2019, 19(19), 4086; https://doi.org/10.3390/s19194086 - 21 Sep 2019
Cited by 7 | Viewed by 3892
Abstract
Conventional liquid detection instruments are very expensive and not conducive to large-scale deployment. In this work, we propose a method for detecting and identifying suspicious liquids based on the dielectric constant by utilizing radio signals in a 5G frequency band. There are three major experiments: first, we use wireless channel information (WCI) to distinguish between suspicious and nonsuspicious liquids; then we identify the type of suspicious liquid; and finally, we distinguish different concentrations of alcohol. The K-Nearest Neighbor (KNN) algorithm is used to classify the amplitude information extracted from the WCI matrix to detect and identify liquids; it is suitable for multimodal problems and easy to implement without training. Analysis of the experimental results showed that our method could detect more than 98% of the suspicious liquids, identify more than 97% of the suspicious liquid types, and distinguish up to 94% of the different concentrations of alcohol. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
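
To make the classification step concrete, the sketch below trains a K-Nearest Neighbors classifier on synthetic amplitude features standing in for the WCI matrix; the class labels, sample counts, and subcarrier dimension are assumptions, not the paper's dataset.

```python
# Illustrative sketch only: classify liquids from wireless-channel amplitude
# features with K-Nearest Neighbors, as the paper describes at a high level.
# The feature matrix here is synthetic; real WCI amplitudes would replace it.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
n_samples, n_subcarriers = 600, 30      # assumed dimensions
labels = rng.integers(0, 3, n_samples)  # hypothetical classes, e.g., three liquid types
# Each class shifts the mean amplitude profile slightly
X = rng.normal(loc=labels[:, None] * 0.5, scale=1.0, size=(n_samples, n_subcarriers))

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=1)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print(f"hold-out accuracy: {knn.score(X_te, y_te):.2f}")
```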

20 pages, 4504 KiB  
Article
Aggregated Throughput Prediction for Collated Massive Machine-Type Communications in 5G Wireless Networks
by Ahmed Adel Aly, Hussein M. ELAttar, Hesham ElBadawy and Wael Abbas
Sensors 2019, 19(17), 3651; https://doi.org/10.3390/s19173651 - 22 Aug 2019
Cited by 12 | Viewed by 3371
Abstract
The demand for extensive data rates in dense-traffic wireless networks has expanded and requires proper control schemes. The fifth generation of mobile communications (5G) will accommodate these massive communications, such as massive Machine Type Communications (mMTC), which is considered to be one of its top services. To achieve optimal throughput, which is considered a mandatory quality-of-service (QoS) metric, the carrier sense multiple access (CSMA) transmission attempt rate needs to be optimized. As gradient descent algorithms take a long time to converge, this paper presents an approximation technique that partitions a dense global network into local neighborhoods that are less complex than the global one. Newton's method of optimization was used to achieve fast convergence rates and thus obtain optimal throughput. The convergence rate depends only on the size of the local networks rather than the global dense one. Additionally, polynomial interpolation was used to estimate the average throughput of the network as a function of the number of nodes and target service rates. Three-dimensional surfaces of the average throughput are presented to give a thorough description of the network's performance. The fast convergence time of the proposed model and its lower complexity make it more practical than the previous gradient descent algorithm. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
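
The role of Newton's method can be illustrated on a toy neighborhood. The sketch below maximizes the classic attempt-rate throughput proxy S(p) = n*p*(1-p)^(n-1), whose optimum is p = 1/n, rather than the paper's actual objective; the neighborhood size n and starting point are assumptions.

```python
# Hedged sketch, not the paper's full model: use Newton's method to tune the
# transmission-attempt probability p of a small CSMA-like neighborhood of n
# nodes, maximizing S(p) = n*p*(1-p)**(n-1) (optimum at p = 1/n).
def newton_attempt_rate(n, p0=None, iters=20, tol=1e-10):
    p = p0 if p0 is not None else 0.5 / n          # start below the optimum
    for _ in range(iters):
        d1 = n * (1 - p) ** (n - 2) * (1 - n * p)                              # S'(p)
        d2 = n * (1 - p) ** (n - 3) * (-(n - 2) * (1 - n * p) - n * (1 - p))   # S''(p)
        step = d1 / d2
        p -= step                                   # Newton update for a maximum
        if abs(step) < tol:
            break
    return p

n = 20                                             # assumed neighborhood size
p_opt = newton_attempt_rate(n)
print(f"optimal attempt probability ~ {p_opt:.4f} (analytic 1/n = {1/n:.4f})")
```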

18 pages, 2430 KiB  
Article
On the Capacity of 5G NR Grant-Free Scheduling with Shared Radio Resources to Support Ultra-Reliable and Low-Latency Communications
by M. Carmen Lucas-Estañ, Javier Gozalvez and Miguel Sepulcre
Sensors 2019, 19(16), 3575; https://doi.org/10.3390/s19163575 - 16 Aug 2019
Cited by 24 | Viewed by 4537
Abstract
5G and beyond networks are being designed to support the future digital society, where numerous sensors, machinery, vehicles and humans will be connected in the so-called Internet of Things (IoT). The support of time-critical verticals such as Industry 4.0 will be especially challenging, due to the demanding communication requirements of manufacturing applications such as motion control, control-to-control applications and factory automation, which will require the exchange of critical sensing and control information among the factory nodes. To this aim, important changes have been introduced in 5G for Ultra-Reliable and Low-Latency Communications (URLLC). One of these changes is the introduction of grant-free scheduling for uplink transmissions. The objective is to reduce latency by eliminating the need for User Equipments (UEs—sensors, devices or machinery) to request resources and wait until the network grants them. Grant-free scheduling can reserve radio resources for dedicated UEs or for groups of UEs. The latter option is particularly relevant to support applications with aperiodic or sporadic traffic and deterministic low latency requirements. In this case, when a UE has information to transmit, it must contend for the usage of radio resources. This can lead to potential packet collisions between UEs. 5G introduces the possibility of transmitting K replicas of the same packet to combat such collisions. Previous studies have shown that grant-free scheduling with K replicas and shared resources increases packet delivery. However, relying upon the transmission of K replicas to achieve a target reliability level can result in additional delays, and it is yet unknown whether grant-free scheduling with K replicas and shared resources can guarantee very high reliability levels with very low latency. This is the objective of this study, which identifies the reliability and latency levels that can be achieved by 5G grant-free scheduling with K replicas and shared resources in the presence of aperiodic traffic, and as a function of the number of UEs, reserved radio resources and replicas K. The study demonstrates that current Fifth Generation New Radio (5G NR) grant-free scheduling has limitations in sustaining stringent reliability and latency levels for aperiodic traffic. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
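
A rough Monte Carlo sketch of the collision mechanism the abstract describes: N UEs contend on R shared resources and send K replicas each, and a packet counts as delivered if at least one replica lands alone on a resource. This ignores SIC/combining and all physical-layer detail, and every parameter value is an assumption.

```python
# Rough Monte Carlo sketch, not the paper's model: with N UEs that each become
# active with probability a per frame, R shared resources, and K replicas per
# packet sent on distinct randomly chosen resources, estimate the probability
# that at least one replica is collision-free.
import random

def delivery_ratio(N=50, R=12, K=2, a=0.1, frames=20000, seed=0):
    rng = random.Random(seed)
    delivered = attempts = 0
    for _ in range(frames):
        active = [u for u in range(N) if rng.random() < a]
        choices = {u: rng.sample(range(R), K) for u in active}
        occupancy = [0] * R
        for res_list in choices.values():
            for res in res_list:
                occupancy[res] += 1
        for u in active:
            attempts += 1
            if any(occupancy[res] == 1 for res in choices[u]):
                delivered += 1
    return delivered / attempts if attempts else 1.0

for K in (1, 2, 4):
    print(f"K={K}: estimated delivery ratio {delivery_ratio(K=K):.3f}")
```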

34 pages, 857 KiB  
Article
Narrowband Internet of Things (NB-IoT): From Physical (PHY) and Media Access Control (MAC) Layers Perspectives
by Collins Burton Mwakwata, Hassan Malik, Muhammad Mahtab Alam, Yannick Le Moullec, Sven Parand and Shahid Mumtaz
Sensors 2019, 19(11), 2613; https://doi.org/10.3390/s19112613 - 8 Jun 2019
Cited by 118 | Viewed by 21206
Abstract
Narrowband Internet of Things (NB-IoT) is a recent cellular radio access technology based on Long-Term Evolution (LTE), introduced by the Third-Generation Partnership Project (3GPP) for Low-Power Wide-Area Networks (LPWAN). The main aim of NB-IoT is to support massive machine-type communication (mMTC) and enable low-power, low-cost, and low-data-rate communication. NB-IoT is based on the LTE design with some changes to meet the mMTC requirements. For example, in the physical (PHY) layer only single-antenna and low-order modulations are supported, and in the Medium Access Control (MAC) layer only one physical resource block is allocated for resource scheduling. The aim of this survey is to provide a comprehensive overview of the design changes introduced in the NB-IoT standardization, along with detailed research developments from the perspectives of the PHY and MAC layers. The survey also includes an overview of the Evolved Packet Core (EPC) changes that support the Service Capability Exposure Function (SCEF) to manage both IP and non-IP data packets through the Control Plane (CP) and User Plane (UP), as well as the possible deployment scenarios of NB-IoT in future Heterogeneous Wireless Networks (HetNets). Finally, existing and emerging research challenges in this direction are presented to motivate future research activities. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)

Review


38 pages, 2319 KiB  
Review
Fog Computing Enabling Industrial Internet of Things: State-of-the-Art and Research Challenges
by Rabeea Basir, Saad Qaisar, Mudassar Ali, Monther Aldwairi, Muhammad Ikram Ashraf, Aamir Mahmood and Mikael Gidlund
Sensors 2019, 19(21), 4807; https://doi.org/10.3390/s19214807 - 5 Nov 2019
Cited by 97 | Viewed by 11223
Abstract
Industry is going through a transformation phase, enabling automation and data exchange in manufacturing technologies and processes; this transformation is called Industry 4.0. Industrial Internet of Things (IIoT) applications require real-time processing, nearby storage, ultra-low latency, reliability, and high data rates, all of which can be satisfied by a fog computing architecture. With smart devices expected to grow exponentially, the need for an optimized fog computing architecture and protocols is crucial. Therein, efficient, intelligent, and decentralized solutions are required to ensure real-time connectivity, reliability, and green communication. In this paper, we provide a comprehensive review of methods and techniques in fog computing. Our focus is on fog infrastructure and protocols in the context of IIoT applications. The article covers two main research areas: in the first half, we discuss the history of the industrial revolution and the application areas of IIoT, followed by the key enabling technologies that act as building blocks for industrial transformation. In the second half, we focus on fog computing as an enabler for IIoT application domains, providing solutions to its critical challenges. Finally, open research challenges are discussed to highlight fog computing aspects in different fields and technologies. Full article
(This article belongs to the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications)
