Big Data Analytics in Internet of Things Environment

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: closed (30 March 2022) | Viewed by 59394

Special Issue Editors

Department of Computer Science, University of Swabi, Swabi 23430, Pakistan
Interests: software engineering; big data science; machine learning and deep learning; modelling

Guest Editor
Department of Computer Science and Engineering of Systems, University of Zaragoza, 50001 Teruel, Spain
Interests: mobile applications for health and well-being; Internet of things; wearable sensors; big data; datamining; agent-based simulation and multi-agent systems

Special Issue Information

Dear Colleagues,

The growth of information technology and the increasing use of smart devices such as sensors, actuators, and wearables have produced massive amounts of data. These data come in many forms, including healthcare data, medical records, and data warehouses. This growth raises research issues and challenges, as extracting useful information becomes a demanding task. Insights successfully drawn from such data can ultimately serve many purposes, including data analysis, data management, and patient care, providing effective solutions in the area. Big data are data whose volume, size, and shape require shaping before meaningful information can be extracted for a specific purpose. Data play an increasingly significant role in the smooth day-to-day functioning of organizations and industry.

Given the importance of data and the need to extract significant information, data must be shaped in a structured way so that information and useful insights can be mined for research and practice. Diverse approaches, methods, and mechanisms are in use to tackle the issues of big data and its analytics. Scientific programming is a key tool that plays a major role in solving current and future issues in the management of large-scale data, for example by supporting the processing of huge data volumes, the modelling of complex systems, and the derivation of insights from data and simulations. Programming tools such as Tableau, Apache Hadoop, and Informatica PowerCenter analyze and manage data efficiently and allow the visualization of the expressive insights extracted from them.

Scientific programming provides research forums for presenting research results in, and practical experience with, data analysis, programming tools, programming languages, and proven models of computation, aimed specifically at supporting scientific research in the domain of data science.

This Special Issue invites original research articles and review articles that demonstrate the incorporation of scientific-programming-based technologies in data science and their applications in the domain. Studies that address technological and computational barriers to data science management are particularly welcome.

Prof. Iván García-Magariño
Dr. Shah Nazir
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Scientific programming for data science
  • Tools and their applications in big data
  • Scientific programming for data science visualization
  • Deep learning and machine learning algorithms for data analytics
  • Evaluation of data science
  • Data warehouse and knowledge representation of big data technologies
  • Data mining and optimizations in IoT
  • Probabilistic computing for managing data science
  • Data science and decision support system
  • Data and its impact on science mining
  • Data science and software engineering

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (17 papers)


Research


22 pages, 3426 KiB  
Article
A Computationally Efficient Online/Offline Signature Scheme for Underwater Wireless Sensor Networks
by Syed Sajid Ullah, Saddam Hussain, Mueen Uddin, Roobaea Alroobaea, Jawaid Iqbal, Abdullah M. Baqasah, Maha Abdelhaq and Raed Alsaqour
Sensors 2022, 22(14), 5150; https://doi.org/10.3390/s22145150 - 8 Jul 2022
Cited by 4 | Viewed by 1829
Abstract
Underwater wireless sensor networks (UWSNs) have emerged as the most widely used wireless network infrastructure in many applications. Sensing nodes are frequently deployed in hostile aquatic environments in order to collect data on resources that are severely limited in terms of transmission time and bandwidth. Since underwater information is very sensitive and unique, the authentication of users is very important to access the data and information. UWSNs have unique communication and computation needs that are not met by the existing digital signature techniques. As a result, a lightweight signature scheme is required to meet the communication and computation requirements. In this research, we present a Certificateless Online/Offline Signature (COOS) mechanism for UWSNs. The proposed scheme is based on the concept of a hyperelliptic curves cryptosystem, which offers the same degree of security as RSA, bilinear pairing, and elliptic curve cryptosystems (ECC) but with a smaller key size. In addition, the proposed scheme was proven secure in the random oracle model under the hyperelliptic curve discrete logarithm problem. A security analysis was also carried out, as well as comparisons with appropriate current online/offline signature schemes. The comparison demonstrated that the proposed scheme is superior to the existing schemes in terms of both security and efficiency. Additionally, we also employed the fuzzy-based Evaluation-based Distance from Average Solutions (EDAS) technique to demonstrate the effectiveness of the proposed scheme. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
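The scheme's hyperelliptic-curve arithmetic is beyond a short excerpt, but the online/offline split that motivates it can be illustrated with a toy Schnorr-style signature (a hedged sketch with deliberately tiny, insecure parameters; it is not the paper's COOS construction):

```python
import hashlib
import secrets

# Toy Schnorr-style signature over a small safe-prime group, split into an
# offline phase (done before the message is known) and a cheap online phase.
# Parameters are illustrative only -- far too small for real security.
P = 467          # safe prime: (P - 1) / 2 is prime
Q = 233          # prime order of the subgroup generated by G
G = 4            # generator of the order-Q subgroup (4 = 2^2 is a QR mod P)

def keygen():
    x = secrets.randbelow(Q - 1) + 1      # private key
    return x, pow(G, x, P)                # (private, public)

def offline_phase():
    """Expensive exponentiation, precomputed before the message arrives."""
    k = secrets.randbelow(Q - 1) + 1
    return k, pow(G, k, P)                # (nonce k, commitment r)

def online_phase(x, k, r, msg):
    """Cheap modular arithmetic only -- the part a constrained node runs."""
    e = int(hashlib.sha256(str(r).encode() + msg).hexdigest(), 16) % Q
    return r, (k + e * x) % Q             # signature (r, s)

def verify(y, msg, sig):
    r, s = sig
    e = int(hashlib.sha256(str(r).encode() + msg).hexdigest(), 16) % Q
    return pow(G, s, P) == (r * pow(y, e, P)) % P
```

The point of the split is that a sensor node can run `offline_phase` while idle, leaving only additions and multiplications modulo Q for the moment a reading must be signed.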

17 pages, 2192 KiB  
Article
SAX and Random Projection Algorithms for the Motif Discovery of Orbital Asteroid Resonance Using Big Data Platforms
by Lala Septem Riza, Muhammad Naufal Fazanadi, Judhistira Aria Utama, Khyrina Airin Fariza Abu Samah, Taufiq Hidayat and Shah Nazir
Sensors 2022, 22(14), 5071; https://doi.org/10.3390/s22145071 - 6 Jul 2022
Cited by 1 | Viewed by 1748
Abstract
The phenomenon of big data has occurred in many fields of knowledge, one of which is astronomy. One example of a large dataset in astronomy is that of numerically integrated time series asteroid orbital elements from a time span of millions to billions of years. For example, the mean motion resonance (MMR) data of an asteroid are used to find out the duration that the asteroid was in a resonance state with a particular planet. For this reason, this research designs a computational model to obtain the mean motion resonance quickly and effectively by modifying and implementing the Symbolic Aggregate Approximation (SAX) algorithm and the motif discovery random projection algorithm on big data platforms (i.e., Apache Hadoop and Apache Spark). The model consists of the following five steps: (i) saving data into the Hadoop Distributed File System (HDFS); (ii) importing files to the Resilient Distributed Datasets (RDD); (iii) preprocessing the data; (iv) calculating the motif discovery by executing the User-Defined Function (UDF) program; and (v) gathering the results from the UDF to the HDFS and the .csv file. The results indicated a very significant reduction in computational time between the use of the standalone method and the use of the big data platform. The proposed computational model obtained an average accuracy of 83%, compared with the SwiftVis software. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
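The SAX step at the heart of the pipeline can be sketched in a few lines (an assumption-labeled illustration, not the authors' Spark implementation): z-normalize the series, reduce it with Piecewise Aggregate Approximation (PAA), then map segment means to letters via Gaussian breakpoints.

```python
import statistics

# Quartile cuts of N(0, 1) give a 4-letter SAX alphabet.
BREAKPOINTS = [-0.6745, 0.0, 0.6745]
ALPHABET = "abcd"

def sax(series, n_segments):
    """Convert a numeric series to a SAX word of n_segments letters."""
    mu = statistics.fmean(series)
    sigma = statistics.pstdev(series) or 1.0   # guard constant series
    z = [(v - mu) / sigma for v in series]
    seg = len(z) // n_segments                 # assumes even division
    word = ""
    for i in range(n_segments):
        m = statistics.fmean(z[i * seg:(i + 1) * seg])    # PAA mean
        word += ALPHABET[sum(m >= b for b in BREAKPOINTS)]
    return word
```

Random-projection motif discovery then hashes these words repeatedly on random letter positions and looks for collisions, which is what the UDF in step (iv) would parallelize.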

15 pages, 1355 KiB  
Article
Cloud of Things in Crowd Engineering: A Tile-Map-Based Method for Intelligent Monitoring of Outdoor Crowd Density
by Abdullah Alamri
Sensors 2022, 22(9), 3328; https://doi.org/10.3390/s22093328 - 26 Apr 2022
Cited by 8 | Viewed by 2587
Abstract
Managing citizen and community safety is one of the most essential services that future cities will require. Crowd analysis and monitoring are also a high priority in the current COVID-19 pandemic scenario, especially because large-scale gatherings can significantly increase the risk of infection transmission. However, crowd tracking presents several complex technical challenges, including accurate people counting and privacy preservation. In this study, using a tile-map-based method, a new intelligent method is proposed which is integrated with the cloud of things and data analytics to provide intelligent monitoring of outdoor crowd density. The proposed system can detect and intelligently analyze the pattern of crowd activity to implement contingency plans, reducing accidents, ensuring public safety, and establishing a smart city. The experimental results demonstrate the feasibility of the proposed model in detecting crowd density status in real-time. It can effectively assist with crowd management tasks such as monitoring, guiding, and managing crowds to ensure safety. In addition, the proposed algorithm provides acceptable performance. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
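The tile-map idea can be sketched as bucketing positions into fixed grid cells and grading each cell's count; the tile size and thresholds below are illustrative assumptions, not values from the paper.

```python
from collections import Counter

TILE = 10.0                         # tile edge length in metres (assumed)
LEVELS = [(20, "high"), (5, "medium"), (0, "low")]   # assumed thresholds

def tile_of(x, y):
    """Map a position to its grid-tile coordinates."""
    return (int(x // TILE), int(y // TILE))

def density_map(positions):
    """Count people per tile and grade each tile's crowd level."""
    counts = Counter(tile_of(x, y) for x, y in positions)
    return {tile: next(lab for thr, lab in LEVELS if n >= thr)
            for tile, n in counts.items()}
```

Because only tile counts leave the device, no individual trajectory needs to be stored, which is one way such a design can address the privacy concern the abstract raises.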

20 pages, 9923 KiB  
Article
Individual Tree Species Classification Based on Convolutional Neural Networks and Multitemporal High-Resolution Remote Sensing Images
by Xianfei Guo, Hui Li, Linhai Jing and Ping Wang
Sensors 2022, 22(9), 3157; https://doi.org/10.3390/s22093157 - 20 Apr 2022
Cited by 18 | Viewed by 2976
Abstract
The classification of individual tree species (ITS) is beneficial to forest management and protection. Previous studies in ITS classification that are primarily based on airborne LiDAR and aerial photographs have achieved the highest classification accuracies. However, because of the complex and high cost of data acquisition, it is difficult to apply ITS classification in the classification of large-area forests. High-resolution, satellite remote sensing data have abundant sources and significant application potential in ITS classification. Based on Worldview-3 and Google Earth images, convolutional neural network (CNN) models were employed to improve the classification accuracy of ITS by fully utilizing the feature information contained in different seasonal images. Among the three CNN models, DenseNet yielded better performances than ResNet and GoogLeNet. It offered an OA of 75.1% for seven tree species using only the WorldView-3 image and an OA of 78.1% using the combinations of WorldView-3 and autumn Google Earth images. The results indicated that Google Earth images with suitable temporal detail could be employed as auxiliary data to improve the classification accuracy. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

13 pages, 7893 KiB  
Article
Integrated IoT-Based Secure and Efficient Key Management Framework Using Hashgraphs for Autonomous Vehicles to Ensure Road Safety
by Sudan Jha, Nishant Jha, Deepak Prashar, Sultan Ahmad, Bader Alouffi and Abdullah Alharbi
Sensors 2022, 22(7), 2529; https://doi.org/10.3390/s22072529 - 25 Mar 2022
Cited by 15 | Viewed by 3877
Abstract
Autonomous vehicles offer various advantages to both vehicle owners and automobile companies. However, despite the advantages, there are various risks associated with these vehicles. These vehicles interact with each other by forming a vehicular network, also known as VANET, in a centralized manner. This centralized network is vulnerable to cyber-attacks which can cause data loss, resulting in road accidents. Thus, to prevent the vehicular network from being attacked and to protect the privacy of the data, key management is used. However, key management alone over a centralized network is not effective in ensuring data integrity in a vehicular network. To resolve this issue, various studies have introduced a blockchain-based approach and enabled key management over a decentralized network. This technique is also found effective in ensuring the privacy of all the stakeholders involved in a vehicular network. Furthermore, a blockchain-based key management system can also help in storing a large amount of data over a distributed network, which can encourage a faster exchange of information between vehicles in a network. However, there are certain limitations of blockchain technology that may affect the efficient working of autonomous vehicles. Most of the existing blockchain-based systems are implemented over Ethereum or Bitcoin. The transaction-processing capability of these blockchains is in the range of 5 to 20 transactions per second, whereas hashgraphs are capable of processing thousands of transactions per second as the data are processed exponentially. Furthermore, a hashgraph prevents the user from altering the order of the transactions being processed, and hashgraphs do not need high computational power to operate, which may help in reducing the overall cost of the system. Due to the advantages offered by a hashgraph, an advanced key management framework based on a hashgraph for secure communication between the vehicles is suggested in this paper. The framework is developed using the concept of Leaving of Vehicles based on a Logical Key Hierarchy (LKH) and Batch Rekeying. The system is tested and compared with other closely related systems on the basis of the transaction compilation time and change in traffic rates. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
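The Logical Key Hierarchy concept named in the abstract can be sketched independently of the hashgraph layer: group keys form a binary tree, and a leaving member invalidates only the keys on its leaf-to-root path, so O(log n) keys are rekeyed instead of O(n).

```python
# Minimal LKH sketch: members sit at the leaves of a complete binary tree
# stored in heap layout. When member `leaving_index` departs, every key on
# its path to the root (including the group key at index 0) must change.
def keys_to_replace(n_members, leaving_index):
    """Heap-style indices of keys to rekey when one member leaves."""
    node = (n_members - 1) + leaving_index   # leaf position in heap layout
    path = []
    while node > 0:
        node = (node - 1) // 2               # parent index
        path.append(node)
    return path                              # ancestors up to the group key
```

Batch rekeying, the second ingredient the paper mentions, amounts to taking the union of such paths for all vehicles that left within one interval and rekeying it once.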

17 pages, 520 KiB  
Article
Incremental Ant-Miner Classifier for Online Big Data Analytics
by Amal Al-Dawsari, Isra Al-Turaiki and Heba Kurdi
Sensors 2022, 22(6), 2223; https://doi.org/10.3390/s22062223 - 13 Mar 2022
Viewed by 2440
Abstract
Internet of Things (IoT) environments produce large amounts of data that are challenging to analyze. The most challenging aspect is reducing the quantity of consumed resources and time required to retrain a machine learning model as new data records arrive. Therefore, for big data analytics in IoT environments where datasets are highly dynamic, evolving over time, it is highly advised to adopt an online (also called incremental) machine learning model that can analyze incoming data instantaneously, rather than an offline (also called static) model, which must be retrained on the entire dataset as new records arrive. The main contribution of this paper is to introduce the Incremental Ant-Miner (IAM), a machine learning algorithm for online prediction based on one of the most well-established machine learning algorithms, Ant-Miner. The IAM classifier tackles the challenge of reducing the time and space overheads associated with classic offline classifiers when used for online prediction. IAM can be exploited in managing dynamic environments to ensure timely and space-efficient prediction, achieving high accuracy, precision, recall, and F-measure scores. To show its effectiveness, the proposed IAM was run on six different datasets from different domains, namely horse colic, credit cards, flags, ionosphere, and two breast cancer datasets. The performance of the proposed model was compared to ten state-of-the-art classifiers: naive Bayes, logistic regression, multilayer perceptron, support vector machine, K*, adaptive boosting (AdaBoost), bagging, Projective Adaptive Resonance Theory (PART), decision tree (C4.5), and random forest. The experimental results illustrate the superiority of IAM as it outperformed all the benchmarks in nearly all performance measures. Additionally, IAM only needs to be rerun on the new data increment rather than the entire big dataset on the arrival of new data records, which makes IAM more time- and resource-efficient. These results demonstrate the strong potential and efficiency of the IAM classifier for big data analytics in various areas. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
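The online-versus-offline distinction can be illustrated with a toy incremental learner (not Ant-Miner itself, whose rule induction is more involved): `partial_fit` touches only the new batch, so the cost of absorbing new records is independent of the dataset size seen so far.

```python
from collections import defaultdict

class IncrementalCounts:
    """Toy incremental classifier: per-(feature, value) class counts."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def partial_fit(self, rows, labels):
        """Absorb one new batch; never revisits earlier data."""
        for row, y in zip(rows, labels):
            for i, v in enumerate(row):
                self.counts[(i, v)][y] += 1

    def predict(self, row):
        """Vote with the accumulated counts for each feature value."""
        votes = defaultdict(int)
        for i, v in enumerate(row):
            for y, c in self.counts[(i, v)].items():
                votes[y] += c
        return max(votes, key=votes.get) if votes else None
```

An offline model would instead be rebuilt from scratch on the concatenation of all batches, which is exactly the overhead the abstract argues against.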

20 pages, 5812 KiB  
Article
Product Quality Assessment in the Internet of Things: A Consumer-Oriented Approach
by Mohammed Sharief Abdelrahman Naem, Mouloud Koudil and Zineeddine Ouldimam
Sensors 2022, 22(6), 2215; https://doi.org/10.3390/s22062215 - 12 Mar 2022
Cited by 1 | Viewed by 2874
Abstract
This paper proposes a consumer-oriented approach to IoT product recommendations. It is designed to help new consumers choose high-quality IoT products that best meet their needs. We used hybrid techniques to implement the proposed approach. Experiments were also conducted to implement an intelligent IoT marketing system at the Rehab marketplace. The system has shown good results in its performance, usability, and user satisfaction. These results confirm the applicability and effectiveness of the approach in assessing and recommending IoT products. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

23 pages, 3437 KiB  
Article
Characterization of English Braille Patterns Using Automated Tools and RICA Based Feature Extraction Methods
by Sana Shokat, Rabia Riaz, Sanam Shahla Rizvi, Inayat Khan and Anand Paul
Sensors 2022, 22(5), 1836; https://doi.org/10.3390/s22051836 - 25 Feb 2022
Cited by 6 | Viewed by 3850
Abstract
Braille is used as a mode of communication all over the world. Technological advancements are transforming the way Braille is read and written. This study developed an English Braille pattern identification system using robust machine learning techniques using the English Braille Grade-1 dataset. English Braille Grade-1 dataset was collected using a touchscreen device from visually impaired students of the National Special Education School Muzaffarabad. For better visualization, the dataset was divided into two classes as class 1 (1–13) (a–m) and class 2 (14–26) (n–z) using 26 Braille English characters. A position-free braille text entry method was used to generate synthetic data. N = 2512 cases were included in the final dataset. Support Vector Machine (SVM), Decision Trees (DT) and K-Nearest Neighbor (KNN) with Reconstruction Independent Component Analysis (RICA) and PCA-based feature extraction methods were used for Braille to English character recognition. Compared to PCA, Random Forest (RF) algorithm and Sequential methods, better results were achieved using the RICA-based feature extraction method. The evaluation metrics used were the True Positive Rate (TPR), True Negative Rate (TNR), Positive Predictive Value (PPV), Negative Predictive Value (NPV), False Positive Rate (FPR), Total Accuracy, Area Under the Receiver Operating Curve (AUC) and F1-Score. A statistical test was also performed to justify the significance of the results. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

21 pages, 47782 KiB  
Article
AFD-StackGAN: Automatic Mask Generation Network for Face De-Occlusion Using StackGAN
by Abdul Jabbar, Xi Li, Muhammad Assam, Javed Ali Khan, Marwa Obayya, Mimouna Abdullah Alkhonaini, Fahd N. Al-Wesabi and Muhammad Assad
Sensors 2022, 22(5), 1747; https://doi.org/10.3390/s22051747 - 23 Feb 2022
Cited by 8 | Viewed by 3175
Abstract
To address the problem of automatically detecting and removing the mask without user interaction, we present a GAN-based automatic approach for face de-occlusion, called Automatic Mask Generation Network for Face De-occlusion Using Stacked Generative Adversarial Networks (AFD-StackGAN). In this approach, we decompose the problem into two primary stages (i.e., Stage-I Network and Stage-II Network) and employ a separate GAN in both stages. Stage-I Network (Binary Mask Generation Network) automatically creates a binary mask for the masked region in the input images (occluded images). Then, Stage-II Network (Face De-occlusion Network) removes the mask object and synthesizes the damaged region with fine details while retaining the restored face’s appearance and structural consistency. Furthermore, we create a paired synthetic face-occluded dataset using the publicly available CelebA face images to train the proposed model. AFD-StackGAN is evaluated using real-world test images gathered from the Internet. Our extensive experimental results confirm the robustness and efficiency of the proposed model in removing complex mask objects from facial images compared to the previous image manipulation approaches. Additionally, we provide ablation studies for performance comparison between the user-defined mask and auto-defined mask and demonstrate the benefits of refiner networks in the generation process. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

18 pages, 3150 KiB  
Article
An Enhanced Intrusion Detection Model Based on Improved kNN in WSNs
by Gaoyuan Liu, Huiqi Zhao, Fang Fan, Gang Liu, Qiang Xu and Shah Nazir
Sensors 2022, 22(4), 1407; https://doi.org/10.3390/s22041407 - 11 Feb 2022
Cited by 106 | Viewed by 5657
Abstract
To address the intrusion detection problem of wireless sensor networks (WSNs), and considering their combined characteristics, we set up a corresponding intrusion detection system on the edge side through edge computing. An intrusion detection system (IDS), as a proactive network security protection technology, provides an effective defense system for the WSN. In this paper, we propose an intelligent WSN intrusion detection model that introduces the k-Nearest Neighbor (kNN) algorithm from machine learning and the arithmetic optimization algorithm (AOA) from evolutionary computation to form an edge intelligence framework that performs intrusion detection when the WSN encounters a DoS attack. In order to enhance the accuracy of the model, we use a parallel strategy to enhance the communication between the populations and use the Lévy flight strategy to adjust the optimization. The proposed PL-AOA algorithm performs well in the benchmark function test and effectively guarantees the improvement of the kNN classifier. We use Matlab2018b to conduct simulation experiments based on the WSN-DS data set and our model achieves 99% ACC, with a nearly 10% improvement compared with the original kNN when performing DoS intrusion detection. The experimental results show that the proposed intrusion detection model has good effects and practical application significance. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
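The distance-vote core of kNN that the PL-AOA optimizer would tune (its k, feature weighting, and so on) can be sketched in a few lines; the Lévy-flight optimizer itself is omitted here.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbours."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    top = [y for _, y in dists[:k]]
    return Counter(top).most_common(1)[0][0]
```

In the paper's setting, each training point would be a traffic-feature vector from WSN-DS labelled normal or DoS, and the evolutionary search would pick the configuration maximizing detection accuracy.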

22 pages, 2939 KiB  
Article
An Efficient Multilevel Probabilistic Model for Abnormal Traffic Detection in Wireless Sensor Networks
by Muhammad Altaf Khan, Moustafa M. Nasralla, Muhammad Muneer Umar, Ghani-Ur-Rehman, Shafiullah Khan and Nikumani Choudhury
Sensors 2022, 22(2), 410; https://doi.org/10.3390/s22020410 - 6 Jan 2022
Cited by 24 | Viewed by 2651
Abstract
Wireless sensor networks (WSNs) are low-cost, special-purpose networks introduced to resolve various daily life domestic, industrial, and strategic problems. These networks are deployed in such places where the repairments, in most cases, become difficult. The nodes in WSNs, due to their vulnerable nature, are always prone to various potential threats. The deployed environment of WSNs is noncentral, unattended, and administrativeless; therefore, malicious attacks such as distributed denial of service (DDoS) attacks can easily be commenced by the attackers. Most of the DDoS detection systems rely on the analysis of the flow of traffic, ultimately with a conclusion that high traffic may be due to the DDoS attack. On the other hand, legitimate users may produce a larger amount of traffic, known as the flash crowd (FC). Both DDoS and FC are considered abnormal traffic in communication networks. Detecting such abnormal traffic and then separating DDoS attacks from FC is a key challenge. This paper introduces a novel mechanism based on a Bayesian model to detect abnormal data traffic and discriminate DDoS attacks from FC in it. The simulation results prove the effectiveness of the proposed mechanism, compared with the existing systems. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
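The likelihood-comparison step of such a Bayesian discriminator can be sketched with a one-feature Gaussian model over the traffic rate (a toy illustration of the idea, not the paper's multilevel model, which uses richer features):

```python
import math
from statistics import fmean, pstdev

def fit(samples_by_class):
    """Estimate a Gaussian (mean, std) of traffic rate per class."""
    return {c: (fmean(v), pstdev(v) or 1e-9)
            for c, v in samples_by_class.items()}

def log_likelihood(x, mu, sigma):
    """Log density of N(mu, sigma) at x."""
    return (-math.log(sigma * math.sqrt(2 * math.pi))
            - ((x - mu) ** 2) / (2 * sigma ** 2))

def classify(model, rate):
    """Pick the class whose Gaussian best explains the observed rate."""
    return max(model, key=lambda c: log_likelihood(rate, *model[c]))
```

A flash crowd and a DDoS attack can both raise the rate, so a real system would add further evidence (source-address entropy, packet-size distribution) as extra likelihood terms; the comparison machinery stays the same.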

20 pages, 5862 KiB  
Article
Haptic Feedback to Assist Blind People in Indoor Environment Using Vibration Patterns
by Shah Khusro, Babar Shah, Inayat Khan and Sumayya Rahman
Sensors 2022, 22(1), 361; https://doi.org/10.3390/s22010361 - 4 Jan 2022
Cited by 21 | Viewed by 4794
Abstract
Feedback is one of the significant factors for the mental mapping of an environment. It is the communication of spatial information to blind people to perceive the surroundings. The assistive smartphone technologies deliver feedback for different activities using several feedback mediums, including voice, sonification and vibration. Researchers have proposed various solutions for conveying feedback messages to blind people using these mediums. Voice and sonification feedback are effective solutions to convey information. However, these solutions are not applicable in a noisy environment and may occupy the most important auditory sense. The privacy of a blind user can also be compromised with speech feedback. The vibration feedback could effectively be used as an alternative approach to these mediums. This paper proposes a real-time feedback system specifically designed for blind people to convey information to them based on vibration patterns. The proposed solution has been evaluated through an empirical study by collecting data from 24 blind people through a mixed-mode survey using a questionnaire. Results show that the average recognition accuracies for 10 different vibration patterns are 90%, 82%, 75%, 87%, 65%, and 70%. Full article
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

18 pages, 4364 KiB  
Article
Balancing Complex Signals for Robust Predictive Modeling
by Fazal Aman, Azhar Rauf, Rahman Ali, Jamil Hussain and Ibrar Ahmed
Sensors 2021, 21(24), 8465; https://doi.org/10.3390/s21248465 - 18 Dec 2021
Cited by 2 | Viewed by 3085
Abstract
Robust predictive modeling is the process of creating, validating, and testing models to obtain better prediction outcomes. Datasets usually contain outliers, whose trend deviates from that of most data points. Conventionally, outliers are removed from the training dataset during preprocessing, before building predictive models. Such models, however, may perform poorly on unseen testing data that contains outliers. In modern machine learning, outliers are regarded as complex signals because of their significant role and are not recommended for removal from the training dataset. Models trained in modern regimes are interpolated (overtrained) by increasing their complexity so that outliers are treated locally. However, such models become inefficient, as the inclusion of outliers requires more training, and this also compromises the models' accuracy. This work proposes a novel complex-signal-balancing technique that can be applied during preprocessing to incorporate the maximum number of complex signals (outliers) into the training dataset. The proposed approach determines the optimal value for the maximum possible inclusion of complex signals that yields the highest model performance in terms of accuracy, time, and complexity. The experimental results show that models trained after preprocessing with the proposed technique achieve higher predictive accuracy, improved execution time, and lower complexity compared with traditional predictive modeling.
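The abstract leaves the balancing procedure unspecified; a minimal sketch of the general idea is to rank points by how strongly they deviate from the bulk of the data and then include a tunable fraction of the most deviant points ("complex signals") in the training set. The median-deviation ranking and the 20% outlier share below are illustrative assumptions, not the authors' exact method; the paper's contribution is searching this inclusion fraction for the best accuracy/time trade-off.

```python
def training_set(values, inclusion_fraction, outlier_share=0.2):
    """Keep all inliers plus a chosen fraction of the most deviant points.

    values: 1-D numeric samples; inclusion_fraction in [0, 1] controls how
    many of the ranked outliers enter the training set, least extreme first.
    """
    med = sorted(values)[len(values) // 2]
    ranked = sorted(values, key=lambda v: abs(v - med))  # least to most deviant
    cut = int(len(values) * (1 - outlier_share))
    inliers, outliers = ranked[:cut], ranked[cut:]
    k = int(len(outliers) * inclusion_fraction)
    return inliers + outliers[:k]
```

A model would then be trained and evaluated at several values of `inclusion_fraction`, keeping the one with the best accuracy at acceptable training cost.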
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

16 pages, 781 KiB  
Article
Diagnostic Approach for Accurate Diagnosis of COVID-19 Employing Deep Learning and Transfer Learning Techniques through Chest X-ray Images Clinical Data in E-Healthcare
by Amin Ul Haq, Jian Ping Li, Sultan Ahmad, Shakir Khan, Mohammed Ali Alshara and Reemiah Muneer Alotaibi
Sensors 2021, 21(24), 8219; https://doi.org/10.3390/s21248219 - 9 Dec 2021
Cited by 45 | Viewed by 3726
Abstract
COVID-19 is a transmissible disease that is a leading cause of death for a large number of people worldwide. The disease, caused by SARS-CoV-2, spreads very rapidly and quickly affects the human respiratory system. It is therefore necessary to diagnose the disease at an early stage for proper treatment, recovery, and control of its spread, and an automatic diagnosis system is essential for COVID-19 detection. For diagnosing COVID-19 from chest X-ray images, methods based on artificial intelligence techniques are more effective and can diagnose the disease correctly, whereas existing diagnosis methods lack accuracy. To address this problem, we propose an efficient and accurate diagnosis model for COVID-19. In the proposed method, a two-dimensional Convolutional Neural Network (2DCNN) is designed to recognize COVID-19 from chest X-ray images. Through transfer learning (TL), the weights of a pre-trained ResNet-50 model are transferred to the 2DCNN to improve its training, and the network is fine-tuned on chest X-ray image data for the final multi-classification used to diagnose COVID-19. In addition, a data augmentation transformation (rotation) is used to increase the dataset size for effective training of the resulting R2DCNNMC model. The experimental results demonstrate that the proposed R2DCNNMC model achieves high accuracy: 98.12% classification accuracy on the CRD dataset and 99.45% on the CXI dataset, compared with baseline methods. The approach performs well and could be used for COVID-19 diagnosis in E-Healthcare systems.
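The core transfer-learning step, copying pretrained backbone weights into a target network and freezing the transferred layers for fine-tuning, can be sketched framework-agnostically with models represented as plain name-to-weights dicts. The layer names and freezing policy below are hypothetical stand-ins for loading ResNet-50 weights into the paper's 2DCNN; this is not the authors' exact implementation.

```python
def transfer_weights(pretrained, target, frozen_prefixes=("conv", "block")):
    """Copy weights whose names match between a pretrained backbone and a
    target model, and mark backbone layers as frozen so only the new head
    is updated during early fine-tuning.

    pretrained, target: dicts mapping layer name -> weight values.
    Returns the updated target and a name -> trainable? map.
    """
    trainable = {}
    for name in target:
        if name in pretrained:
            target[name] = pretrained[name]  # transferred weight
        # Backbone layers (by name prefix) stay frozen; the head trains.
        trainable[name] = not any(name.startswith(p) for p in frozen_prefixes)
    return target, trainable
```

In a real pipeline the frozen layers would later be unfrozen at a low learning rate for full fine-tuning on the chest X-ray data.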
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

18 pages, 25423 KiB  
Article
Detecting the Severity of Socio-Spatial Conflicts Involving Wild Boars in the City Using Social Media Data
by Małgorzata Dudzińska and Agnieszka Dawidowicz
Sensors 2021, 21(24), 8215; https://doi.org/10.3390/s21248215 - 8 Dec 2021
Cited by 1 | Viewed by 2717
Abstract
The encroachment of wild boars into urban areas is a growing problem, and their occurrence in cities leads to conflict situations. Socio-spatial conflicts can escalate to varying degrees. These conflicts can be assessed by analyzing spatial data concerning the affected locations and wild boar behaviors; however, the collection of such spatial data is a laborious and costly process that requires access to urban surveillance systems, in addition to regular analyses of intervention reports. The present study proposes a supporting method for assessing the risk of wild boar encroachment and socio-spatial conflict in cities. The developed approach relies on big data, namely multimedia and descriptive data available on social media, and was tested in the city of Olsztyn in Poland. The main aim of this study was to evaluate the applicability of data crowdsourced from a popular social networking site for determining the location and severity of conflicts. A photointerpretation method and the kernel density estimation (KDE) tool implemented in ArcGIS Desktop 10.7.1 were applied in the study. The proposed approach fills a gap in the application of crowdsourced data to identifying types of socio-spatial conflicts involving wild boars in urban areas. Validation of the results against reports of calls to intervention services showed the high coverage of the approach and thus the usefulness of crowdsourced data.
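Kernel density estimation turns a set of point reports into a smooth intensity surface whose peaks mark conflict hotspots. The Gaussian kernel and 100 m bandwidth below are illustrative (the study used the KDE tool in ArcGIS Desktop 10.7.1, whose defaults differ); the sketch only shows the underlying computation.

```python
import math

def kde_intensity(points, query, bandwidth=100.0):
    """Gaussian kernel density estimate at a query location, from
    crowdsourced report coordinates (planar, in metres). Each report
    contributes a Gaussian bump of the given bandwidth; the sum,
    normalized, gives a conflict-intensity surface."""
    total = 0.0
    for x, y in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    return total / (len(points) * 2 * math.pi * bandwidth ** 2)
```

Evaluating `kde_intensity` on a grid over the city and thresholding the result identifies the high-severity conflict zones to compare against intervention-call records.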
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

15 pages, 1038 KiB  
Article
RM-ADR: Resource Management Adaptive Data Rate for Mobile Application in LoRaWAN
by Khola Anwar, Taj Rahman, Asim Zeb, Inayat Khan, Mahdi Zareei and Cesar Vargas-Rosales
Sensors 2021, 21(23), 7980; https://doi.org/10.3390/s21237980 - 30 Nov 2021
Cited by 17 | Viewed by 3114
Abstract
LoRaWAN is a well-known and widely supported technology for the Internet of Things that uses an energy-efficient Adaptive Data Rate (ADR) mechanism to allocate resources (e.g., Spreading Factor (SF) and Transmit Power (TP)) to a large number of End Devices (EDs). When these EDs are mobile, fixed SF allocation is not efficient, owing to sudden changes in the link conditions between an ED and the gateway. This situation causes significant packet loss, increasing retransmissions from EDs. To address this issue, we propose a Resource Management ADR (RM-ADR) that operates on both the ED and the Network Side (NS) by considering packet transmission information and received power. Simulation results show that RM-ADR outperforms state-of-the-art ADR techniques. The findings indicate a faster convergence time, achieved by minimizing the packet loss ratio and retransmissions in a mobile LoRaWAN network environment.
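For context, the baseline that RM-ADR improves on is the LoRaWAN specification's reference network-side ADR algorithm: the link margin of recent uplinks is converted into resource-adjustment steps, spent first on lowering the SF and then on lowering transmit power. The sketch below follows that reference algorithm (the per-SF demodulation-floor SNRs are the standard LoRa values); it is not RM-ADR itself, which additionally accounts for packet loss and mobility.

```python
# Demodulation-floor SNR (dB) per spreading factor for LoRa.
REQUIRED_SNR = {12: -20.0, 11: -17.5, 10: -15.0, 9: -12.5, 8: -10.0, 7: -7.5}

def adr_adjust(snr_max, sf, tx_power_dbm, margin_db=10.0,
               min_power=2, power_step=2):
    """One network-side ADR decision: turn the link margin of the best
    recent uplink SNR into 3 dB steps, spent on SF reduction first
    (faster data rate, shorter airtime), then on TP reduction."""
    steps = int((snr_max - REQUIRED_SNR[sf] - margin_db) // 3)
    while steps > 0 and sf > 7:
        sf -= 1
        steps -= 1
    while steps > 0 and tx_power_dbm > min_power:
        tx_power_dbm -= power_step
        steps -= 1
    return sf, tx_power_dbm
```

With a mobile ED this reactive scheme lags behind rapid link changes, which is the gap the proposed RM-ADR targets.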
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)

Review

37 pages, 6012 KiB  
Review
A Comprehensive Survey on Signcryption Security Mechanisms in Wireless Body Area Networks
by Saddam Hussain, Syed Sajid Ullah, Mueen Uddin, Jawaid Iqbal and Chin-Ling Chen
Sensors 2022, 22(3), 1072; https://doi.org/10.3390/s22031072 - 29 Jan 2022
Cited by 25 | Viewed by 4025
Abstract
Wireless Body Area Networks (WBANs) are frequently depicted as a paradigm shift in healthcare from traditional to modern E-Healthcare. The patients' vital signs collected by the sensors are highly sensitive, secret, and vulnerable to numerous adversarial attacks. Since WBANs are a real-world application of the healthcare system, it is vital to ensure that the data acquired by WBAN sensors are secure and not accessible to unauthorized parties or security hazards. Effective signcryption security solutions are therefore required for the success and widespread adoption of WBANs. Over the last two decades, researchers have proposed a slew of signcryption security solutions to achieve this goal, yet the literature lacks a clear and unified study of signcryption solutions that offers a bird's-eye view of WBANs. To meet these objectives, and based on the most recent signcryption papers, we analyze WBAN communication architecture, security requirements, and the primary problems in WBANs. This survey also covers the most up-to-date signcryption security techniques in WBAN environments. By identifying and comparing all available signcryption techniques in the WBAN sector, the study will aid the academic community in understanding security problems and their causes. The goal of this survey is to provide a comparative review of the existing signcryption security solutions and to analyze the solutions proposed for WBANs. A multi-criteria decision-making approach is used for the comparative examination of the existing signcryption solutions. Furthermore, the survey highlights some of the open research issues that researchers must address to improve the security features of WBANs.
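The survey's comparison of signcryption schemes uses a multi-criteria decision-making approach; the simplest form of such an approach is a weighted-sum score over normalized criteria. The criteria names, weights, and scheme scores below are hypothetical, purely to illustrate the mechanics, and do not reflect the survey's actual criteria or rankings.

```python
def mcdm_score(alternatives, weights):
    """Weighted-sum multi-criteria decision-making score.

    alternatives: scheme name -> {criterion: normalized score in [0, 1]}
    weights: criterion -> weight (weights should sum to 1).
    Returns scheme name -> aggregate score; higher is better.
    """
    return {
        name: sum(weights[c] * scores[c] for c in weights)
        for name, scores in alternatives.items()
    }
```

Varying the weights (e.g., favoring computational cost for battery-limited body sensors versus favoring security features) changes the ranking, which is why the criteria weighting itself is part of the methodology.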
(This article belongs to the Special Issue Big Data Analytics in Internet of Things Environment)
