Cloud Computing, Big Data, and Internet of Things Technologies in Healthcare and Industry (Industry 4.0)

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (15 June 2022) | Viewed by 26546

Special Issue Editors


Prof. Dariusz Mrozek
Guest Editor
Department of Applied Informatics, Silesian University of Technology, Akademicka 16, 44-100 Gliwice, Poland
Interests: cloud computing; big data; Internet of Things; artificial intelligence; bioinformatics; data analysis; computational intelligence; health informatics; Industry 4.0

Prof. Dr. Vaidy Sunderam
Co-Guest Editor
Department of Computer Science, Emory University, 400 Dowman Dr, Atlanta, GA 30322, USA
Interests: distributed computing; data security and privacy; IoT and smart systems

Special Issue Information

Dear Colleagues,

It is our pleasure to announce the opening of a new Special Issue of the journal Applied Sciences.

The Internet of Things (IoT) is one of the new technologies that has changed the face of the modern world, yielding new possibilities but also proliferating data. IoT devices may constantly produce enormous amounts of data as streams that can be analyzed in real time and also collected for further exploration in huge data centers. Due to their scaling capabilities, these data centers are frequently located in the cloud. Cloud computing has revolutionized many fields that require ample computational power. Today, cloud platforms provide substantial support for data analyses performed in healthcare and industry, mainly by exposing scalable resources of different types. In clouds, these resources are available as services, which simplifies their allocation and release. This feature is especially useful during the analysis of large volumes of data, such as those produced by medical equipment in various types of laboratory experiments or by IoT devices. The rapid growth of data in healthcare and industry offers new opportunities, but also raises many challenges. This Special Issue intends to encompass the latest advancements concerning scalable cloud computing, Big Data, and Internet of Things solutions for processing and analyzing biomedical and industrial data. The aim of this Special Issue is to reflect the most recent developments in scalable data analysis used for solving problems in a variety of areas related to Healthcare and Industry 4.0.

The scope of the Special Issue covers the use of scalable cloud computing architectures, Big Data, and Internet of Things technologies in various problems related to those areas, including (but not limited to):

  • Cloud-based architectures or Big Data-based algorithms for data analysis and interpretation (e.g., Next Generation Sequencing data or sensor data)
  • IoT-driven health monitoring of people and production systems
  • Integration and exploration of biomedical and industrial data
  • Large-scale AI-driven analytics of data at rest and data streams
  • Scalable solutions for industrial and health informatics

Prof. Dariusz Mrozek
Prof. Dr. Vaidy Sunderam
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Cloud computing
  • Big Data
  • Internet of Things
  • Stream processing
  • Biomedical engineering
  • Health informatics
  • Industry 4.0
  • Smart factory

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Research


15 pages, 575 KiB  
Article
The FaaS-Based Cloud Agnostic Architecture of Medical Services—Polish Case Study
by Dariusz R. Augustyn, Łukasz Wyciślik and Mateusz Sojka
Appl. Sci. 2022, 12(15), 7954; https://doi.org/10.3390/app12157954 - 8 Aug 2022
Cited by 1 | Viewed by 2637
Abstract
In this paper, the authors, based on a case study of the Polish healthcare IT system being deployed to the cloud, show the possibilities for limiting the computing resource consumption of rarely used services. The architecture of today's application systems is often based on the architectural style of microservices, where individual groups of services are deployed independently of each other. This is also the case with the system under discussion. Most often, the nature of the workload of each group of services is different, which creates some challenges but also provides opportunities to optimize the consumption of computing resources, thus lowering the environmental footprint and at the same time gaining measurable financial benefits. Unlike other scaling methods, such as those based on Markov decision processes (MDPs) and reinforcement learning in particular, which focus on system load prediction, the authors propose a reactive approach in which any change in system load, even an unpredictable one, may result in a change (autoscaling) in the number of instances of computing processes, so as to adapt the system to the current demand for computing resources as soon as possible. The authors' main motivation for undertaking the study is the growing interest in implementing FaaS technology in production systems in many fields, contrasted with its relatively low adoption in the healthcare field. Thus, as part of the research conducted here, the authors propose a solution for infrequently used services that enables the so-called scale-to-zero feature using the FaaS model implemented by the Fission tool. This solution is at the same time compatible with the cloud-agnostic approach, which in turn helps avoid so-called cloud computing vendor lock-in. Using the example of the system in question, quantitative experimental results showing the savings achieved are presented, justifying this novel implementation in the field of healthcare IT systems.
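Editor's note: the scale-to-zero idea can be illustrated with a short sketch. The handler below is hypothetical, not the authors' code; it shows the kind of stateless Python function that a FaaS platform such as Fission can keep at zero instances while idle and start on demand. The endpoint, payload fields, and backing store are assumptions made for illustration only.

```python
# A minimal sketch (not the authors' implementation) of a rarely used
# medical-report lookup exposed as a FaaS function. Fission's Python
# environment invokes a module-level entry point and exposes the HTTP
# request via Flask; the report fields below are hypothetical.
import json
from flask import request  # provided by the FaaS runtime's Python environment

# Hypothetical in-memory stand-in for the service's backing store.
_REPORTS = {"P-001": {"patient": "P-001", "status": "released"}}

def main():
    # The platform keeps zero instances of this function while it is idle
    # ("scale-to-zero") and starts one on the first incoming request.
    report_id = request.args.get("report_id", "")
    report = _REPORTS.get(report_id)
    if report is None:
        return json.dumps({"error": "report not found"}), 404
    return json.dumps(report), 200
```

In Fission, such a file is typically registered with the CLI (roughly `fission function create --name reports --env python --code reports.py`) and bound to an HTTP trigger, after which the platform handles the scale-to-zero lifecycle.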

21 pages, 1203 KiB  
Article
Role of Blockchain Technology in Combating COVID-19 Crisis
by Zaina Alsaed, Raghad Khweiled, Mousab Hamad, Eman Daraghmi, Omar Cheikhrouhou, Wajdi Alhakami and Habib Hamam
Appl. Sci. 2021, 11(24), 12063; https://doi.org/10.3390/app112412063 - 17 Dec 2021
Cited by 17 | Viewed by 4242
Abstract
The COVID-19 pandemic has negatively affected many aspects of human life and various sectors, especially the health sector. These conditions led to the creation of new patterns of life that people have had to adopt to reduce the spread of the epidemic, such as committing to social distancing. Therefore, governments and technological organizations had to take advantage of technological developments in the current era to overcome the challenges created by these conditions. In this paper, we will discuss the role of blockchain in combating the COVID-19 crisis. Then, we will review recently published blockchain-based research proposals for controlling the COVID-19 pandemic. Finally, we will highlight the challenges of using blockchain to combat the COVID-19 pandemic and discuss solutions to mitigate these challenges.
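Editor's note: the tamper-evidence property that motivates such blockchain-based proposals can be shown with a tiny sketch. The code below is not from the paper; the record fields (anonymized test results) are hypothetical, and the chain is a plain hash-linked list rather than a full distributed ledger.

```python
# A minimal sketch of a hash-linked ledger: each record stores the hash of its
# predecessor, so altering any earlier record invalidates the rest of the chain.
import hashlib
import json

def _hash(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, payload: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev_hash": prev_hash}
    block["hash"] = _hash({"payload": payload, "prev_hash": prev_hash})
    chain.append(block)

def is_valid(chain: list) -> bool:
    """Recompute hashes and links; returns False if any record was altered."""
    for i, block in enumerate(chain):
        expected = _hash({"payload": block["payload"], "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Example: a hypothetical chain of anonymized test results.
ledger: list = []
append_record(ledger, {"test_id": "T-1001", "result": "negative"})
append_record(ledger, {"test_id": "T-1002", "result": "positive"})
assert is_valid(ledger)
```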

22 pages, 14647 KiB  
Article
Edge-Based Detection of Varroosis in Beehives with IoT Devices with Embedded and TPU-Accelerated Machine Learning
by Dariusz Mrozek, Rafał Górny, Anna Wachowicz and Bożena Małysiak-Mrozek
Appl. Sci. 2021, 11(22), 11078; https://doi.org/10.3390/app112211078 - 22 Nov 2021
Cited by 17 | Viewed by 3623
Abstract
One of the causes of mortality in bees is varroosis, a bee disease caused by the Varroa destructor mite. Varroa destructor mites may occur suddenly in beehives, spread across them, and impair bee colonies, which finally die. Edge IoT (Internet of Things) devices capable of processing video streams in real time, such as the one we propose, may allow for the monitoring of beehives for the presence of Varroa destructor. Additionally, centralizing the monitoring in a Cloud data center enables the prevention of the spread of this disease and reduces bee mortality through the monitoring of entire apiaries. Although there are various IoT and non-IoT systems for bee-related issues, comprehensive and technically advanced solutions for beekeeping and Varroa detection barely exist or perform mite detection only after sending the data to the data center. The latter, in turn, increases communication and storage needs, which we try to limit in our approach. In this paper, we present an innovative Edge-based IoT solution for Varroa destructor detection. The solution relies on Tensor Processing Unit (TPU) acceleration of machine learning models pre-trained in a hybrid Cloud environment for bee identification and Varroa destructor infection detection. Our experiments, performed to investigate the effectiveness and time performance of both steps, and our study of the impact of image resolution on the quality of the detection and classification processes prove that we can effectively detect the presence of varroosis in beehives in real time using Edge artificial intelligence applied to the analysis of video streams.
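Editor's note: the on-device, TPU-accelerated inference step that such an Edge pipeline relies on can be sketched as follows. This is not the authors' code; it assumes a TensorFlow Lite model compiled for the Edge TPU, the tflite_runtime package with the libedgetpu delegate, and OpenCV, and the model file name and class indices are hypothetical.

```python
# A minimal sketch of Edge TPU inference on a single video frame.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="bee_varroa_edgetpu.tflite",                      # hypothetical model file
    experimental_delegates=[load_delegate("libedgetpu.so.1")],   # offload supported ops to the TPU
)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify_frame(frame_bgr: np.ndarray) -> int:
    """Classify one frame; returns a class index (e.g., 0 = healthy bee, 1 = Varroa present)."""
    h, w = int(inp["shape"][1]), int(inp["shape"][2])
    resized = cv2.resize(frame_bgr, (w, h))                      # match the model's input size
    tensor = np.expand_dims(resized, axis=0).astype(inp["dtype"])
    interpreter.set_tensor(inp["index"], tensor)
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return int(np.argmax(scores))
```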

17 pages, 28437 KiB  
Article
Developing the Smart Sorting Screw System Based on Deep Learning Approaches
by Wan-Ju Lin, Jian-Wen Chen, Hong-Tsu Young, Che-Lun Hung and Kuan-Ming Li
Appl. Sci. 2021, 11(20), 9751; https://doi.org/10.3390/app11209751 - 19 Oct 2021
Cited by 3 | Viewed by 2627
Abstract
Deep learning has turned into a mature technique, and many researchers have applied deep learning methods to classify products into defective categories. However, due to the limitations of the devices, the images from factories cannot be used for training and inference in real time. As a result, AI technology could not be widely implemented in actual factory inspections. In this study, the proposed smart sorting screw system combines the Internet of Things technique and an anomaly network for detecting the defective region of the screw product. The proposed system has three prominent characteristics. First, the spiral screw images are stitched into a panoramic image to comprehensively detect the defective regions that appear on the screw surface. Second, the anomaly network, comprising convolutional autoencoder (CAE) and adversarial autoencoder (AAE) networks, is utilized to automatically recognize the defective areas in the absence of a defect-free image for model training. Third, the IoT technique is employed to upload the screw image to the cloud platform for model training and inference, in order to determine whether the screw product passes or fails on the production line. The experimental results show that the image stitching method can precisely merge the spiral screw images into a panoramic image. Of the two anomaly models, the AAE network obtained the best maximum IoU of 0.41 and a maximum Dice coefficient of 0.59. The proposed system is able to automatically detect a defective screw image, which helps reduce the flow of defective products and thus enhance product quality.
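Editor's note: the reconstruction-based anomaly idea behind such systems can be illustrated with a short sketch. This is not the paper's model; it is a generic convolutional autoencoder in Keras, with an illustrative patch size, where patches with high reconstruction error are flagged as candidate defects.

```python
# A minimal sketch of convolutional-autoencoder anomaly scoring for surface patches.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cae(patch_size: int = 64) -> tf.keras.Model:
    inp = layers.Input(shape=(patch_size, patch_size, 1))
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(64, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2D(1, 3, padding="same", activation="sigmoid")(x)
    model = models.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")  # trained to reconstruct normal patches
    return model

def anomaly_scores(model: tf.keras.Model, patches: np.ndarray) -> np.ndarray:
    """Per-patch reconstruction error; high values indicate candidate defective regions."""
    recon = model.predict(patches, verbose=0)
    return np.mean((patches - recon) ** 2, axis=(1, 2, 3))
```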

24 pages, 2079 KiB  
Article
A Sensor Data-Driven Decision Support System for Liquefied Petroleum Gas Suppliers
by Michał Kozielski, Joanna Henzel, Łukasz Wróbel, Zbigniew Łaskarzewski and Marek Sikora
Appl. Sci. 2021, 11(8), 3474; https://doi.org/10.3390/app11083474 - 13 Apr 2021
Cited by 1 | Viewed by 2311
Abstract
Currently, efficiency in the supply domain and the ability to make quick and accurate decisions and to assess risk properly play a crucial role. The role of a decision support system (DSS) is to support the decision-making process in the enterprise, and for this, it is not enough to have up-to-date data; reliable predictions are also necessary. Each application area has its own specificity, and so far, no dedicated DSS for liquefied petroleum gas (LPG) supply has been presented. This study presents a decision support system dedicated to supporting the LPG supply process from the perspective of gas demand analysis. This perspective includes short- and medium-term gas demand prediction, as well as the definition and monitoring of key performance indicators (KPIs). The analysis performed within the system is based exclusively on the collected sensor data; no data from any external enterprise resource planning (ERP) systems are used. Examples of forecasts and KPIs presented in the study show what kind of analysis can be implemented in the proposed system and prove its usefulness. This study, showing the overall workflow and the results for use cases that outperform typical trivial approaches, could be a valuable direction for future work in the field of LPG and other fuel supply.
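Editor's note: the kind of sensor-only analysis such a DSS performs can be sketched briefly. The snippet below is not the paper's method; column semantics, the 14-day KPI window, and the seasonal-naive baseline are assumptions chosen for illustration.

```python
# A minimal sketch: derive daily gas demand from tank-level telemetry,
# compute a "days to empty" KPI, and produce a naive short-term forecast.
import numpy as np
import pandas as pd

def daily_demand(level_readings: pd.Series) -> pd.Series:
    """level_readings: tank fill level indexed by timestamp; demand = daily drop in level."""
    daily_level = level_readings.resample("D").mean()
    return (-daily_level.diff()).clip(lower=0).dropna()

def kpi_days_to_empty(level_readings: pd.Series) -> float:
    """KPI: estimated days until the tank is empty at the recent average demand."""
    demand = daily_demand(level_readings)
    recent = demand.tail(14).mean()
    return float(level_readings.iloc[-1] / recent) if recent > 0 else float("inf")

def naive_forecast(level_readings: pd.Series, horizon_days: int = 7) -> pd.Series:
    """Seasonal-naive baseline: repeat the demand observed over the last week."""
    demand = daily_demand(level_readings)
    idx = pd.date_range(demand.index[-1] + pd.Timedelta(days=1), periods=horizon_days, freq="D")
    return pd.Series(np.resize(demand.tail(7).values, horizon_days), index=idx)
```

A baseline of this kind is typically what more advanced forecasting models in such a system are compared against.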

18 pages, 3810 KiB  
Article
A Hybrid Approach Combining R*-Tree and k-d Trees to Improve Linked Open Data Query Performance
by Yuxiang Sun, Tianyi Zhao, Seulgi Yoon and Yongju Lee
Appl. Sci. 2021, 11(5), 2405; https://doi.org/10.3390/app11052405 - 8 Mar 2021
Cited by 5 | Viewed by 2400
Abstract
The Semantic Web has recently gained traction with the use of Linked Open Data (LOD) on the Web. Although numerous state-of-the-art methodologies, standards, and technologies are applicable to the LOD cloud, many issues persist. Because the LOD cloud is based on graph-based resource description framework (RDF) triples and the SPARQL query language, we cannot directly adopt traditional techniques employed for database management systems or distributed computing systems. This paper addresses how the LOD cloud can be efficiently organized, retrieved, and evaluated. We propose a novel hybrid approach that combines the index and live exploration approaches for improved LOD join query performance. Using a two-step index structure combining a disk-based 3D R*-tree with the extended multidimensional histogram and flash-memory-based k-d trees, we can efficiently discover interlinked data distributed across multiple resources. Because this method rapidly prunes numerous false hits, the performance of join query processing is remarkably improved. We also propose a hot-cold segment identification algorithm to identify regions of high interest. The proposed method is compared with existing popular methods on real RDF datasets. Results indicate that our method outperforms the existing methods because it can quickly obtain target results by reducing unnecessary data scanning and reducing the amount of main memory required to load filtering results.
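Editor's note: the underlying idea of pruning join candidates with a spatial index can be illustrated with a much simpler stand-in. The sketch below is not the paper's R*-tree/k-d-tree hybrid; it maps RDF terms to small integer points via salted hashes and uses a single k-d tree so that a subject-object join only compares triples whose fingerprints coincide. The hashing scheme, bucket count, and radius are illustrative.

```python
# A minimal sketch of index-based pruning of RDF join candidates with a k-d tree.
import hashlib
import numpy as np
from scipy.spatial import cKDTree

def term_to_point(term: str, dims: int = 3, buckets: int = 10_000) -> np.ndarray:
    """Fingerprint a URI/literal as a small integer vector via salted hashes."""
    coords = []
    for d in range(dims):
        digest = hashlib.sha1(f"{d}:{term}".encode()).hexdigest()
        coords.append(int(digest, 16) % buckets)
    return np.array(coords, dtype=float)

def s_o_join_candidates(triples_a, triples_b):
    """Return (i, j) pairs where the object of triples_a[i] may equal the subject of triples_b[j]."""
    pts_b = np.array([term_to_point(s) for (s, _, _) in triples_b])
    tree = cKDTree(pts_b)
    candidates = []
    for i, (_, _, obj) in enumerate(triples_a):
        # Radius ~0 keeps only coinciding fingerprints, pruning most false hits early.
        for j in tree.query_ball_point(term_to_point(obj), r=0.5):
            candidates.append((i, j))
    return candidates
```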

Review


19 pages, 886 KiB  
Review
Software Tools for Conducting Real-Time Information Processing and Visualization in Industry: An Up-to-Date Review
by Regina Sousa, Rui Miranda, Ailton Moreira, Carlos Alves, Nicolas Lori and José Machado
Appl. Sci. 2021, 11(11), 4800; https://doi.org/10.3390/app11114800 - 24 May 2021
Cited by 16 | Viewed by 7079
Abstract
The processing of information in real time (through the processing of complex events) has become an essential task for the optimal functioning of manufacturing plants. Only in this way can artificial intelligence, data extraction, and even business intelligence techniques be applied and the data produced daily be used in a beneficial way, enhancing automation processes and improving service delivery. Therefore, professionals and researchers need a wide range of tools to extract, transform, and load data in real time efficiently. Ideally, the same tool should also support, or at least facilitate, the intuitive and interactive visualization of these data. The review presented in this document aims to provide an up-to-date overview of the various tools available to perform these tasks. For the selected tools, a brief description of how they work, as well as the advantages and disadvantages of their use, is presented, together with a critical analysis of their overall operation and performance. Finally, a hybrid architecture that aims to synergize all tools and technologies is presented and discussed.
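Editor's note: the extract-transform-load loop that the reviewed streaming tools implement at scale can be reduced to a few lines. The sketch below assumes a Kafka broker at localhost:9092 and the kafka-python client; the topic, field names, threshold, and sink are hypothetical and stand in for a dashboard or time-series database.

```python
# A minimal sketch of a real-time ETL step: consume machine telemetry,
# derive a simple indicator, and hand it to a downstream sink.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "machine-telemetry",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def load_to_sink(event: dict) -> None:
    # Stand-in for writing to a dashboard, database, or downstream topic.
    print(event)

for message in consumer:                                   # blocks, yielding records as they arrive
    reading = message.value                                # e.g., {"machine": "M7", "temp_c": 81.5}
    if reading.get("temp_c", 0.0) > 80.0:                  # transform: flag overheating machines
        load_to_sink({"machine": reading.get("machine"), "alert": "overheat", "temp_c": reading["temp_c"]})
```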
