Application of Time Series Analysis and Forecasting in Computer Science

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (25 September 2024) | Viewed by 16,631

Special Issue Editors


Dr. Hao Xue
Guest Editor
School of Computer Science and Engineering, University of New South Wales, Sydney 2052, Australia
Interests: spatio-temporal data modelling; time series forecasting; pedestrian trajectory prediction

Dr. Du Huynh
Guest Editor
Department of Computer Science and Software Engineering, The University of Western Australia, Perth 6009, Australia
Interests: computer vision; machine learning; object detection; visual tracking; image processing; pattern recognition

Special Issue Information

Dear Colleagues,

Time series analysis and forecasting have become increasingly important in recent years due to the rise of electronics, the Internet of Things (IoT), and smart sensor techniques. These advancements have led to the collection of vast amounts of time series data from various sources for different applications, such as finance, energy, healthcare, and environmental monitoring.

Machine learning and deep learning have become crucial tools in time series analysis and forecasting due to their ability to handle complex and high-dimensional data. These techniques have been used in various tasks of time series analysis, including prediction, classification, clustering, and anomaly detection. Although recent models such as LSTM and transformer-based approaches have been utilized for time series analysis and forecasting, many questions and challenges remain to be addressed in real-world application scenarios.

This Special Issue intends to bring together researchers and practitioners working on methods and techniques for time series analysis and forecasting in various application domains, such as intelligent transportation, geographic information systems, economics, finance, and environmental science. The Special Issue aims to showcase recent advances and applications of time series analysis and forecasting methodologies to real-world problems in these domains.

Dr. Hao Xue
Dr. Du Huynh
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • time series analysis
  • time series forecasting
  • deep learning
  • probabilistic forecasting
  • forecasting applications
  • temporal data modeling

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (12 papers)


Research

15 pages, 3806 KiB  
Article
Transformer-Based GAN with Multi-STFT for Rotating Machinery Vibration Data Analysis
by Seokchae Lee, Hoejun Jeong and Jangwoo Kwon
Electronics 2024, 13(21), 4253; https://doi.org/10.3390/electronics13214253 - 30 Oct 2024
Viewed by 579
Abstract
Prognostics and health management of general rotating machinery have been studied over time to improve system stability. Recently, the excellent abnormal diagnosis performance of artificial intelligence (AI) was demonstrated, and therefore, AI-based intelligent diagnosis is now being implemented in these systems. AI models are trained using large volumes of data. Therefore, we propose a transformer-based generative adversarial network (GAN) model with a multi-resolution short-time Fourier transform (multi-STFT) loss function to augment the vibration data of rotating machinery to facilitate the successful learning of deep learning models. We constructed a model with a conditional GAN structure, which is transformer based, for learning the feature points of vibration data in the time-series domain. In addition, we applied the multi-STFT loss function to capture the frequency features of the vibration data. The generated data, which adequately captured the frequency features, were used to augment the training data to improve the performance of a deep learning classifier. Furthermore, by visualizing the generated vibration data and comparing the visualizations to those of the vibration data obtained from real machinery, we demonstrated that the generated data were indistinguishable from the actual data. Full article
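To make the multi-STFT idea concrete, the sketch below (a generic illustration, not the authors' implementation) compares real and generated vibration windows through magnitude spectrograms at several FFT resolutions and averages the L1 distances; the window sizes, hop lengths, and use of an L1 distance are assumptions.

```python
import torch


def stft_magnitude(x: torch.Tensor, n_fft: int, hop: int) -> torch.Tensor:
    """Magnitude spectrogram of a batch of 1-D signals shaped (batch, time)."""
    window = torch.hann_window(n_fft, device=x.device)
    spec = torch.stft(x, n_fft=n_fft, hop_length=hop, win_length=n_fft,
                      window=window, return_complex=True)
    return spec.abs()


def multi_stft_loss(fake: torch.Tensor, real: torch.Tensor,
                    resolutions=((256, 64), (512, 128), (1024, 256))) -> torch.Tensor:
    """Average L1 distance between spectrograms at several time-frequency resolutions."""
    loss = fake.new_zeros(())
    for n_fft, hop in resolutions:
        loss = loss + (stft_magnitude(fake, n_fft, hop)
                       - stft_magnitude(real, n_fft, hop)).abs().mean()
    return loss / len(resolutions)


if __name__ == "__main__":
    real = torch.randn(4, 4096)   # stand-in for measured vibration windows
    fake = torch.randn(4, 4096)   # stand-in for generator output
    print(multi_stft_loss(fake, real))
```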

16 pages, 1550 KiB  
Article
Prediction of Sunspot Number with Hybrid Model Based on 1D-CNN, BiLSTM and Multi-Head Attention Mechanism
by Huirong Chen, Song Liu, Ximing Yang, Xinggang Zhang, Jianzhong Yang and Shaofen Fan
Electronics 2024, 13(14), 2804; https://doi.org/10.3390/electronics13142804 - 16 Jul 2024
Cited by 1 | Viewed by 686
Abstract
Sunspots have a significant impact on human activities. In this study, we aimed to improve solar activity prediction accuracy. To predict the sunspot number based on different aspects, such as extracted features and relationships among data, we developed a hybrid model that includes a one-dimensional convolutional neural network (1D-CNN) for extracting the features of sunspots and bidirectional long short-term memory (BiLSTM) embedded with a multi-head attention mechanism (MHAM) to learn the inner relationships among data and finally predict the sunspot number. We evaluated our model and several existing models according to different evaluation indicators, such as mean absolute error (MAE) and root mean square error (RMSE). Compared with the informer, stacked LSTM, XGBoost-DL, and EMD-LSTM-AM models, the RMSE and MAE of our results were more than 42.5% and 65.1% lower, respectively. The experimental results demonstrate that our model has higher accuracy than other methods. Full article
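A minimal sketch of the 1D-CNN → BiLSTM → multi-head-attention pipeline described in the abstract, assuming a univariate monthly series and one-step-ahead regression; the layer widths and head count are placeholders rather than the authors' configuration.

```python
import torch
import torch.nn as nn


class CnnBiLstmAttention(nn.Module):
    """Sketch of a 1D-CNN -> BiLSTM -> multi-head-attention regressor."""

    def __init__(self, in_channels=1, conv_channels=32, lstm_hidden=64, heads=4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.MultiheadAttention(embed_dim=2 * lstm_hidden,
                                          num_heads=heads, batch_first=True)
        self.head = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, x):                      # x: (batch, seq_len, 1)
        h = self.conv(x.transpose(1, 2))       # (batch, channels, seq_len)
        h, _ = self.bilstm(h.transpose(1, 2))  # (batch, seq_len, 2*hidden)
        h, _ = self.attn(h, h, h)              # self-attention over time steps
        return self.head(h[:, -1])             # predict the next sunspot number


if __name__ == "__main__":
    model = CnnBiLstmAttention()
    window = torch.randn(8, 36, 1)             # 8 samples of 36 monthly values
    print(model(window).shape)                 # torch.Size([8, 1])
```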

13 pages, 3573 KiB  
Article
Research on Runoff Prediction Based on Time2Vec-TCN-Transformer Driven by Multi-Source Data
by Yang Liu, Yize Wang, Xuemei Liu, Xingzhi Wang, Zehong Ren and Songlin Wu
Electronics 2024, 13(14), 2681; https://doi.org/10.3390/electronics13142681 - 9 Jul 2024
Viewed by 985
Abstract
Due to the frequent occurrence of extreme weather in recent years, accurate runoff prediction is crucial for the rational planning and management of water resources. Addressing the high uncertainty and multiple influencing factors in runoff prediction, this paper proposes a runoff prediction method driven by multi-source data. Based on multivariate observed data of runoff, water level, temperature, and precipitation, a Time2Vec-TCN-Transformer model is proposed for runoff prediction research and compared with LSTM, TCN, and TCN-Transformer models. The results show that the Time2Vec-TCN-Transformer model outperforms other models in metrics including MAE, RRMSE, MAPE, and NSE, demonstrating higher prediction accuracy and reliability. By effectively combining Time2Vec, TCN, and Transformer, the proposed model improves the MAPE for forecasting 1–4 days in the future by approximately 7% compared to the traditional LSTM model and 4% compared to the standalone TCN model, while maintaining NSE consistently between 0.9 and 1. This model can better capture the periodicity, long-term scale information, and relationships among multiple variables of runoff data, providing reliable predictive support for flood forecasting and water resources management. Full article
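The Time2Vec embedding referred to above is commonly defined as one linear component plus learned periodic (sine) components; the layer below is a generic version under that assumption and would feed the TCN/Transformer stack, which is omitted here.

```python
import torch
import torch.nn as nn


class Time2Vec(nn.Module):
    """Sketch of a Time2Vec embedding: one linear component plus periodic ones."""

    def __init__(self, out_features: int):
        super().__init__()
        self.w = nn.Parameter(torch.randn(1, out_features))
        self.b = nn.Parameter(torch.randn(out_features))

    def forward(self, t: torch.Tensor) -> torch.Tensor:   # t: (batch, seq_len, 1)
        v = t * self.w + self.b                            # (batch, seq_len, out)
        # the first feature stays linear, the rest are passed through sin()
        return torch.cat([v[..., :1], torch.sin(v[..., 1:])], dim=-1)


if __name__ == "__main__":
    t = torch.arange(0, 30, dtype=torch.float32).reshape(1, 30, 1)
    emb = Time2Vec(out_features=8)(t)
    print(emb.shape)   # torch.Size([1, 30, 8]) -> input to the TCN/Transformer stack
```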

18 pages, 3856 KiB  
Article
Hybrid Time Series Model for Advanced Predictive Analysis in COVID-19 Vaccination
by Amna Khalil, Mazhar Javed Awan, Awais Yasin, Tanzeela Kousar, Abdur Rahman and Mohamed Sebaie Youssef
Electronics 2024, 13(13), 2468; https://doi.org/10.3390/electronics13132468 - 24 Jun 2024
Viewed by 1015
Abstract
This study aims to enhance the prediction of COVID-19 vaccination trends using a novel integrated forecasting model, facilitating better public health decision-making and resource allocation during the pandemic. As the COVID-19 pandemic continues to impact global health, accurately forecasting vaccination trends is critical for effective public health response and strategy development. Traditional forecasting models often fail to capture the complex dynamics of pandemic-driven vaccination rates. The analysis utilizes a comprehensive dataset comprising over 68,487 entries, detailing daily vaccination statistics across various demographics and geographic locations. This dataset provides a robust foundation for modeling and forecasting efforts. It utilizes advanced time series analysis techniques and machine learning algorithms to accurately predict future vaccination patterns based on the Hybrid Harvest model, which combines the strengths of ARIMA and Prophet models. Hybrid Harvest exhibits superior performance, with mean-square errors (MSEs) of 0.1323, and root-mean-square errors (RMSEs) of 0.0305. Based on these results, the model is significantly more accurate than traditional forecasting methods when predicting vaccination trends. It offers significant advances in forecasting COVID-19 vaccination trends through integration of ARIMA and Prophet models. The model serves as a powerful tool for policymakers to plan vaccination campaigns efficiently and effectively. Full article
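The abstract does not spell out how the Hybrid Harvest model combines its two components, so the sketch below simply averages an ARIMA forecast and a Prophet forecast over an illustrative 14-day horizon; the column names, model orders, and combination rule are assumptions.

```python
import pandas as pd
from prophet import Prophet
from statsmodels.tsa.arima.model import ARIMA


def hybrid_forecast(daily: pd.DataFrame, horizon: int = 14) -> pd.Series:
    """daily: DataFrame with columns 'ds' (date) and 'y' (vaccinations per day)."""
    # Prophet component
    prophet = Prophet()
    prophet.fit(daily)
    future = prophet.make_future_dataframe(periods=horizon)
    prophet_part = prophet.predict(future)["yhat"].tail(horizon).to_numpy()

    # ARIMA component fitted on the same series
    arima = ARIMA(daily["y"], order=(2, 1, 2)).fit()
    arima_part = arima.forecast(steps=horizon).to_numpy()

    # assumed combination rule: plain average of the two forecasts
    return pd.Series((prophet_part + arima_part) / 2.0, name="hybrid_yhat")
```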

19 pages, 1939 KiB  
Article
Enhancing Financial Time Series Prediction with Quantum-Enhanced Synthetic Data Generation: A Case Study on the S&P 500 Using a Quantum Wasserstein Generative Adversarial Network Approach with a Gradient Penalty
by Filippo Orlandi, Enrico Barbierato and Alice Gatti
Electronics 2024, 13(11), 2158; https://doi.org/10.3390/electronics13112158 - 1 Jun 2024
Viewed by 861
Abstract
This study introduces a novel Quantum Wasserstein Generative Adversarial Network approach with a Gradient Penalty (QWGAN-GP) model that leverages a quantum generator alongside a classical discriminator to synthetically generate time series data. This approach aims to accurately replicate the statistical properties of the S&P 500 index. The synthetic data generated by this model were compared to the original series using various metrics, including Wasserstein distance, Dynamic Time Warping (DTW) distance, and entropy measures, among others. The outcomes demonstrate the model’s robustness, with the generated data exhibiting a high degree of fidelity to the statistical characteristics of the original data. Additionally, this study explores the applicability of the synthetic time series in enhancing prediction models. An LSTM (Long-Short Term Memory)-based model was developed to evaluate the impact of incorporating synthetic data on forecasting accuracy, particularly focusing on general trends and extreme market events. The findings reveal that models trained on a mix of synthetic and real data significantly outperform those trained solely on historical data, improving predictive performance. Full article
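The quantum generator itself needs a quantum-circuit library, but the gradient-penalty term that gives WGAN-GP its name is purely classical; a generic version is sketched below, with the toy critic and window length chosen only for illustration.

```python
import torch


def gradient_penalty(discriminator, real: torch.Tensor, fake: torch.Tensor) -> torch.Tensor:
    """Standard WGAN-GP penalty on interpolates between real and generated series."""
    alpha = torch.rand(real.size(0), 1, device=real.device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = discriminator(interp)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                 create_graph=True)
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()


if __name__ == "__main__":
    critic = torch.nn.Sequential(torch.nn.Linear(64, 1))   # toy critic over 64-step windows
    real, fake = torch.randn(8, 64), torch.randn(8, 64)
    print(gradient_penalty(critic, real, fake))
```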

19 pages, 2237 KiB  
Article
Short-Term Air Traffic Flow Prediction Based on CEEMD-LSTM of Bayesian Optimization and Differential Processing
by Rui Zhou, Shuang Qiu, Ming Li, Shuangjie Meng and Qiang Zhang
Electronics 2024, 13(10), 1896; https://doi.org/10.3390/electronics13101896 - 12 May 2024
Cited by 2 | Viewed by 1020
Abstract
With the rapid development of China’s civil aviation, the flow of air traffic in terminal areas is also increasing. Short-term air traffic flow prediction is of great significance for the accurate implementation of air traffic flow management. To enhance the accuracy of short-term air traffic flow prediction, this paper proposes a short-term air traffic flow prediction model based on complementary ensemble empirical mode decomposition (CEEMD) and long short-term memory (LSTM) of the Bayesian optimization algorithm and data differential processing. Initially, the model performs CEEMD on the short-term air traffic flow series. Subsequently, to improve prediction accuracy, the data differencing is employed to stabilize the time series. Finally, the smoothed sequences are, respectively, input into the LSTM network model optimized by the Bayesian optimization algorithm for prediction. After data reconstruction, the final short-term flow prediction result is obtained. The model proposed in this paper is verified by using the data from Shanghai Pudong International Airport. The results show that the evaluation indexes of the prediction accuracy and fitting degree of the model, RMSE (Root Mean Square Error), MAE (Mean Absolute Error), and R2 (Coefficient of Determination), are 0.336, 0.239, and 97.535%, respectively. Compared to other classical time-series prediction models, the prediction accuracy is greatly improved, which can provide a useful reference for short-term air traffic flow prediction. Full article
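Only the differencing step used to stabilise each decomposed component is sketched here; the CEEMD decomposition and the Bayesian-optimised LSTM would wrap around it, and the helper names and toy traffic counts are illustrative.

```python
import numpy as np


def difference(series: np.ndarray):
    """First-order differencing used to stabilise a decomposed flow component."""
    return np.diff(series), series[0]


def undifference(diffed: np.ndarray, first_value: float) -> np.ndarray:
    """Rebuild the original scale from predicted differences."""
    return np.concatenate([[first_value], first_value + np.cumsum(diffed)])


if __name__ == "__main__":
    flow = np.array([12.0, 15.0, 14.0, 18.0, 21.0])        # hypothetical hourly flight counts
    d, start = difference(flow)
    assert np.allclose(undifference(d, start), flow)        # reconstruction is lossless
```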

24 pages, 4980 KiB  
Article
Wind Power Forecasting with Machine Learning Algorithms in Low-Cost Devices
by Pablo Andrés Buestán-Andrade, Mario Peñacoba-Yagüe, Jesus Enrique Sierra-García and Matilde Santos
Electronics 2024, 13(8), 1541; https://doi.org/10.3390/electronics13081541 - 18 Apr 2024
Cited by 2 | Viewed by 1194
Abstract
The urgent imperative to mitigate carbon dioxide (CO2) emissions from power generation poses a pressing challenge for contemporary society. In response, there is a critical need to intensify efforts to improve the efficiency of clean energy sources and expand their use, including wind energy. Within this field, it is necessary to address the variability inherent to the wind resource with the application of prediction methodologies that allow production to be managed. At the same time, to extend its use, this clean energy should be made accessible to everyone, including on a small scale, boosting devices that are affordable for individuals, such as Raspberry and other low-cost hardware platforms. This study is designed to evaluate the effectiveness of various machine learning (ML) algorithms, with special emphasis on deep learning models, in accurately forecasting the power output of wind turbines. Specifically, this research deals with convolutional neural networks (CNN), fully connected networks (FC), gated recurrent unit cells (GRU), and transformer-based models. However, the main objective of this work is to analyze the feasibility of deploying these architectures on various computing platforms, comparing their performance both on conventional computing systems and on other lower-cost alternatives, such as Raspberry Pi 3, in order to make them more accessible for the management of this energy generation. Through training and a rigorous benchmarking process, considering accuracy, real-time performance, and energy consumption, this study identifies the optimal technique to accurately model such real-time series data related to wind energy production, and evaluates the hardware implementation of the studied models. Importantly, our findings demonstrate that effective wind power forecasting can be achieved on low-cost hardware platforms, highlighting the potential for widespread adoption and the personal management of wind power generation, thus representing a fundamental step towards the democratization of clean energy technologies. Full article
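In the spirit of this study, feasibility on a low-cost device can be gauged by timing a compact recurrent forecaster's inference loop; the GRU size, feature count, and 100-run loop below are assumptions rather than the authors' benchmark protocol.

```python
import time
import torch
import torch.nn as nn


class SmallGru(nn.Module):
    """Compact GRU forecaster, small enough for a Raspberry Pi class device."""

    def __init__(self, features=4, hidden=16):
        super().__init__()
        self.gru = nn.GRU(features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.gru(x)
        return self.head(out[:, -1])


if __name__ == "__main__":
    model = SmallGru().eval()
    window = torch.randn(1, 24, 4)          # 24 past steps of 4 turbine features
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(100):
            model(window)
        print(f"mean latency: {(time.perf_counter() - start) / 100 * 1e3:.2f} ms")
```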

17 pages, 11999 KiB  
Article
Edge-Bound Change Detection in Multisource Remote Sensing Images
by Zhijuan Su, Gang Wan, Wenhua Zhang, Zhanji Wei, Yitian Wu, Jia Liu, Yutong Jia, Dianwei Cong and Lihuan Yuan
Electronics 2024, 13(5), 867; https://doi.org/10.3390/electronics13050867 - 23 Feb 2024
Cited by 2 | Viewed by 1298
Abstract
Detecting changes in multisource heterogeneous images is a great challenge for unsupervised change detection methods. Image-translation-based methods, which transform two images to be homogeneous for comparison, have become a mainstream approach. However, most of them primarily rely on information from unchanged regions, resulting in networks that cannot fully capture the connection between two heterogeneous representations. Moreover, the lack of a priori information and sufficient training data makes the training vulnerable to the interference of changed pixels. In this paper, we propose an edge-oriented generative adversarial network (EO-GAN) for change detection that indirectly translates images using edge information, which serves as a core and stable link between heterogeneous representations. The EO-GAN is composed of an edge extraction network and a reconstructive network. During the training process, we ensure that the edges extracted from heterogeneous images are as similar as possible through supplemented data based on superpixel segmentation. Experimental results on both heterogeneous and homogeneous datasets demonstrate the effectiveness of our proposed method. Full article
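Edge information as a modality-independent link can be approximated with a simple gradient-based edge map; the Sobel operator and fixed threshold below are stand-ins for the learned edge extraction network described in the paper.

```python
import numpy as np
from skimage import filters


def edge_map(image: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Binary edge map from a single-band image with values scaled to [0, 1]."""
    gradient = filters.sobel(image)            # Sobel gradient magnitude
    return (gradient > threshold).astype(np.uint8)


if __name__ == "__main__":
    optical = np.random.rand(128, 128)          # stand-in for an optical band
    sar = np.random.rand(128, 128)              # stand-in for a co-registered SAR band
    # edges act as the stable link between the two heterogeneous representations
    shared_structure = edge_map(optical) & edge_map(sar)
    print(shared_structure.sum(), "pixels where both modalities agree on an edge")
```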

16 pages, 7435 KiB  
Article
A Time Series-Based Approach to Elastic Kubernetes Scaling
by Haibin Yuan and Shengchen Liao
Electronics 2024, 13(2), 285; https://doi.org/10.3390/electronics13020285 - 8 Jan 2024
Cited by 3 | Viewed by 2005
Abstract
With the increasing popularity of cloud-native architectures and containerized applications, Kubernetes has become a critical platform for managing these applications. However, Kubernetes still faces challenges when it comes to resource management. Specifically, the platform cannot achieve timely scaling of the resources of applications when their workloads fluctuate, leading to insufficient resource allocation and potential service disruptions. To address this challenge, this study proposes a predictive auto-scaling Kubernetes Operator based on time series forecasting algorithms, aiming to dynamically adjust the number of running instances in the cluster to optimize resource management. In this study, the Holt–Winter forecasting method and the Gated Recurrent Unit (GRU) neural network, two robust time series forecasting algorithms, are employed and dynamically managed. To evaluate the effectiveness, we collected workload metrics from a deployed RESTful HTTP application, implemented predictive auto-scaling, and assessed the differences in service quality before and after the implementation. The experimental results demonstrate that the predictive auto-scaling component can accurately predict the future trend of the metrics and intelligently scale resources based on the prediction results, with a Mean Squared Error (MSE) of 0.00166. Compared to the deployment using a single algorithm, the cold start time is reduced by 1 h and 41 min, and the fluctuation in service quality is reduced by 83.3%. This process effectively enhances the quality of service and offers a novel solution for resource management in Kubernetes clusters. Full article
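A Holt–Winters forecast can be converted into a replica count by dividing the predicted peak load by an assumed per-pod capacity; the seasonal period, horizon, and capacity figure below are illustrative, and the surrounding Operator and GRU components are omitted.

```python
import math
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing


def desired_replicas(requests_per_min: np.ndarray, per_pod_capacity: float,
                     horizon: int = 5, season: int = 60) -> int:
    """Forecast load with Holt-Winters and size the deployment for the predicted peak."""
    model = ExponentialSmoothing(requests_per_min, trend="add",
                                 seasonal="add", seasonal_periods=season).fit()
    predicted_peak = float(model.forecast(horizon).max())
    return max(1, math.ceil(predicted_peak / per_pod_capacity))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    history = 200 + 50 * np.sin(np.linspace(0, 12 * np.pi, 360)) + rng.normal(0, 5, 360)
    print(desired_replicas(history, per_pod_capacity=80.0))
```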

20 pages, 9318 KiB  
Article
Unsupervised Anomaly Detection of Intermittent Demand for Spare Parts Based on Dual-Tailed Probability
by Kairong Hong, Yingying Ren, Fengyuan Li, Wentao Mao and Yangshuo Liu
Electronics 2024, 13(1), 195; https://doi.org/10.3390/electronics13010195 - 2 Jan 2024
Cited by 1 | Viewed by 1302
Abstract
The quick development of machine learning techniques provides a superior capability for manufacturing enterprises to make effective decisions about inventory management based on spare parts demand (SPD) data. Since SPD sequences in practical maintenance applications usually show an intermittent distribution, it is not easy to represent the demand pattern of such sequences. Meanwhile, there are some aspects like manual report errors, environmental interference, sudden project changes, etc., that bring large and unexpected fluctuations to SPD sequences, i.e., anomalous demands. The inventory decision made based on the SPD sequences with anomalous demands is not trusted by enterprise engineers. For such SPD data, there are two great concerns, i.e., false alarms in which sparse demands are recognized to be anomalous and missing alarms in which the anomalous demands are categorized as normal due to their adjacent demands having extreme values. To address these concerns, a new unsupervised anomaly-detection method for intermittent time series is proposed based on a dual-tailed probability. First, the multi-way delay embedding transform (MDT) was applied on the raw SPD sequences to obtain higher-order tensors. Through Tucker tensor decomposition, the disturbance of extreme demands can be effectively reduced. For the reconstructed SPD sequences, then, the tail probability at each time point, as well as the empirical cumulative distribution function were calculated based on the probability of the demand occurrence. Second, to lessen the disturbance of sparse demand, the non-zero demand sequence was distilled from the raw SPD sequence, with the tail probability at each time point being calculated. Finally, the obtained dual-tailed probabilities were fused to determine the anomalous degree of each demand. The proposed method was validated on the two actual SPD datasets, which were collected from a large engineering manufacturing enterprise and a large vehicle manufacturing enterprise in China, respectively. The results demonstrated that the proposed method can effectively lower the false alarm rate and missing alarm rate with no supervised information provided. The detection results were trustworthy enough and, more importantly, computationally inexpensive, showing significant applicability to large-scale after-sales parts management. Full article
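A stripped-down version of the dual-tailed probability idea: tail probabilities from the empirical CDF of the full sequence and of its non-zero subsequence are fused by multiplication, so that smaller values indicate more anomalous demands. The MDT/Tucker smoothing step is omitted and the fusion rule is an assumption.

```python
import numpy as np


def tail_probability(series: np.ndarray) -> np.ndarray:
    """Two-sided tail probability of each value under the empirical CDF."""
    ranks = np.argsort(np.argsort(series)) + 1           # ranks 1..n
    cdf = ranks / len(series)                            # empirical CDF at each point
    return np.minimum(cdf, 1.0 - cdf + 1.0 / len(series))


def anomaly_score(demand: np.ndarray) -> np.ndarray:
    """Fuse tail probabilities of the full sequence and of its non-zero part."""
    score_full = tail_probability(demand)
    nonzero = demand[demand > 0]
    score_nonzero = np.ones_like(demand, dtype=float)
    score_nonzero[demand > 0] = tail_probability(nonzero)
    # assumed fusion rule: smaller combined tail probability = more anomalous
    return score_full * score_nonzero


if __name__ == "__main__":
    spd = np.array([0, 0, 2, 0, 1, 0, 0, 35, 0, 3], dtype=float)  # toy intermittent demand
    print(anomaly_score(spd).round(3))
```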

21 pages, 5387 KiB  
Article
Comparative Performance Analysis of RNN Techniques for Predicting Concatenated Normal and Abnormal Vibrations
by Ju-Hyung Lee and Jun-Ki Hong
Electronics 2023, 12(23), 4778; https://doi.org/10.3390/electronics12234778 - 25 Nov 2023
Cited by 1 | Viewed by 1037
Abstract
We analyze the comparative performance of predicting the transition from normal to abnormal vibration states, simulating the motor’s condition before a drone crash, by proposing a concatenated vibration prediction model (CVPM) based on recurrent neural network (RNN) techniques. Subsequently, using the proposed CVPM, the prediction performances of six RNN techniques: long short-term memory (LSTM), attention-LSTM (Attn.-LSTM), bidirectional-LSTM (Bi-LSTM), gate recurrent unit (GRU), attention-GRU (Attn.-GRU), and bidirectional-GRU (Bi-GRU), are analyzed comparatively. In order to assess the prediction accuracy of these RNN techniques in predicting concatenated vibrations, both normal and abnormal vibration data are collected from the motors connected to the drone’s propellers. Consequently, a concatenated vibration dataset is generated by combining 50% of normal vibration data with 50% of abnormal vibration data. This dataset is then used to compare and analyze vibration prediction performance and simulation runtime across the six RNN techniques. The goal of this analysis is to comparatively analyze the performances of the six RNN techniques for vibration prediction. According to the simulation results, it is observed that Attn.-LSTM and Attn.-GRU, incorporating the attention mechanism technique to focus on information highly relevant to the prediction target through unidirectional learning, demonstrate the most promising predictive performance among the six RNN techniques. This implies that employing the attention mechanism enhances the concentration of relevant information, resulting in superior predictive accuracy compared to the other RNN techniques. Full article
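The attention-LSTM variant found most accurate here can be sketched as an LSTM whose hidden states are pooled by learned attention weights before a final regression layer; the sizes below are placeholders, not the authors' settings.

```python
import torch
import torch.nn as nn


class AttentionLstm(nn.Module):
    """Sketch of an attention-LSTM forecaster for one-step vibration prediction."""

    def __init__(self, features=1, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(features, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)                 # scores each time step
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                                  # x: (batch, seq_len, features)
        out, _ = self.lstm(x)                              # (batch, seq_len, hidden)
        weights = torch.softmax(self.score(out), dim=1)    # attention over time steps
        context = (weights * out).sum(dim=1)               # weighted sum of hidden states
        return self.head(context)                          # next vibration value


if __name__ == "__main__":
    model = AttentionLstm()
    print(model(torch.randn(16, 100, 1)).shape)            # torch.Size([16, 1])
```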

19 pages, 3212 KiB  
Article
A Sales Forecasting Model for New-Released and Short-Term Product: A Case Study of Mobile Phones
by Seongbeom Hwang, Goonhu Yoon, Eunjung Baek and Byoung-Ki Jeon
Electronics 2023, 12(15), 3256; https://doi.org/10.3390/electronics12153256 - 28 Jul 2023
Cited by 5 | Viewed by 3291
Abstract
In today’s competitive market, sales forecasting of newly released and short-term products is an important challenge because there is not enough sales data. To address these challenges, we propose a sales forecasting model for new-released and short-term products and study the case of mobile phones. The main approach is to develop an integrated sales forecasting model by training the sales patterns and product characteristics of the same product category. In particular, we analyze the performance of the latest 12 machine learning models and propose the best performance model. Machine learning models have been used to compare performance through the development of Ridge, Lasso, Support Vector Machine (SVM), Random Forest, Gradient Boosting Machine (GBM), AdaBoost, LightGBM, XGBoost, CatBoost, Deep Neural Network (DNN), Recurrent Neural Network (RNN), and Long Short-Term Memory (LSTM). We apply a dataset consisting of monthly sales data of 38 mobile phones obtained in the Korean market. As a result, the Random Forest model was selected as an excellent model that outperforms other models in terms of prediction accuracy. Our model achieves remarkable results with a mean absolute percentage error (MAPE) of 42.6258, a root mean square error (RMSE) of 8443.3328, and a correlation coefficient of 0.8629. Full article
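A lag-feature Random Forest in the same spirit as the selected model might look like the sketch below; the lag count, toy launch curve, and train/test split are illustrative and do not reproduce the study's 38-phone dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_percentage_error


def make_lag_features(sales: np.ndarray, lags: int = 3):
    """Turn a monthly sales series into (lag features, next-month target) pairs."""
    X = np.array([sales[i:i + lags] for i in range(len(sales) - lags)])
    y = sales[lags:]
    return X, y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sales = 1000 * np.exp(-0.15 * np.arange(18)) + rng.normal(0, 20, 18)  # toy launch curve
    X, y = make_lag_features(sales)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:-3], y[:-3])
    print("MAPE:", mean_absolute_percentage_error(y[-3:], model.predict(X[-3:])))
```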
