Article

Predicting Future Promising Technologies Using LSTM

Department of Statistical Data Science, ICT Convergence Engineering, Anyang University, Anyang 14028, Korea
Informatics 2022, 9(4), 77; https://doi.org/10.3390/informatics9040077
Submission received: 19 August 2022 / Revised: 18 September 2022 / Accepted: 23 September 2022 / Published: 27 September 2022
(This article belongs to the Special Issue Feature Papers in Big Data)

Abstract

With advances in science and technology and changes in industry, research on promising future technologies has become increasingly important. Furthermore, with the advent of a ubiquitous and smart environment, governments and enterprises need to predict the future promising technologies on which new core technologies will be developed. Therefore, this study aimed to support the establishment of science and technology development strategies and business activities by predicting future promising technologies using big data and deep learning models. The names of the “TOP 10 Emerging Technologies” from 2018 to 2021 selected by the World Economic Forum were used as keywords. Next, patents collected from the United States Patent and Trademark Office and Science Citation Index (SCI) papers collected from the Web of Science database were analyzed using time-series forecasting. For each technology, the numbers of patents and SCI papers in 2022, 2023, and 2024 were predicted using a long short-term memory model, with the numbers of patents and SCI papers from 1980 to 2021 as input data. Promising technologies were determined based on the predicted numbers of patents and SCI papers for the next three years. Keywords characterizing these technologies were extracted by analyzing the abstracts of the patents collected for each technology and measuring the term frequency-inverse document frequency for each patent abstract. The results can help business managers make optimal decisions in the present situation and give researchers an understanding of the direction of technology development.

1. Introduction

Owing to the development of science and technology and changes in industry, research on promising future technologies has become important. With the advent of a ubiquitous and smart environment, governments and enterprises are required to predict the future promising technologies on which new important and core technologies will be developed [1]. Technology forecasting supports business managers in making optimal decisions in the present situation and helps researchers understand the direction of technology development by predicting the future in detail through quantitative techniques. Through technology forecasting, science and technology can be effectively linked with economic development by forecasting the future situation and integrating economic needs and research opportunities. Technology forecasting is thus becoming an important tool for anticipating industrial and technological development [2]. As globalization accelerates and the industrial paradigm changes rapidly, forecasting for rapidly changing important technologies has emerged in response to the needs of the private and public sectors [3,4]. With quantitative analysis techniques being applied to technology forecasting [5,6], the reliability and validity of forecasts based on papers and patents are increasing [7]. As patents and papers are representative sources of technical information, such forecasting helps mitigate the subjective bias of experts [8,9]. In recent years, scholars have begun predicting technology trends based on the numbers of published papers and patents [10]. Common methods of technology forecasting based on the numbers of papers and patents are regression, machine learning, and deep learning [11]. Deep learning has a deeper network structure than conventional machine learning, which can significantly improve prediction accuracy for problems that require complex solutions [12]. Mudassir et al. [13] used a long short-term memory (LSTM) network to forecast bitcoin price fluctuations. Cai et al. [14] forecasted wind power using a generalized regression neural network (GRNN) and showed that the deep learning approach has higher prediction accuracy. Gui and Xu [15] used a deep learning text classification model to extract relevant Science Citation Index (SCI) papers from the Web of Science database for the period 1996–2019 for topic classification and used Ensemble Empirical Mode Decomposition (EEMD) and LSTM neural networks to predict the future development of each research field. Zhou et al. [16] used a deep learning algorithm to predict emerging technologies in Gartner’s 2017 hype curve based on patent data from 2000 to 2016. Lee et al. [17] devised a deep learning model based on meta-knowledge (i.e., text information including citations, abstracts, and area codes) to predict future growth potential.
This study aimed to establish science and technology development strategies and support business activities by predicting future promising technologies using big data and deep learning models. Herein, the promising technology names of the “TOP 10 Emerging Technologies” from 2018 to 2021 selected by the World Economic Forum (WEF) are used as keywords to analyze patents collected from the United States Patent and Trademark Office (USPTO) and SCI papers collected from the Web of Science database by time-series forecasting (TSF). For each technology, the numbers of patents and SCI papers in 2022, 2023, and 2024 are predicted using an LSTM model with the numbers of patents and SCI papers collected from 1980 to 2021 as input data. Promising technologies are determined based on the predicted numbers of patents and SCI papers for the next three years. This study differs from previous works in that big data are collected from the vast databases of the USPTO and the Web of Science, and future promising technologies are derived from them using a deep learning model. Furthermore, this study aimed to extract the keywords characterizing future promising technologies; this is achieved by composing a corpus from the abstracts of the patents collected for each technology from the USPTO and calculating the term frequency-inverse document frequency (TF-IDF) of each word in the patent abstracts.

2. Data and Methods

2.1. Data

In this study, the LSTM input dataset was constructed from the number of patents registered with the USPTO between 1980 and 2021 for each technology and the number of SCI papers collected from the Web of Science database, retrieved by keyword search using the promising technology names of the “TOP 10 Emerging Technologies” from 2018 to 2021 selected by the WEF, as shown in Table 1.
Table 2 shows the total number of patents collected from the USPTO between 1980 and 2021 and the total number of SCI papers collected from the Web of Science database for each technology in Table 1. The input dataset of the LSTM model was created from the numbers of patents and SCI papers collected per technology and year as follows. To predict the number of patents for the next three years for each technology, a vector ((t, patent_num_t), (t+1, patent_num_{t+1}), …, (t+9, patent_num_{t+9})) of length 10 was constructed for each year t = 1980, 1981, …, 2012.
Similarly, to predict the number of SCI papers for the next three years for each technology, a vector ((t, paper_num_t), (t+1, paper_num_{t+1}), …, (t+9, paper_num_{t+9})) of length 10 was constructed for each year t = 1980, 1981, …, 2012. The LSTM model was trained to predict the numbers of patents for the next three years, ((t+10, patent_num_{t+10}), (t+11, patent_num_{t+11}), (t+12, patent_num_{t+12})), from the input ((t, patent_num_t), (t+1, patent_num_{t+1}), …, (t+9, patent_num_{t+9})), and the numbers of SCI papers for the next three years, ((t+10, paper_num_{t+10}), (t+11, paper_num_{t+11}), (t+12, paper_num_{t+12})), from the input ((t, paper_num_t), (t+1, paper_num_{t+1}), …, (t+9, paper_num_{t+9})). A technology whose predicted numbers of patents and SCI papers for the next three years increase relative to 2021 is considered a promising technology in the future. To extract keywords characterizing future promising technologies, the abstract of each patent was treated as one document for each technology and the set of abstracts as a corpus; the term frequency (TF), document frequency (DF), and term frequency-inverse document frequency (TF-IDF) were then calculated through data mining, and keywords characterizing future promising technologies were extracted from the words based on their calculated TF-IDF.
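The sliding-window scheme above can be sketched as follows; `build_training_pairs` and `counts_by_year` are illustrative names, not identifiers from the paper, and the toy series is hypothetical.

```python
def build_training_pairs(counts_by_year, start=1980, end=2012,
                         window=10, horizon=3):
    """Build (input, target) pairs: a 10-year window of (year, count)
    tuples and the counts for the following 3 years."""
    pairs = []
    for t in range(start, end + 1):
        x = [(t + i, counts_by_year[t + i]) for i in range(window)]
        y = [(t + window + j, counts_by_year[t + window + j])
             for j in range(horizon)]
        pairs.append((x, y))
    return pairs

# Toy example: a linearly growing count series from 1980 to 2024.
toy = {year: year - 1980 for year in range(1980, 2025)}
pairs = build_training_pairs(toy)
```

With t running from 1980 to 2012, the last window covers 2012–2021 and its target covers 2022–2024, matching the forecast horizon in the text.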

2.2. Model

A recurrent neural network (RNN) is a deep learning model that takes past time-series data as input and outputs future data; examples include river level prediction models [18,19], solar power generation prediction models [20,21], fine dust prediction models [22,23], energy demand prediction models [24], and stock price prediction models [25,26]. In this study, the LSTM, an RNN variant that performs well even on datasets with long-term dependencies, was used to predict promising future technologies with the dataset in Section 2.1 as input data.
The mathematical model of the LSTM is expressed as Equation (1) and illustrated in Figure 1. The output h_t, output gate o_t, new memory content c̃_t, memory cell c_t, forget gate f_t, and input gate i_t of the LSTM are given by

Output: h_t = o_t ⊙ tanh(c_t)
Output gate: o_t = σ(W_o x_t + U_o h_{t−1} + V_o c_t)
Memory cell: c_t = f_t ⊙ c_{t−1} + i_t ⊙ c̃_t
New memory content: c̃_t = tanh(W_c x_t + U_c h_{t−1})    (1)
Forget gate: f_t = σ(W_f x_t + U_f h_{t−1} + V_f c_{t−1})
Input gate: i_t = σ(W_i x_t + U_i h_{t−1} + V_i c_{t−1})

where σ is the logistic sigmoid function and ⊙ denotes element-wise multiplication.
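One step of the cell in Equation (1) can be sketched in NumPy as follows. Note that Equation (1) is a peephole-style variant (the gates see the cell state through the V matrices); the weight shapes and random initialization below are illustrative, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM step following Equation (1).
    p holds weight matrices W_* (input), U_* (recurrent), V_* (peephole)."""
    # Forget and input gates look at the previous cell state c_{t-1}
    f_t = sigmoid(p["W_f"] @ x_t + p["U_f"] @ h_prev + p["V_f"] @ c_prev)
    i_t = sigmoid(p["W_i"] @ x_t + p["U_i"] @ h_prev + p["V_i"] @ c_prev)
    # Candidate (new) memory content
    c_tilde = np.tanh(p["W_c"] @ x_t + p["U_c"] @ h_prev)
    # Updated memory cell
    c_t = f_t * c_prev + i_t * c_tilde
    # Output gate looks at the *new* cell state c_t, then the hidden state
    o_t = sigmoid(p["W_o"] @ x_t + p["U_o"] @ h_prev + p["V_o"] @ c_t)
    h_t = o_t * np.tanh(c_t)
    return h_t, c_t

# Tiny sanity check with random weights (input dim 1, hidden dim 4).
rng = np.random.default_rng(0)
p = {k: rng.normal(size=(4, 1) if k.startswith("W") else (4, 4)) * 0.1
     for k in ["W_f", "U_f", "V_f", "W_i", "U_i", "V_i",
               "W_c", "U_c", "W_o", "U_o", "V_o"]}
h, c = lstm_step(np.ones(1), np.zeros(4), np.zeros(4), p)
```

Because h_t = o_t ⊙ tanh(c_t), every component of the hidden state stays strictly inside (−1, 1).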
To improve the prediction accuracy of the LSTM, a stacked LSTM, as shown in Figure 2, was used as the model, formed by stacking two LSTM layers with a hidden size of 100. The experiment was configured as follows, and the predicted values were confirmed to converge when the number of epochs was set to 200.
Epochs: 200, Hidden size: 100, Loss function: MSE, Optimizer: SGD, Learning rate: 0.001.
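The paper does not state its implementation framework; a PyTorch sketch under the stated hyperparameters (two stacked LSTM layers, hidden size 100, MSE loss, SGD with learning rate 0.001) might look like the following, where `StackedLSTM` and the linear output head are assumptions for illustration.

```python
import torch
import torch.nn as nn

class StackedLSTM(nn.Module):
    """Two stacked LSTM layers (hidden size 100) followed by a linear
    head mapping the last hidden state to the 3 future counts."""
    def __init__(self, input_size=1, hidden_size=100, horizon=3):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):                 # x: (batch, 10, 1)
        out, _ = self.lstm(x)             # out: (batch, 10, 100)
        return self.head(out[:, -1, :])   # (batch, 3)

model = StackedLSTM()
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

# One illustrative training loop on dummy data (batch of 4 windows);
# the paper trains for 200 epochs on the real patent/paper counts.
x = torch.randn(4, 10, 1)
y = torch.randn(4, 3)
for _ in range(2):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```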

2.3. Patent Analysis Results

Technologies with fewer than three patents in the patent dataset were excluded from the analysis owing to the difficulty of constructing a sufficient training dataset for the LSTM. Table 3 shows the rate of increase, (patent_num_2024 − patent_num_2021)/4, in the number of patents in 2024 compared with 2021, based on the number of patents expected to be filed in the next three years as predicted by the LSTM model for the 16 remaining technologies. The accuracy in Table 3 was calculated as follows.
Let max denote the maximum of the number of patents patent_num_t collected for each technology over the years t = 1980, 1981, …, 2012. Both the observed numbers of patents (the LSTM input) and the predicted numbers of patents (the LSTM output) were multiplied by 100/max to normalize them to values between 0 and 100. Let difference_t be the absolute value of the difference between the normalized observed and normalized predicted numbers of patents in the same year t. The prediction accuracy was then calculated as accuracy = 100 − mean(difference_t).
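The normalization and accuracy calculation can be sketched as follows; the observed and predicted series here are hypothetical, not values from the paper.

```python
def prediction_accuracy(observed, predicted):
    """Accuracy as described in the text: scale both series to 0-100 by
    the maximum observed count, then subtract the mean absolute
    difference from 100."""
    scale = 100.0 / max(observed)
    diffs = [abs(o * scale - p * scale)
             for o, p in zip(observed, predicted)]
    return 100.0 - sum(diffs) / len(diffs)

observed = [10, 20, 40, 50]
predicted = [12, 18, 38, 48]
print(prediction_accuracy(observed, predicted))  # 96.0
```

Scaling by the per-technology maximum makes accuracies comparable across technologies whose raw patent counts differ by orders of magnitude.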
Based on the predicted increase in the number of patents over the next three years, “Augmented reality” was predicted to be the most promising technology in the future, followed by “Plasmonic materials,” “Virtual patients,” “Spatial computing,” “Quantum sensing,” “Social robots,” “Personalized medicine,” and so on. Figure 3 shows the input data and the predicted number of patents for the future promising technologies. Since the model used in this study predicts from past data, it tends to underestimate when a sudden increase over a short period is observed in the input data.
The numbers of patents for the next three years predicted by the LSTM model from the previous 10 years of input data tended to be conservative, varying less than the numbers of patents actually observed.
In this study, the abstracts of the patents collected for each technology were used to compose a corpus, from which three quantities were calculated for each word appearing in the corpus: the TF; the DF, i.e., the number of documents in which the word appears; and the TF-IDF [28], a statistic indicating how important the word is within the corpus. From these, the keywords characterizing future promising technologies were extracted. Table 4 shows the keywords extracted based on the TF-IDF for two of the technologies. Keywords for the remaining technologies are not shown in Table 4 but are included in the Supplementary Data in the online resource.
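A minimal sketch of the TF/DF/TF-IDF computation over a corpus of abstracts follows. The tokenization here is a naive lowercase split, and the paper does not specify its preprocessing or exact IDF variant, so a plain TF × ln(N/DF) weighting is assumed; the corpus strings are hypothetical.

```python
import math
from collections import Counter

def tfidf_keywords(abstracts, top_k=5):
    """Score each word by TF * ln(N / DF), where TF is the word's total
    count over the corpus and DF is the number of abstracts (documents)
    containing it; return the top_k highest-scoring words."""
    docs = [abstract.lower().split() for abstract in abstracts]
    n_docs = len(docs)
    tf = Counter(word for doc in docs for word in doc)
    df = Counter(word for doc in docs for word in set(doc))
    scores = {w: tf[w] * math.log(n_docs / df[w]) for w in tf}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

corpus = ["plasmonic waveguide layer", "metal layer structure",
          "plasmonic metal surface"]
print(tfidf_keywords(corpus, top_k=3))
```

Words that appear in every document get an IDF of ln(1) = 0, so corpus-wide boilerplate terms are suppressed regardless of their raw frequency.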

2.4. Results of SCI Paper Analysis

Among the SCI paper data in Section 2.1, technologies with fewer than four published papers were excluded from the analysis owing to the difficulty of constructing a sufficient training dataset for the LSTM. Table 5 shows the rate of increase, (paper_num_2024 − paper_num_2021)/4, in the number of papers in 2024 compared with 2021, based on the number of papers predicted to be published in the next three years by the LSTM model for the 32 remaining technologies. The accuracy in Table 5 was calculated in the same way as in the patent analysis.
Based on the predicted increase in the number of SCI papers over the next three years, the most promising technology was predicted to be “Augmented reality,” followed by “Spatial computing,” “Digital medicine,” “Virtual patients,” “Social robots,” “Plasmonic materials,” “Quantum sensing,” and so on. Figure 4 shows the input data and the predicted number of SCI papers for the future promising technologies. As in Figure 3, when a sudden increase over a short period is observed in the input data, the predicted value tends to be underestimated.
As shown in Table 6, the technologies predicted by the LSTM to grow in both number of patents and SCI papers in the next three years included: “Augmented reality,” “Spatial computing,” “Digital medicine,” “Virtual patients,” “Social robots,” “Plasmonic materials,” “Quantum sensing,” “Electric aviation,” “Green hydrogen,” “Personalized medicine,” “Gene drive,” “Electroceuticals,” and “Whole-genome synthesis.” Figure 5 shows a graph of the predicted number of patents and SCI papers for the top 10 technologies with a high growth rate among these 13 technologies.

3. Conclusions

This study used the promising technology names of the “TOP 10 Emerging Technologies” from 2018 to 2021 selected by the WEF as keywords to analyze the patents collected from the USPTO and the SCI papers collected from the Web of Science database by TSF. Using the numbers of patents and SCI papers collected for 40 technologies as input data, the numbers of patents and SCI papers for the next three years were predicted for each technology using a two-layer LSTM model. Promising technologies were derived from the rates of increase of the predicted numbers of patents and SCI papers. This study is meaningful in that it identifies promising technologies with an average accuracy of 86.42% using a deep learning model across two databases and 40 broad technologies.
The 13 technologies predicted to grow in both the number of patents and the number of SCI papers in the next three years, namely, “Augmented reality,” “Spatial computing,” “Digital medicine,” “Virtual patients,” “Social robots,” “Plasmonic materials,” “Quantum sensing,” “Electric aviation,” “Green hydrogen,” “Personalized medicine,” “Gene drive,” “Electroceuticals,” and “Whole-genome synthesis,” can be considered future promising technologies. Using these results, business managers can make optimal decisions in the present situation and researchers can understand the direction of technology development. Through technological forecasting, science and technology can be effectively linked with economic development by forecasting the future situation more accurately and integrating economic needs and research opportunities.
Furthermore, this study differs from other studies in that keywords characterizing future promising technologies were extracted by calculating the TF-IDF of each word in a patent abstract by using the abstracts of patent data collected for each technology from the USPTO to compose the corpus.
In the future, to determine promising technologies, a wider database will be built and other models that can further improve prediction accuracy will be investigated.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/informatics9040077/s1. The Supplementary Data analyzed during this study are available from the online resource (https://github.com/shnoh92/future-promising-technologies (accessed on 18 September 2022)).

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The code and datasets used and analyzed during this study can be obtained from the online resource (https://github.com/shnoh92/future-promising-technologies (accessed on 18 September 2022)).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Miles, I. The development of technology foresight: A review. Technol. Forecast. Soc. Chang. 2010, 77, 1448–1456. [Google Scholar] [CrossRef]
  2. Firat, A.K.; Woon, W.L.; Madnick, S. Technological Forecasting—A Review; Composite Information Systems Laboratory, Massachusetts Institute of Technology: Cambridge, MA, USA, 2008. [Google Scholar]
  3. Grupp, H.; Linstone, H.A. National technology foresight activities around the globe. Technol. Forecast. Soc. Chang. 1999, 60, 85–94. [Google Scholar] [CrossRef]
  4. Pietrobelli, C.; Puppato, F. Technology foresight and industrial strategy. Technol. Forecast. Soc. Chang. 2016, 110, 117–125. [Google Scholar] [CrossRef]
  5. Cho, Y.; Yoon, S.-P.; Kim, K.-S. An industrial technology roadmap for supporting public R&D planning. Technol. Forecast. Soc. Chang. 2016, 107, 1–12. [Google Scholar] [CrossRef]
  6. Barnes, S.J.; Mattsson, J. Understanding current and future issues in collaborative consumption: A four-stage Delphi study. Technol. Forecast. Soc. Chang. 2016, 104, 200–211. [Google Scholar] [CrossRef]
  7. Yoon, J.; Kim, K. Identifying rapidly evolving technological trends for R&D planning using SAO-based semantic patent networks. Scientometrics 2011, 88, 213–228. [Google Scholar] [CrossRef]
  8. Yeo, W.; Kim, S.; Park, H.; Kang, J. A bibliometric method for measuring the degree of technological innovation. Technol. Forecast. Soc. Chang. 2015, 95, 152–162. [Google Scholar] [CrossRef]
  9. Jun, S. A forecasting model for technological trend using unsupervised learning. In Communications in Computer and Information Science; Springer: Berlin, Germany, 2011; pp. 51–60. [Google Scholar] [CrossRef]
  10. Furukawa, T.; Mori, K.; Arino, K.; Hayashi, K.; Shirakawa, N. Identifying the evolutionary process of emerging technologies: A chronological network analysis of World Wide Web conference sessions. Technol. Forecast. Soc. Chang. 2015, 91, 280–294. [Google Scholar] [CrossRef]
  11. Chen, Y.; Luh, P.B.; Guan, C.; Zhao, Y.; Michel, L.D.; Coolbeth, M.A.; Friedland, P.B.; Rourke, S.J. Short-term load forecasting: Similar day-based wavelet neural networks. IEEE Trans. Power Syst. 2010, 25, 322–330. [Google Scholar] [CrossRef]
  12. Zhou, M.; Wang, B.; Guo, S.; Watada, J. Multi-objective prediction intervals for wind power forecast based on deep neural networks. Inf. Sci. 2021, 550, 207–220. [Google Scholar] [CrossRef]
  13. Mudassir, M.; Bennbaia, S.; Unal, D.; Hammoudeh, M. Time-series forecasting of bitcoin prices using high-dimensional features: A machine learning approach. Neural Comput. Appl. 2020, 1–15. [Google Scholar] [CrossRef]
  14. Cai, H.; Wu, Z.; Huang, C.; Huang, D. Wind power forecasting based on ensemble empirical mode decomposition with generalized regression neural network based on cross-validated method. J. Electr. Eng. Technol. 2019, 14, 1823–1829. [Google Scholar] [CrossRef]
  15. Gui, M.; Xu, X. Technology forecasting using deep learning neural network: Taking the case of robotics. IEEE Access 2021, 9, 53306–53316. [Google Scholar] [CrossRef]
  16. Zhou, Y.; Dong, F.; Liu, Y.; Li, Z.; Du, J.F.; Zhang, L. Forecasting emerging technologies using data augmentation and deep learning. Scientometrics 2020, 123, 1–29. [Google Scholar] [CrossRef]
  17. Lee, J.Y.; Ahn, S.; Kim, D. Deep learning-based prediction of future growth potential of technologies. PLoS ONE 2021, 16, e0252753. [Google Scholar] [CrossRef]
  18. Tran, Q.K.; Song, S. Water level forecasting based on deep learning: A use case of Trinity River-Texas-The United States. J. KIISE 2020, 44, 607–612. [Google Scholar] [CrossRef]
  19. Cho, W.; Kang, D. Estimation method of river water level using LSTM. In Proceedings of the Korea Conference on Software Engineering, Busan, Korea, 20–22 December 2017; pp. 439–441. [Google Scholar]
  20. Kim, H.; Tak, H.; Cho, H. Design of photovoltaic power generation prediction model with recurrent neural network. J. KIISE 2019, 46, 506–514. [Google Scholar] [CrossRef]
  21. Son, H.; Kim, S.; Jang, Y. LSTM-based 24-h solar power forecasting model using weather forecast data. KIISE Trans. Comput. Pract. 2020, 26, 435–441. [Google Scholar] [CrossRef]
  22. Yi, H.; Bui, K.N.; Seon, C.N. A deep learning LSTM framework for urban traffic flow and fine dust prediction. J. KIISE 2020, 47, 292–297. [Google Scholar] [CrossRef]
  23. Jo, S.; Jeong, M.; Lee, J.; Oh, I.; Han, Y. Analysis of Correlation of Wind Direction/Speed and Particulate Matter (PM10) and Prediction of Particulate Matter Using LSTM. In Proceedings of the Korea Computer Congress, Busan, Korea, 2–4 July 2020; pp. 1649–1651. [Google Scholar]
  24. Munir, M.S.; Abedin, S.F.; Alam, G.R.; Kim, D.H.; Hong, C.S. RNN based energy demand prediction for smart-home in smart-grid framework. In Proceedings of the Korea Conference on Software Engineering, Busan, Korea, 20–22 December 2017; pp. 437–439. [Google Scholar]
  25. Kwon, D.; Kwon, S.; Byun, J.; Kim, M. Forecasting KOSPI Index with LSTM deep learning model using COVID-19 data. In Proceedings of the Korea Conference on Software Engineering, Seoul, Korea, 5–11 October 2020; Volume 270, pp. 1367–1369. [Google Scholar]
  26. Fischer, T.; Krauss, C. Deep learning with long short-term memory networks for financial market predictions. Eur. J. Oper. Res. 2018, 270, 654–669. [Google Scholar] [CrossRef]
  27. Noh, S.-H. Analysis of gradient vanishing of RNNs and performance comparison. Information 2021, 12, 442. [Google Scholar] [CrossRef]
  28. Wu, H.; Salton, G. A comparison of search term weighting: Term relevance vs. inverse document frequency. ACM SIGIR Forum 1981, 16, 30–39. [Google Scholar] [CrossRef]
Figure 1. Long short-term memory (LSTM) model (source: [27]).
Figure 2. Architecture of stacked long short-term memory (LSTM): patent_num_t is replaced by paper_num_t for SCI paper analysis.
Figure 3. Number of patents in the input data and the predicted number of patents by year for the technologies predicted as the future promising technologies.
Figure 4. Number of Science Citation Index (SCI) papers in the input data and the predicted number of SCI papers per year for the technology predicted as a future promising technology.
Figure 5. Predicted number of patents and predicted number of Science Citation Index (SCI) papers for the ten derived future promising technologies: A, SP, D, V, SO, PL, Q, E, G, and PE refer to “Augmented reality,” “Spatial computing,” “Digital medicine,” “Virtual patients,” “Social robots,” “Plasmonic materials,” “Quantum sensing,” “Electric aviation,” “Green hydrogen,” and “Personalized medicine,” respectively. In the patent number prediction graph, the number of patents for augmented reality exceeded 1000 and therefore does not appear. The graph of the predicted number of patents for augmented reality is shown in Figure 3.
Table 1. World Economic Forum “Top 10 Emerging Technologies” (2018–2021).
No. | 2018 | 2019 | 2020 | 2021
1 | Augmented reality | Bioplastics for a circular economy | Microneedles for painless injections and tests | Decarbonization rises
2 | Personalized medicine | Social robots | Sun-powered chemistry | Crops that self-fertilize
3 | AI-led molecular design | Lenses for miniature devices | Virtual patients | Breath sensors diagnose disease
4 | More capable digital helpers | Disordered proteins as drug targets | Spatial computing | On-demand drug manufacturing
5 | Implantable drug-making cells | Smarter fertilizers can reduce environmental contamination | Digital medicine | Energy from wireless signals
6 | Gene drive | Collaborative telepresence | Electric aviation | Engineering better ageing
7 | Algorithm for quantum computers | Advanced food tracking and packing | Lower-carbon cement | Green ammonia
8 | Plasmonic materials | Safer nuclear reactors | Quantum sensing | Biomarker devices go wireless
9 | Lab-grown meat | DNA data for storage | Green hydrogen | Houses printed with local materials
10 | Electroceuticals | Utility-scale storage of renewable energy | Whole-genome synthesis | Space connects the globe
Table 2. Number of patents and papers collected for technology in Table 1 (the left shows the number of patents and the right shows the number of papers).
No. | 2018 | 2019 | 2020 | 2021
1 | 41,088/16,204 | 0/234 | 0/5 | 0/25
2 | 134/745 | 91/4267 | 0/1 | 0/3
3 | 0/14 | 0/31 | 91/8923 | 0/16
4 | 0/51 | 0/11 | 209/46,898 | 0/13
5 | 0/5 | 0/124 | 49/16,237 | 0/0
6 | 152/737 | 6/154 | 7/1594 | 0/0
7 | 2/7 | 0/6 | 0/32 | 12/455
8 | 360/1735 | 1/980 | 279/4732 | 0/0
9 | 0/0 | 18/128 | 40/3314 | 0/0
10 | 27/14 | 0/36 | 5/27 | 0/0
Sum | 41,763/19,512 | 116/5971 | 680/81,763 | 12/512 | Total: 42,571/107,758
Table 3. Rate of increase and prediction accuracy of the predicted number of patents by technology.
No. | Technology | Rate of Increase | Accuracy (%)
1 | Augmented reality | 106.8433 | 78.62
2 | Collaborative telepresence | 0.049675 | 73.44
3 | Digital medicine | 0.2259 | 94.50
4 | DNA data storage | 0.00825 | 96.85
5 | Electric aviation | 0.037975 | 82.18
6 | Electroceuticals | 0.40255 | 93.50
7 | Gene drive | 0.092075 | 94.76
8 | Green ammonia | −0.00878 | 79.60
9 | Green hydrogen | 0.22745 | 90.70
10 | Personalized medicine | 0.447675 | 94.41
11 | Plasmonic materials | 2.45425 | 95.53
12 | Quantum sensing | 1.8588 | 97.55
13 | Social robots | 0.82 | 92.17
14 | Spatial computing | 2.295025 | 94.50
15 | Virtual patients | 2.4056 | 92.62
16 | Whole-genome synthesis | 0.0077 | 89.36
Table 4. Keywords extracted based on TF-IDF.
(a) Keywords of Plasmonic Materials
Word | TF | DF | TF_IDF
Layer | 355 | 103 | 442.77
Material | 353 | 124 | 375.36
Surface | 326 | 130 | 331.36
Structure | 158 | 49 | 312.78
Peg | 133 | 36 | 303.34
Light | 194 | 78 | 295.31
Optical | 158 | 59 | 283.97
Region | 108 | 26 | 280.35
Waveguide | 126 | 50 | 246.94
Plasmonic | 206 | 111 | 241.67
Metal | 105 | 39 | 231.29
Devices | 166 | 91 | 227.40
Transducer | 186 | 106 | 226.70
Least | 179 | 102 | 224.99
Positioned | 109 | 46 | 222.52
Magnetic | 105 | 43 | 221.28
Portion | 114 | 51 | 221.21
Substrate | 150 | 84 | 217.35
Nft | 109 | 50 | 213.62
Dielectric | 102 | 44 | 212.67
Nearfield | 100 | 48 | 199.98
Oxide | 69 | 21 | 193.24
Configured | 105 | 59 | 188.72
Device | 115 | 74 | 181.03
Field | 131 | 90 | 180.88
Film | 70 | 29 | 174.33
Thereof | 73 | 33 | 172.67
Conductive | 70 | 30 | 172.04
Electromagnetic | 66 | 26 | 171.32
(b) Keywords of Quantum Sensing
Word | TF | DF | TF_IDF
Layer | 203 | 45 | 369.53
Material | 285 | 93 | 315.12
Light | 181 | 54 | 297.14
Quantum | 152 | 42 | 286.94
Diamond | 150 | 41 | 286.70
Optical | 180 | 59 | 279.83
Magnetic | 171 | 67 | 244.44
Spin | 80 | 13 | 240.79
Field | 122 | 47 | 216.89
Excitation | 97 | 32 | 208.79
Device | 183 | 90 | 208.27
Semiconductor | 132 | 58 | 207.43
Nanometers | 48 | 4 | 193.90
Configured | 124 | 59 | 192.77
Signal | 102 | 42 | 192.55
Region | 72 | 20 | 187.52
Substrate | 105 | 49 | 182.38
Source | 98 | 44 | 180.55
Surface | 90 | 41 | 172.02
Frequency | 83 | 39 | 162.69
Defect | 61 | 19 | 161.85
Diode | 55 | 14 | 161.75
Detector | 97 | 56 | 155.77
System | 87 | 53 | 144.42
Sensor | 61 | 26 | 143.54
Magnetooptical | 46 | 12 | 141.87
Unit | 41 | 8 | 141.52
Micro | 44 | 11 | 139.22
Array | 57 | 25 | 136.28
TF, term frequency; DF, document frequency; TF_IDF, term frequency-inverse document frequency.
Table 5. Rate of increase and prediction accuracy of the predicted number of Science Citation Index (SCI) papers by technology.
No. | Technology | Rate of Increase | Accuracy (%)
1 | Advanced food tracking and packaging | 0.039575 | 84.32
2 | AI-led molecular design | 0.022825 | 91.39
3 | Algorithms for quantum computers | 0.013575 | 82.32
4 | Augmented reality | 78.9371 | 81.91
5 | Bioplastics for a circular economy | 2.066675 | 94.70
6 | Breath sensors diagnose disease | 0.015575 | 74.27
7 | Collaborative telepresence | −0.23878 | 92.32
8 | Decarbonization rises | −0.04815 | 88.86
9 | Digital medicine | 39.60455 | 85.55
10 | Disordered proteins as drug targets | 0.003725 | 76.20
11 | DNA data for storage | −0.31628 | 91.49
12 | Electric aviation | 3.43585 | 76.81
13 | Electroceuticals | 0.094375 | 90.43
14 | Gene drive | 0.116 | 81.29
15 | Green ammonia | 1.25715 | 94.31
16 | Green hydrogen | 2.0026 | 92.00
17 | Implantable drug-making cells | 0.0007 | 73.12
18 | Lower-carbon cement | 0.158525 | 84.36
19 | Microneedles for painless injections and tests | 0.0023 | 73.42
20 | More capable digital helpers | 0.33175 | 89.39
21 | On-demand drug manufacturing | 0.0144 | 88.04
22 | Personalized medicine | 1.216357 | 88.90
23 | Plasmonic materials | 6.9311 | 84.02
24 | Quantum sensing | 5.735575 | 71.52
25 | Safer nuclear reactors | −0.0415 | 84.85
26 | Smarter fertilizers can reduce environmental contamination | −0.16527 | 95.13
27 | Social robots | 9.46255 | 76.76
28 | Spatial computing | 71.04438 | 70.20
29 | Tiny lenses for miniature devices | 0.097625 | 75.01
30 | Utility-scale storage of renewable energy | −0.29128 | 93.85
31 | Virtual patients | 34.59743 | 93.26
32 | Whole-genome synthesis | 0.053025 | 87.81
Table 6. Technologies predicted to grow in both number of patents and Science Citation Index (SCI) papers.
Technology | Rate of Increase (SCI Papers) | Rate of Increase (Patents)
Augmented reality | 78.9371 | 106.8433
Spatial computing | 71.04438 | 2.295025
Digital medicine | 39.60455 | 0.2259
Virtual patients | 34.59743 | 2.4056
Social robots | 9.46255 | 0.82
Plasmonic materials | 6.9311 | 2.45425
Quantum sensing | 5.735575 | 1.8588
Electric aviation | 3.43585 | 0.037975
Green hydrogen | 2.0026 | 0.22745
Personalized medicine | 1.216357 | 0.447675
Gene drive | 0.116 | 0.092075
Electroceuticals | 0.094375 | 0.40255
Whole-genome synthesis | 0.053025 | 0.0077
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
