Predicting Blood Glucose Concentration after Short-Acting Insulin Injection Using Discontinuous Injection Records
Abstract
1. Introduction
2. Proposed Method
2.1. Dataset Extraction and Integration
- Extract discontinuous short-acting insulin bolus injections occurring in the ICU from the electronic medical records in the MIMIC-IV database.
- Match, for each insulin injection event, the blood glucose value measured before the injection and the value recorded around the peak time of insulin efficacy.
- Query and match the relevant characteristic values for each insulin injection event as the prediction basis.
- Remove records with missing values and potentially erroneous values (a minimal extraction-and-matching sketch follows this list).
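The extraction-and-matching workflow can be illustrated with a minimal pandas sketch. The file names, column names, glucose plausibility bounds, and matching windows below are placeholders chosen for illustration; they are not the exact MIMIC-IV query used in the paper.

```python
import pandas as pd

# Discontinuous short-acting insulin boluses recorded in the ICU
# (placeholder file and column names).
insulin = pd.read_csv("icu_insulin_boluses.csv", parse_dates=["starttime"])
# Blood glucose measurements to be matched to each injection event.
glucose = pd.read_csv("icu_glucose_values.csv", parse_dates=["charttime"])

insulin = insulin.sort_values("starttime")
glucose = glucose.sort_values("charttime")

# Last glucose value measured before each injection (1 h window, illustrative).
before = pd.merge_asof(
    insulin,
    glucose.rename(columns={"glucose": "glucose_before"}),
    left_on="starttime", right_on="charttime", by="stay_id",
    direction="backward", tolerance=pd.Timedelta("1h"),
)

# Glucose value recorded around the assumed peak of insulin efficacy (~4 h later).
insulin = insulin.assign(peak_time=insulin["starttime"] + pd.Timedelta("4h"))
after = pd.merge_asof(
    insulin.sort_values("peak_time"),
    glucose.rename(columns={"glucose": "glucose_after"}),
    left_on="peak_time", right_on="charttime", by="stay_id",
    direction="nearest", tolerance=pd.Timedelta("1h"),
)

# One row per injection event with both matched glucose values.
events = before.merge(
    after[["stay_id", "starttime", "glucose_after"]],
    on=["stay_id", "starttime"],
)

# Drop events with missing matches and keep only plausible values
# (bounds are illustrative).
events = events.dropna(subset=["glucose_before", "glucose_after", "amount"])
events = events[events["glucose_before"].between(40, 500) &
                events["glucose_after"].between(10, 500)]
```

In practice, the raw events would be queried from MIMIC-IV tables such as inputevents (insulin boluses) and labevents/chartevents (glucose measurements) before a matching step of this kind.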
2.2. Neural Network Training and Hyperparameter Tuning
2.3. Network Evaluation Metrics
- MAE: mean absolute error, which is calculated as the sum of absolute errors divided by the sample size. Lower values indicate better model fitting and more accurate prediction.
- RMSE: root mean square error, which is the square root of the average squared errors. Lower values indicate better model fitting and more accurate prediction.
- R2_Score: the coefficient of determination, denoted R2 or r2, represents the proportion of the variance in the dependent variable that is predictable from the independent variables. Higher values indicate better model fitting and more accurate prediction.
- EGA [61]: Clarke error grid analysis, which quantifies the clinical accuracy of the predicted blood glucose values by subdividing the prediction results into five zones. Predictions in Zone A are clinically accurate, and those in Zone B are acceptable. Values in Zone C may prompt unnecessary corrections, values in Zone D indicate a dangerous failure to detect and treat, and values in Zone E would lead to erroneous treatment. The more predictions that fall in Zones A and B, the better the model fit (a minimal metric-computation sketch follows this list).
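The first three metrics, together with a simplified Zone-A check of the Clarke grid, can be computed as in the sketch below, assuming NumPy and scikit-learn; the full five-zone analysis follows the original definition in [61].

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def evaluate(y_true, y_pred):
    """Compute MAE, RMSE, R2, and the fraction of predictions in Clarke Zone A."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mae = mean_absolute_error(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    r2 = r2_score(y_true, y_pred)
    # Zone A of the Clarke grid: prediction within 20% of the reference value,
    # or both reference and prediction below 70 mg/dL.
    zone_a = np.mean((np.abs(y_pred - y_true) <= 0.2 * y_true) |
                     ((y_true < 70) & (y_pred < 70)))
    return {"MAE": mae, "RMSE": rmse, "R2_Score": r2, "ZoneA_%": 100 * zone_a}

# Example with a few illustrative glucose values (mg/dL).
print(evaluate(y_true=[180, 95, 250, 60], y_pred=[170, 112, 232, 66]))
```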
2.4. Machine Learning for Comparison
2.5. Model Retest
3. Experiment Results
3.1. Dataset Extracted from MIMIC-IV Database
3.2. Deep Neural Networks and Performance Evaluation
3.3. Comparison with Machine Learning
3.4. Test–Retest Reliability
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Appendix Tables
Model | Cohort | Days | Glucose Record | C | M | PH | Test Performance | Ref | Year |
---|---|---|---|---|---|---|---|---|---|
Previous related studies: | |||||||||
FCN 1 | 100 VP 2 | - | random simulation | after meal | reduce BGRI 3 | [44] | 2018 | ||
LSTM 4 | 1 T2D 5 | 30 | by SMBG 6 | ✓ | 60 min | RMSE 7≈18.79 mg/dL | [45] | 2019 | |
LSTM | 174 patients | 2∼3 | every 5 min by CGM 8 | ✓ | 30/60/120 min | RMSE≈19.44/22.86/24.48 mg/dL | [46] | 2019 | |
CNN 9 | 20 VP, 16 T1D 10 | 180 | every 5 min by CGM | ✓ | ✓ | 30/60 min | RMSE≈19.28/31.83 mg/dL | [47] | 2020 |
CRNN 11 | 10 VP, 10 T1D | 180 | every 5 min by CGM | ✓ | ✓ | 30/60 min | RMSE≈21.07/33.27 mg/dL | [48] | 2020 |
LSTM | 5 T1D | 400 | every 5 min by CGM | ✓ | ✓ | 30/60 min | RMSE≈21.4/38.0 mg/dL | [49] | 2017 |
GRU 12 | 12 T1D | 56 | every 5 min by CGM | ✓ | 30 min | RMSE≈21.54 mg/dL | [50] | 2021 | |
FCN | 25 T1D | 1∼2 | every 5 min by CGM | ✓ | 30 min | PRED-EGA 13, accuracy < 95% | [51] | 2017 | |
FCN | 27 patients | 16 | every 5 min by CGM | ✓ | ✓ | 75 min | RMSE≈43.9 mg/dL | [52] | 2011 |
Best prediction model proposed in this study: | |||||||||
LSTM | 20,426 patients | - | discrete record | about 4h | RMSE≈15.82 mg/dL | - | 2022 |
network_id: | FCN1 | number of layers: | 12 | number of parameters: | 6,306,945 |
training epochs: | 7000 | layer type: | Fully Connected Layer(units) 1 | ||
network architecture: | |||||
FCL(2048)-FCL(1536)-FCL(1024)-FCL(768)-FCL(512)-FCL(384)-FCL(256)-FCL(128)-FCL(64)-FCL(32)-FCL(16)-FCL(1) | |||||
network_id: | FCN2 | number of layers: | 10 | number of parameters: | 914,905 |
training epochs: | 4000 | layer type: | Fully Connected Layer(units) | ||
network architecture: | |||||
FCL(1024)-FCL(512)-FCL(400)-FCL(256)-FCL(128)-FCL(72)-FCL(64)-FCL(32)-FCL(16)-FCL(1) | |||||
network_id: | CNN1 | number of layers: | 12 | number of parameters: | 1,040,241 |
training epochs: | 7000 | layer type: | Convolutional Layer(filter, kernel_size) 2, Pooling Layer(pool_size) 3, Fully Connected Layer(units) | ||
network architecture: | |||||
CL(16,3)-CL(64,3)-CL(128,3)-CL(256,3)-CL(512,3)-CL(256,3)-CL(128,3)-CL(64,3)-CL(16,3)-CL(8,3)-PL(3)-FCL(1) | |||||
network_id: | CNN2 | number of layers: | 12 | number of parameters: | 5,868,897 |
training epochs: | 4000 | layer type: | Convolutional Layer(filter, kernel_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(32,3)-CL(64,3)-CL(128,3)-CL(64,3)-FCL(2048)-FCL(1024)-FCL(512)-FCL(256)-FCL(128)-FCL(64)-FCL(16)-FCL(1) | |||||
network_id: | CNN3 | number of layers: | 12 | number of parameters: | 1,040,353 |
training epochs: | 7000 | layer type: | Convolutional Layer(filter, kernel_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(16,3)-CL(64,3)-CL(128,3)-CL(256,3)-CL(512,3)-CL(256,3)-CL(128,3)-CL(64,3)-CL(16,3)-CL(8,3)-CL(4,3)-FCL(1) | |||||
network_id: | CNN4 | number of layers: | 12 | number of parameters: | 9,381,633 |
training epochs: | 4000 | layer type: | Convolutional Layer(filter, kernel_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(16,3)-CL(64,3)-CL(128,3)-FCL(2048)-FCL(1024)-FCL(512)-FCL(256)-FCL(128)-FCL(64)-FCL(32)-FCL(16)-FCL(1) | |||||
network_id: | CNN5 | number of layers: | 12 | number of parameters: | 7,753,953 |
training epochs: | 7000 | layer type: | Convolutional Layer(filter, kernel_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(32,3)-CL(64,3)-CL(128,3)-CL(64,3)-FCL(2048)-FCL(1536)-FCL(768)-FCL(384)-FCL(128)-FCL(64)-FCL(16)-FCL(1) | |||||
network_id: | CNN6 | number of layers: | 12 | number of parameters: | 1,041,937 |
training epochs: | 7000 | layer type: | Convolutional Layer(filter, kernel_size), Pooling Layer(pool_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(16,3)-CL(64,3)-CL(128,3)-CL(256,3)-CL(512,3)-CL(256,3)-CL(128,3)-CL(64,3)-CL(16,3)-PL(3)-FCL(32)-FCL(1) | |||||
network_id: | CNN7 | number of layers: | 11 | number of parameters: | 1,047,777 |
training epochs: | 4000 | layer type: | Convolutional Layer(filter, kernel_size), Fully Connected Layer(units) | ||
network architecture: | |||||
CL(32,3)-CL(64,3)-CL(128,3)-CL(256,3)-CL(512,3)-CL(256,3)-CL(128,3)-CL(64,3)-CL(32,3)-CL(16,3)-FCL(1) | |||||
network_id: | LSTM1 | number of layers: | 12 | number of parameters: | 75,779,817 |
training epochs: | 6000 | layer type: | LSTM Layer(units) 4, TimeDistributed Layer(Fully Connected Layer(units)) | ||
network architecture: | |||||
LSTM(2048)-LSTM(2048)-LSTM(1024)-LSTM(1024)-LSTM(512)-LSTM(256)-LSTM(128)-LSTM(64)-LSTM(32)-LSTM(16)-LSTM(8)-TDL(FCL(1)) |||||
network_id: | LSTM2 | number of layers: | 12 | number of parameters: | 8,523,497 |
training epochs: | 6000 | layer type: | LSTM Layer(units), Dropout Layer(dropout_rate), TimeDistributed Layer(Fully Connected Layer(units)) | ||
network architecture: | |||||
LSTM(1024)-Dropout(0.2)-LSTM(512)-Dropout(0.2)-LSTM(256)-Dropout(0.2)-LSTM(128)-LSTM(64)-LSTM(32)-LSTM(16)-LSTM(8)-TDL(FCL(1)) |||||
network_id: | LSTM3 | number of layers: | 12 | number of parameters: | 33,824,489 |
training epochs: | 3000 | layer type: | LSTM Layer(units), Dropout Layer(dropout_rate), TimeDistributed Layer(Fully Connected Layer(units)) | ||
network architecture: | |||||
LSTM(2048)-Dropout(0.2)-LSTM(1024)-Dropout(0.2)-LSTM(512)-LSTM(256)-LSTM(128)-LSTM(64)-LSTM(32)-LSTM(16)-LSTM(8)-TDL(FCL(1)) |||||
network_id: | LSTM4 | number of layers: | 12 | number of parameters: | 44,316,393 |
training epochs: | 6000 | layer type: | LSTM Layer(units), TimeDistributed Layer(Fully Connected Layer(units)) | ||
network architecture: | |||||
LSTM(2048)-LSTM(1024)-LSTM(1024)-LSTM(512)-LSTM(512)-LSTM(256)-LSTM(128)-LSTM(64)-LSTM(32)-LSTM(16)-LSTM(8)-TDL(FCL(1)) |||||
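As an illustration of how the architectures listed above can be assembled, the following is a minimal sketch of LSTM1, the best-performing network in this study. The deep-learning framework (Keras/TensorFlow), the input shape, and the compile settings are assumptions made for the sketch rather than the paper's exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers

TIMESTEPS, FEATURES = 1, 9  # placeholder input dimensions

inputs = tf.keras.Input(shape=(TIMESTEPS, FEATURES))
x = inputs
# Eleven stacked LSTM layers: 2048-2048-1024-1024-512-256-128-64-32-16-8.
for units in [2048, 2048, 1024, 1024, 512, 256, 128, 64, 32, 16, 8]:
    # return_sequences=True keeps the time axis so the final
    # TimeDistributed dense layer outputs one value per time step.
    x = layers.LSTM(units, return_sequences=True)(x)
outputs = layers.TimeDistributed(layers.Dense(1))(x)

model = tf.keras.Model(inputs, outputs, name="LSTM1")
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
```

The other networks in the tables can be built analogously by substituting their layer lists.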
The RMSE Value for Each Testing Fold (mg/dL) | |||||
---|---|---|---|---|---|
Models | Fold1 | Fold2 | Fold3 | Fold4 | Fold5 |
FCN1 | 15.96 | 16.08 | 16.83 | 16.46 | 16.68 |
FCN2 | 15.93 | 16.39 | 16.88 | 16.74 | 16.87 |
CNN1 | 15.83 | 16.12 | 16.95 | 16.69 | 16.07 |
CNN2 | 16.03 | 16.25 | 16.54 | 16.02 | 16.16 |
CNN3 | 16.12 | 16.49 | 16.88 | 16.17 | 16.58 |
CNN4 | 16.28 | 16.00 | 16.64 | 16.09 | 15.90 |
CNN5 | 16.09 | 16.03 | 16.64 | 16.59 | 16.14 |
CNN6 | 15.87 | 16.04 | 16.71 | 16.54 | 16.35 |
CNN7 | 15.79 | 16.22 | 16.60 | 16.83 | 16.16 |
LSTM1 | 16.43 | 16.53 | 16.66 | 16.60 | 16.42 |
LSTM2 | 16.67 | 17.06 | 17.44 | 17.06 | 17.06 |
LSTM3 | 16.80 | 16.81 | 17.41 | 17.28 | 17.46 |
LSTM4 | 16.27 | 16.72 | 16.86 | 16.53 | 16.69 |
Results of the chi-squared test | | | | |
χ2 statistic: | 0.1236 | | | |
Degrees of freedom: | 48 | | | |
p-value: | >0.995 | | | |
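The statistic above is consistent with applying the chi-squared test to the 13 × 5 table of per-fold RMSE values, which gives (13 - 1) × (5 - 1) = 48 degrees of freedom. The sketch below, assuming SciPy, shows this computation; whether the authors used exactly this contingency layout is an assumption.

```python
# Test-retest consistency check, assuming the chi-squared test is applied to
# the 13 x 5 table of per-fold RMSE values above (13 models x 5 folds).
import numpy as np
from scipy.stats import chi2_contingency

rmse = np.array([
    [15.96, 16.08, 16.83, 16.46, 16.68],  # FCN1
    [15.93, 16.39, 16.88, 16.74, 16.87],  # FCN2
    [15.83, 16.12, 16.95, 16.69, 16.07],  # CNN1
    [16.03, 16.25, 16.54, 16.02, 16.16],  # CNN2
    [16.12, 16.49, 16.88, 16.17, 16.58],  # CNN3
    [16.28, 16.00, 16.64, 16.09, 15.90],  # CNN4
    [16.09, 16.03, 16.64, 16.59, 16.14],  # CNN5
    [15.87, 16.04, 16.71, 16.54, 16.35],  # CNN6
    [15.79, 16.22, 16.60, 16.83, 16.16],  # CNN7
    [16.43, 16.53, 16.66, 16.60, 16.42],  # LSTM1
    [16.67, 17.06, 17.44, 17.06, 17.06],  # LSTM2
    [16.80, 16.81, 17.41, 17.28, 17.46],  # LSTM3
    [16.27, 16.72, 16.86, 16.53, 16.69],  # LSTM4
])

stat, p_value, dof, _ = chi2_contingency(rmse)
print(f"chi2 = {stat:.4f}, dof = {dof}, p-value = {p_value:.4f}")
# A very small statistic with a p-value close to 1 indicates that the
# per-fold RMSE values are homogeneous across models and folds.
```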
References
1. IDF Diabetes Atlas. Available online: https://www.diabetesatlas.org (accessed on 21 June 2022).
2. Baker, L.; Maley, J.H.; Arévalo, A.; DeMichele, F.; Mateo-Collado, R.; Finkelstein, S.; Celi, L.A. Real-world characterization of blood glucose control and insulin use in the intensive care unit. Sci. Rep. 2020, 10, 10718.
3. Abdelhamid, Y.A.; Kar, P.; Finnis, M.E.; Phillips, L.K.; Plummer, M.P.; Shaw, J.E.; Horowitz, M.; Deane, A.M. Stress hyperglycaemia in critically ill patients and the subsequent risk of diabetes: A systematic review and meta-analysis. Crit. Care 2016, 20, 301.
4. Marik, P.E.; Bellomo, R. Stress hyperglycemia: An essential survival response! Crit. Care 2013, 17, 305.
5. Umpierrez, G.E.; Isaacs, S.D.; Bazargan, N.; You, X.; Thaler, L.M.; Kitabchi, A.E. Hyperglycemia: An Independent Marker of In-Hospital Mortality in Patients with Undiagnosed Diabetes. J. Clin. Endocrinol. Metab. 2002, 87, 978–982.
6. Whitcomb, B.W.; Pradhan, E.K.; Pittas, A.G.; Roghmann, M.C.; Perencevich, E.N. Impact of admission hyperglycemia on hospital mortality in various intensive care unit populations. Crit. Care Med. 2005, 33, 2772–2777.
7. Barsheshet, A.; Garty, M.; Grossman, E.; Sandach, A.; Lewis, B.S.; Gottlieb, S.; Shotan, A.; Behar, S.; Caspi, A.; Schwartz, R.; et al. Admission blood glucose level and mortality among hospitalized nondiabetic patients with heart failure. Arch. Intern. Med. 2006, 166, 1613–1619.
8. Preiser, J.C.; Marik, P.E. Hyperglycemia-related mortality in critically ill patients varies with admission diagnosis. Crit. Care Med. 2010, 38, 1388.
9. Viana, M.V.; Moraes, R.B.; Fabbrin, A.R.; Santos, M.F.; Gerchman, F. Assessment and treatment of hyperglycemia in critically ill patients. Rev. Bras. Ter. Intensiv. 2014, 26, 71–76.
10. Liao, W.I.; Wang, J.C.; Chang, W.C.; Hsu, C.W.; Chu, C.M.; Tsai, S.H. Usefulness of glycemic gap to predict ICU mortality in critically ill patients with diabetes. Medicine 2015, 94, e1525.
11. Van den Berghe, G.; Wouters, P.; Weekers, F.; Verwaest, C.; Bruyninckx, F.; Schetz, M.; Vlasselaers, D.; Ferdinande, P.; Lauwers, P.; Bouillon, R. Intensive Insulin Therapy in Critically Ill Patients. N. Engl. J. Med. 2001, 345, 1359–1367.
12. Malmberg, K.; Norhammar, A.; Wedel, H.; Rydén, L. Glycometabolic State at Admission: Important Risk Marker of Mortality in Conventionally Treated Patients With Diabetes Mellitus and Acute Myocardial Infarction. Circulation 1999, 99, 2626–2632.
13. Furnary, A.P.; Gao, G.; Grunkemeier, G.L.; Wu, Y.X.; Zerr, K.J.; Bookin, S.O.; Floten, H.S.; Starr, A. Continuous insulin infusion reduces mortality in patients with diabetes undergoing coronary artery bypass grafting. J. Thorac. Cardiovasc. Surg. 2003, 125, 1007–1021.
14. Van den Berghe, G.; Wilmer, A.; Hermans, G.; Meersseman, W.; Wouters, P.J.; Milants, I.; Van Wijngaerden, E.; Bobbaers, H.; Bouillon, R. Intensive Insulin Therapy in the Medical ICU. N. Engl. J. Med. 2006, 354, 449–461.
15. Reed, C.C.; Stewart, R.M.; Sherman, M.; Myers, J.G.; Corneille, M.G.; Larson, N.; Gerhardt, S.; Beadle, R.; Gamboa, C.; Dent, D.; et al. Intensive Insulin Protocol Improves Glucose Control and Is Associated with a Reduction in Intensive Care Unit Mortality. J. Am. Coll. Surg. 2007, 204, 1048–1055.
16. Vlasselaers, D.; Milants, I.; Desmet, L.; Wouters, P.J.; Vanhorebeek, I.; van den Heuvel, I.; Mesotten, D.; Casaer, M.P.; Meyfroidt, G.; Ingels, C.; et al. Intensive insulin therapy for patients in paediatric intensive care: A prospective, randomised controlled study. Lancet 2009, 373, 547–556.
17. Wang, H.C.; Lee, A.R. Recent developments in blood glucose sensors. J. Food Drug Anal. 2015, 23, 191–200.
18. Rosenblatt, F. The perceptron: A probabilistic model for information storage and organization in the brain. Psychol. Rev. 1958, 65, 386.
19. Werbos, P. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Ph.D. Thesis, Harvard University, Cambridge, MA, USA, 1974.
20. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536.
21. Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
22. Fukushima, K.; Miyake, S. Neocognitron: A Self-Organizing Neural Network Model for a Mechanism of Visual Pattern Recognition. In Proceedings of the Competition and Cooperation in Neural Nets, Kyoto, Japan, 15–19 February 1982; pp. 267–285.
23. Waibel, A.; Hanazawa, T.; Hinton, G.; Shikano, K.; Lang, K. Phoneme recognition using time-delay neural networks. IEEE Trans. Acoust. Speech Signal Process. 1989, 37, 328–339.
24. Hinton, G.E.; Osindero, S.; Teh, Y.W. A Fast Learning Algorithm for Deep Belief Nets. Neural Comput. 2006, 18, 1527–1554.
25. Dahl, G.E.; Yu, D.; Deng, L.; Acero, A. Context-Dependent Pre-Trained Deep Neural Networks for Large-Vocabulary Speech Recognition. IEEE Trans. Audio Speech Lang. Process. 2012, 20, 30–42.
26. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90.
27. Schroff, F.; Kalenichenko, D.; Philbin, J. FaceNet: A unified embedding for face recognition and clustering. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 815–823.
28. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
29. Silver, D.; Huang, A.; Maddison, C.J.; Guez, A.; Sifre, L.; Van Den Driessche, G.; Schrittwieser, J.; Antonoglou, I.; Panneershelvam, V.; Lanctot, M.; et al. Mastering the game of Go with deep neural networks and tree search. Nature 2016, 529, 484–489.
30. Naylor, C.D. On the Prospects for a (Deep) Learning Health Care System. JAMA 2018, 320, 1099–1100.
31. Stead, W.W. Clinical Implications and Challenges of Artificial Intelligence and Deep Learning. JAMA 2018, 320, 1107–1108.
32. Hinton, G. Deep Learning—A Technology With the Potential to Transform Health Care. JAMA 2018, 320, 1101–1102.
33. Ehteshami Bejnordi, B.; Veta, M.; Johannes van Diest, P.; van Ginneken, B.; Karssemeijer, N.; Litjens, G.; van der Laak, J.A.W.M.; the CAMELYON16 Consortium. Diagnostic Assessment of Deep Learning Algorithms for Detection of Lymph Node Metastases in Women With Breast Cancer. JAMA 2017, 318, 2199–2210.
34. De Fauw, J.; Ledsam, J.R.; Romera-Paredes, B.; Nikolov, S.; Tomasev, N.; Blackwell, S.; Askham, H.; Glorot, X.; O’Donoghue, B.; Visentin, D.; et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat. Med. 2018, 24, 1342–1350.
35. Ribeiro, A.H.; Ribeiro, M.H.; Paixão, G.M.M.; Oliveira, D.M.; Gomes, P.R.; Canazart, J.A.; Ferreira, M.P.S.; Andersson, C.R.; Macfarlane, P.W.; Meira Jr., W.; et al. Automatic diagnosis of the 12-lead ECG using a deep neural network. Nat. Commun. 2020, 11, 1760.
36. Lima, E.M.; Ribeiro, A.H.; Paixão, G.M.M.; Ribeiro, M.H.; Pinto-Filho, M.M.; Gomes, P.R.; Oliveira, D.M.; Sabino, E.C.; Duncan, B.B.; Giatti, L.; et al. Deep neural network-estimated electrocardiographic age as a mortality predictor. Nat. Commun. 2021, 12, 5117.
37. Zhang, Y.; An, L.; Xu, J.; Zhang, B.; Zheng, W.J.; Hu, M.; Tang, J.; Yue, F. Enhancing Hi-C data resolution with deep convolutional neural network HiCPlus. Nat. Commun. 2018, 9, 750.
38. de Vos, B.D.; Berendsen, F.F.; Viergever, M.A.; Sokooti, H.; Staring, M.; Išgum, I. A deep learning framework for unsupervised affine and deformable image registration. Med. Image Anal. 2019, 52, 128–143.
39. Ahn, S.H.; Yeo, A.U.; Kim, K.H.; Kim, C.; Goh, Y.; Cho, S.; Lee, S.B.; Lim, Y.K.; Kim, H.; Shin, D.; et al. Comparative clinical evaluation of atlas and deep-learning-based auto-segmentation of organ structures in liver cancer. Radiat. Oncol. 2019, 14, 213.
40. Zeleznik, R.; Foldyna, B.; Eslami, P.; Weiss, J.; Alexander, I.; Taron, J.; Parmar, C.; Alvi, R.M.; Banerji, D.; Uno, M.; et al. Deep convolutional neural networks to predict cardiovascular risk from computed tomography. Nat. Commun. 2021, 12, 715.
41. Alghamdi, M.; Al-Mallah, M.; Keteyian, S.; Brawner, C.; Ehrman, J.; Sakr, S. Predicting diabetes mellitus using SMOTE and ensemble machine learning approach: The Henry Ford ExercIse Testing (FIT) project. PLoS ONE 2017, 12, e0179805.
42. Ibragimov, B.; Toesca, D.A.; Chang, D.T.; Yuan, Y.; Koong, A.C.; Xing, L. Automated hepatobiliary toxicity prediction after liver stereotactic body radiation therapy with deep learning-based portal vein segmentation. Neurocomputing 2020, 392, 181–188.
43. Zhen, X.; Chen, J.; Zhong, Z.; Hrycushko, B.; Zhou, L.; Jiang, S.; Albuquerque, K.; Gu, X. Deep convolutional neural network with transfer learning for rectum toxicity prediction in cervical cancer radiotherapy: A feasibility study. Phys. Med. Biol. 2017, 62, 8246–8263.
44. Cappon, G.; Vettoretti, M.; Marturano, F.; Facchinetti, A.; Sparacino, G. A Neural-Network-Based Approach to Personalize Insulin Bolus Calculation Using Continuous Glucose Monitoring. J. Diabetes Sci. Technol. 2018, 12, 265–272.
45. Padmapritha, T. Prediction of Blood Glucose Level by using an LSTM based Recurrent Neural networks. In Proceedings of the 2019 IEEE International Conference on Clean Energy and Energy Efficient Electronics Circuit for Sustainable Development (INCCES), Krishnankoil, India, 18–20 December 2019; pp. 1–4.
46. Song, W.; Cai, W.; Li, J.; Jiang, F.; He, S. Predicting Blood Glucose Levels with EMD and LSTM Based CGM Data. In Proceedings of the 2019 6th International Conference on Systems and Informatics (ICSAI), Shanghai, China, 2–4 November 2019; pp. 1443–1448.
47. Li, K.; Liu, C.; Zhu, T.; Herrero, P.; Georgiou, P. GluNet: A Deep Learning Framework for Accurate Glucose Forecasting. IEEE J. Biomed. Health Inform. 2020, 24, 414–423.
48. Li, K.; Daniels, J.; Liu, C.; Herrero, P.; Georgiou, P. Convolutional Recurrent Neural Networks for Glucose Prediction. IEEE J. Biomed. Health Inform. 2020, 24, 603–613.
49. Mirshekarian, S.; Bunescu, R.; Marling, C.; Schwartz, F. Using LSTMs to learn physiological models of blood glucose behavior. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Jeju, Korea, 11–15 July 2017; pp. 2887–2891.
50. Dudukcu, H.V.; Taskiran, M.; Yildirim, T. Consolidated or individual training: Which one is better for blood glucose prediction? In Proceedings of the 2021 International Conference on INnovations in Intelligent SysTems and Applications (INISTA), Kocaeli, Turkey, 25–27 August 2021; pp. 1–6.
51. Mhaskar, H.N.; Pereverzyev, S.V.; van der Walt, M.D. A Deep Learning Approach to Diabetic Blood Glucose Prediction. Front. Appl. Math. Stat. 2017, 3, 14.
52. Pappada, S.M.; Cameron, B.D.; Rosman, P.M.; Bourey, R.E.; Papadimos, T.J.; Olorunto, W.; Borst, M.J. Neural Network-Based Real-Time Prediction of Glucose in Patients with Insulin-Dependent Diabetes. Diabetes Technol. Ther. 2011, 13, 135–141.
53. Johnson, A.; Bulgarelli, L.; Pollard, T.; Horng, S.; Celi, L.A.; Mark, R. MIMIC-IV (version 1.0). PhysioNet 2021.
54. Leahy, S.; O’Halloran, A.; O’Leary, N.; Healy, M.; McCormack, M.; Kenny, R.; O’Connell, J. Prevalence and correlates of diagnosed and undiagnosed type 2 diabetes mellitus and pre-diabetes in older adults: Findings from the Irish Longitudinal Study on Ageing (TILDA). Diabetes Res. Clin. Pract. 2015, 110, 241–249.
55. Alhyas, L.; McKay, A.; Majeed, A. Prevalence of Type 2 Diabetes in the States of The Co-Operation Council for the Arab States of the Gulf: A Systematic Review. PLoS ONE 2012, 7, e0040948.
56. Wild, S.; Roglic, G.; Green, A.; Sicree, R.; King, H. Global Prevalence of Diabetes: Estimates for the year 2000 and projections for 2030. Diabetes Care 2004, 27, 1047–1053.
57. Berkowitz, G.S.; Lapinski, R.H.; Wein, R.; Lee, D. Race/Ethnicity and Other Risk Factors for Gestational Diabetes. Am. J. Epidemiol. 1992, 135, 965–973.
58. Cheng, Y.J.; Kanaya, A.M.; Araneta, M.R.G.; Saydah, S.H.; Kahn, H.S.; Gregg, E.W.; Fujimoto, W.Y.; Imperatore, G. Prevalence of Diabetes by Race and Ethnicity in the United States, 2011–2016. JAMA 2019, 322, 2389–2398.
59. Alaveras, A.; Thomas, S.; Sagriotis, A.; Viberti, G. Promoters of progression of diabetic nephropathy: The relative roles of blood glucose and blood pressure control. Nephrol. Dial. Transplant. Off. Publ. Eur. Dial. Transpl. Assoc.-Eur. Ren. Assoc. 1997, 12 (Suppl. S2), 71–74.
60. de Boer, M.J.; Miedema, K.; Casparie, A.F. Glycosylated haemoglobin in renal failure. Diabetologia 1980, 18, 437–440.
61. Clarke, W.L. The Original Clarke Error Grid Analysis (EGA). Diabetes Technol. Ther. 2005, 7, 776–779.
62. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-Learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
63. Bengio, Y.; Grandvalet, Y. No Unbiased Estimator of the Variance of K-Fold Cross-Validation. J. Mach. Learn. Res. 2004, 5, 1089–1105.
64. Fisher, R.A. On the Interpretation of χ2 from Contingency Tables, and the Calculation of P. J. R. Stat. Soc. 1922, 85, 87–94.
Characteristics 1 | Maximum | Minimum | Mean ± Standard Deviation | Median |
---|---|---|---|---|
Blood glucose before insulin (mg/dL) | 399.00 | 59.00 | 198.49 ± 56.45 | 184.00 |
Blood glucose after insulin (mg/dL) | 398.00 | 17.00 | 155.08 ± 50.58 | 146.00 |
Insulin dose (unit) | 35.00 | 0.10 | 4.69 ± 3.60 | 4.00 |
Age | 91.00 | 18.00 | 65.35 ± 13.76 | 67.00 |
Weight (kg) | 587.00 | 23.90 | 86.86 ± 25.08 | 83.70 |
Systolic blood pressure (mmHg) | 228.00 | 39.00 | 119.27 ± 19.44 | 116.50 |
Diastolic blood pressure (mmHg) | 156.00 | 14.70 | 60.20 ± 12.59 | 58.75 |
Creatinine (mg/dL) | 22.60 | 0.10 | 1.61 ± 1.50 | 1.10 |
Blood urea nitrogen (mg/dL) | 283.00 | 2.00 | 34.26 ± 26.44 | 25.00 |
Gender (num/%) 2 | Female: 33,942/39.09% | Male: 52,891/60.91% | | |
Ethnicity (num/%) 3 | American Indian: 205/0.24%; Asian: 2285/2.63%; Black: 8549/9.85%; Hispanic (Latino): 3676/4.23%; White: 58,136/66.95%; Others (Unknown): 13,982/16.10% | | | |
Network_id | RMSE (mg/dL) 1 | MAE (mg/dL) 2 | R2_Score 3 | EGA (A%_B%_C%_D%_E%) 4 |
---|---|---|---|---|
FCN1 | 16.19 | 5.27 | 0.8982 | 94.06%_5.48%_0.17%_0.27%_0.02% |
FCN2 | 16.56 | 5.79 | 0.8935 | 94.00%_5.53%_0.13%_0.30%_0.04% |
CNN1 | 16.06 | 5.56 | 0.8999 | 94.07%_5.51%_0.16%_0.22%_0.04% |
CNN2 | 16.17 | 5.55 | 0.8984 | 93.97%_5.59%_0.12%_0.29%_0.03% |
CNN3 | 16.24 | 5.06 | 0.8977 | 94.03%_5.53%_0.17%_0.27%_0.03% |
CNN4 | 16.24 | 5.21 | 0.8976 | 94.26%_5.27%_0.10%_0.34%_0.02% |
CNN5 | 16.29 | 5.09 | 0.8969 | 94.03%_5.51%_0.16%_0.28%_0.02% |
CNN6 | 16.54 | 6.40 | 0.8937 | 93.71%_5.78%_0.16%_0.32%_0.03% |
CNN7 | 16.29 | 5.66 | 0.8970 | 93.99%_5.59%_0.12%_0.26%_0.03% |
LSTM1 | 15.82 | 5.17 | 0.9027 | 94.60%_4.95%_0.17%_0.25%_0.03% |
LSTM2 | 16.82 | 7.47 | 0.8902 | 94.03%_5.49%_0.19%_0.27%_0.02% |
LSTM3 | 16.88 | 10.63 | 0.8894 | 93.95%_5.53%_0.18%_0.31%_0.03% |
LSTM4 | 16.40 | 5.35 | 0.8956 | 93.95%_5.58%_0.13%_0.31%_0.03% |
Model Type | RMSE (mg/dL) | MAE (mg/dL) | R2_Score | EGA (A%_B%_C%_D%_E%) |
---|---|---|---|---|
SVR | 34.04 | 25.43 | 0.5504 | 72.65%_25.60%_0.66%_0.97%_0.13% |
KNN | 22.20 | 9.75 | 0.8086 | 89.02%_9.98%_0.23%_0.69%_0.08% |
CART | 20.04 | 5.97 | 0.8441 | 92.74%_6.56%_0.26%_0.33%_0.10% |
RF | 17.07 | 8.29 | 0.8869 | 92.80%_6.58%_0.11%_0.48%_0.03% |
XGBoost | 30.15 | 23.08 | 0.6472 | 77.00%_21.65%_0.26%_1.04%_0.05% |
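The five baseline regressors in the table above can be fitted with scikit-learn [62] and XGBoost as in the sketch below; the hyperparameters and the synthetic placeholder data are illustrative assumptions, not the settings or data used in the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

# The five baselines compared against the deep networks.
models = {
    "SVR": SVR(),
    "KNN": KNeighborsRegressor(),
    "CART": DecisionTreeRegressor(),
    "RF": RandomForestRegressor(n_estimators=100),
    "XGBoost": XGBRegressor(n_estimators=100),
}

# Synthetic placeholder data standing in for the extracted feature matrix X
# and the post-injection glucose targets y (mg/dL).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 9))
y = rng.normal(150, 50, size=500)

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_root_mean_squared_error")
    print(f"{name}: RMSE = {-scores.mean():.2f} mg/dL")
```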
Network_id | RMSE (mg/dL) | MAE (mg/dL) | R2_Score | EGA (A%_B%_C%_D%_E%) |
---|---|---|---|---|
FCN1 | 16.40 | 5.34 | 0.8947 | 94.03%_5.50%_0.18%_0.27%_0.02% |
FCN2 | 16.56 | 5.49 | 0.8926 | 94.21%_5.33%_0.12%_0.30%_0.04% |
CNN1 | 16.33 | 5.67 | 0.8956 | 94.07%_5.44%_0.17%_0.29%_0.02% |
CNN2 | 16.20 | 5.36 | 0.8973 | 94.02%_5.52%_0.16%_0.27%_0.03% |
CNN3 | 16.45 | 5.71 | 0.8942 | 93.94%_5.56%_0.18%_0.28%_0.03% |
CNN4 | 16.18 | 5.27 | 0.8975 | 93.97%_5.56%_0.17%_0.27%_0.02% |
CNN5 | 16.30 | 5.18 | 0.8961 | 94.07%_5.47%_0.20%_0.23%_0.03% |
CNN6 | 16.30 | 5.63 | 0.8961 | 94.07%_5.48%_0.16%_0.27%_0.03% |
CNN7 | 16.32 | 5.58 | 0.8958 | 93.99%_5.55%_0.16%_0.27%_0.03% |
LSTM1 | 16.53 | 5.39 | 0.8932 | 93.90%_5.58%_0.21%_0.28%_0.03% |
LSTM2 | 17.06 | 7.48 | 0.8862 | 93.97%_5.49%_0.20%_0.31%_0.02% |
LSTM3 | 17.15 | 6.34 | 0.8849 | 93.72%_5.73%_0.20%_0.29%_0.04% |
LSTM4 | 16.61 | 5.26 | 0.8920 | 93.86%_5.66%_0.20%_0.25%_0.03% |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).