Hypertension Diagnosis with Backpropagation Neural Networks for Sustainability in Public Health
Abstract
1. Introduction
2. Data Collection
2.1. Problem Statement
2.2. Justification
2.3. Hypothesis
2.4. Theoretical Framework
3. Design Methodology
3.1. Neural Networks
- A set of processing units or neurons.
- An activation state for each unit, equivalent to the output of the unit.
- Connections between the units, usually defined by a weight that determines the effect of an input signal on the unit.
- A propagation rule, which determines the effective input of a unit from external inputs.
- An activation (trigger) function that determines the unit's new activation level from the effective input and its previous activation.
- An external input corresponding to a term determined as bias for each unit.
- A method for gathering the information, corresponding to the learning rule.
- An environment in which the system will operate, with input signals and even error signals.
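To make these components concrete, the following is a minimal single-neuron sketch in MATLAB with illustrative numbers only (the weights, bias, learning rate and target are assumptions, not values from this work): the propagation rule forms the effective input, a logsig activation produces the output, and a backpropagation-style learning rule adjusts the weights and bias from the error signal.

```matlab
% Minimal single-neuron sketch (illustrative values only).
x  = [21; 118; 75];          % input signals (e.g., age, glucose, heart rate)
w  = [0.01; -0.02; 0.03];    % connection weights (arbitrary initial values)
b  = 0.1;                    % bias term
t  = 0.25;                   % desired target (e.g., a class code)
lr = 0.05;                   % learning rate

n = w' * x + b;              % propagation rule: effective input of the unit
a = 1 / (1 + exp(-n));       % activation (logsig): output of the unit

e     = t - a;               % error signal from the environment
delta = e * a * (1 - a);     % local gradient of the logsig unit
w     = w + lr * delta * x;  % learning rule: weight update
b     = b + lr * delta;      % bias update
```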
3.2. Structure
3.3. Proposed Neural Architecture
4. Results
- A1 = [21 174 84 15 24.7 94]; a1 = sim(net,A1) → a1 = 0.2819 (a)
- A2 = [20 118 75 21 34.6 122]; a2 = sim(net,A2) → a2 = 0.3721 (b)
- A3 = [18 103 77 18 19.4 130]; a3 = sim(net,A3) → a3 = 0.7439 (c)
- A4 = [22 111 70 23 30.5 140]; a4 = sim(net,A4) → a4 = 0.8682 (d)
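These scalar outputs are compared against the class codes used as training targets (0.25, 0.50, 0.75 and 1.00; see the expected values in the results table). One simple decision step, shown here only as a sketch, is to assign the class whose code is nearest to the output; `net` is assumed to be the already-trained network, and nearest-code assignment is an assumption rather than the authors' stated criterion.

```matlab
% Sketch: assign a class from the network's scalar output by nearest class code.
% Assumes 'net' is the trained network and that it returns a single scalar.
codes  = [0.25 0.50 0.75 1.00];            % target codes for Class 1..4
labels = {'Class 1','Class 2','Class 3','Class 4'};

A3 = [18 103 77 18 19.4 130];              % age, glucose, FC, FR, BMI, systolic
a3 = sim(net, A3');                        % sim expects one sample per column

[~, k] = min(abs(codes - a3));             % nearest class code
fprintf('Output %.4f -> %s\n', a3, labels{k});
```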
5. Neural Architecture Classification Tests
Results Obtained
1. First step: division of the database according to the categories presented in Table 7, which yields four datasets (Class_1, Class_2, Class_3 and Class_4).
2. Second step: analysis of the data for each variable in each class using scatter plots. For the category Class_1 (normal systolic pressure) we obtain Figure 8, which shows two outliers that were analyzed and considered normal given the nature of the variable. Within the variable Age we also observed values far to the right, but these corresponded to students older than the majority.
3. Third step: correlation analysis of the Class_1 (normal systolic pressure) variables, shown in Table 8 below. A MATLAB sketch of these steps follows.
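A compact sketch of the first two steps is given below. The matrix `data`, its seven-column order (Age, DxTx, FC, FR, Temp, IMC, Systolic) and the plotting choices are assumptions for illustration; only the category limits come from Table 7.

```matlab
% Sketch of steps 1-2: split records by systolic category and inspect with scatter plots.
% 'data' is assumed to be N-by-7 with columns: Age, DxTx, FC, FR, Temp, IMC, Systolic.
sys = data(:,7);                               % systolic pressure column

Class_1 = data(sys < 120, :);                  % Normal
Class_2 = data(sys >= 120 & sys <= 129, :);    % Elevated
Class_3 = data(sys >= 130 & sys <= 139, :);    % Stage 1 hypertension
Class_4 = data(sys >= 140, :);                 % Stage 2 hypertension

% Scatter plot of one variable per class to look for outliers (step 2).
scatter(1:size(Class_1,1), Class_1(:,1));      % e.g., Age within Class_1
xlabel('Record'); ylabel('Age'); title('Class\_1 (normal systolic pressure)');
```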
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
AHT | Arterial Hypertension |
AI | Artificial Intelligence |
CIDIT | Centro de Investigación, Desarrollo e Innovación Tecnológica |
ECITEC | School of Science and Technology |
mg/dL | milligrams per decilitre of blood |
HR | Heart Rate |
IMC | Body Mass Index |
INSP | National Institute of Public Health |
MLP | Multilayer Perceptron |
PAHO | Pan American Health Organization |
RF | Respiratory Frequency |
UABC | Autonomous University of Baja California |
WHO | World Health Organization |
References
- Melin, P.; Prado-Arechiga, G. New Hybrid Intelligent Systems for Diagnosis and Risk Evaluation of Arterial Hypertension; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar]
- Martínez-Linares, J.M. Updates in Hypertension Studies According to the Main Clinical Trials: A Review of the Past 45 Years about Pharmaceutical Intervention Effects. Nurs. Rep. 2020, 10, 2–14. [Google Scholar] [CrossRef] [PubMed]
- Nam, H.J.; Yoon, J.Y. Linking Health Literacy to Self-Care in Hypertensive Patients with Physical Disabilities: A Path Analysis Using a Multi-Mediation Model. Int. J. Environ. Res. Public Health 2021, 18, 3363. [Google Scholar] [CrossRef] [PubMed]
- Roop, S.C.; Battié, M.C.; Jhangri, G.S.; Hu, R.W.; Jones, C.A. Functional Recovery after Surgery for Lumbar Spinal Stenosis in Patients with Hypertension. Healthcare 2020, 8, 503. [Google Scholar] [CrossRef] [PubMed]
- Singh, Y.; Chauhan, A.S. Neural Networks in Data Mining. J. Theor. Appl. Inf. Technol. 2009, 5. [Google Scholar] [CrossRef]
- Latin American Expert Group. Latin American guidelines on hypertension. J. Hypertens. 2009, 27, 905–922. [Google Scholar] [CrossRef] [PubMed]
- Official website of the World Health Organization (WHO), 25 August 2021. Available online: https://www.who.int/es/news-room/fact-sheets/detail/hypertension (accessed on 29 May 2022).
- Official website of the Pan American Health Organization (PAHO). Available online: https://www.paho.org/hq/index.php?option=com_content&view=article&id=13257:dia-mundial-de-la-hipertension-2017-conoce-tus-numeros&Itemid=42345&lang=es (accessed on 29 May 2022).
- Encuesta Nacional de Salud y Nutrición Continua 2021 [National Health and Nutrition Survey, Continuous 2021]. Results Report of the Continuous National Health and Nutrition Survey, COVID-19; p. 66. Available online: https://ensanut.insp.mx/encuestas/ensanutcontinua2020/informes.php (accessed on 29 May 2022).
- Jang, I. Pre-Hypertension and Its Determinants in Healthy Young Adults: Analysis of Data from the Korean National Health and Nutrition Examination Survey VII. Int. J. Environ. Res. Public Health 2021, 18, 9144. [Google Scholar] [CrossRef] [PubMed]
- Kaur, A.; Bhardwaj, A. Artificial Intelligence in Hypertension Diagnosis: A Review. Int. J. Comput. Sci. Inf. Technol. 2014, 5, 2633–2635. [Google Scholar]
- Effects of ARBs And ACEIs on Virus Infection, Inflammatory Status And Clinical Outcomes In COVID-19 Patients With Hypertension: A Single Center Retrospective Study. Available online: https://web.archive.org/web/20200514072618id_/https://www.ahajournals.org/doi/pdf/10.1161/HYPERTENSIONAHA.120.15143 (accessed on 29 May 2022).
- Secretaría de Salud, Gobierno de Puebla. Día Mundial de la Hipertensión Arterial [World Hypertension Day]. Available online: http://ss.pue.gob.mx/dia-mundial-de-la-hipertension-arterial/ (accessed on 29 May 2022).
- Rueda, E.B.; Gómez, M.C.; Macías, I.D.; Carabaño, A.L.; Díez, C.M.; Bajo, N.S. Uso de redes neuronales en medicina: A propósito de la patología dispéptica. Atención Primaria 2002, 30, 99–102. Available online: https://www.elsevier.es/es-revista-atencion-primaria-27-articulo-uso-redes-neuronales-medicina-proposito-13033735 (accessed on 29 May 2022). [CrossRef] [Green Version]
- García Montero Yolanda. Neural Network for High Blood Pressure Diagnosis; International University of La Rioja (UNIR): Madrid, Spain, 2018. [Google Scholar]
- Hagan, M.T.; Demuth, H.B.; Beale, M. Neural Network Design, 2nd ed.; PWS Publishing Co.: Boston, MA, USA, 2014. [Google Scholar]
- Medina-Santiago, A.; Villegas-M, J.M.; Ramirez-Torres, J.; García-Chong, N.R.; Cisneros-Gómez, A.; Melgar-Paniagua, E.M.; Bermudez-Rodriguez, J.I. Neural Network Backpropagation with Applications into Nutrition. In International Conference on Innovation in Medicine and Healthcare; Springer: Cham, Switzerland, 2017. [Google Scholar] [CrossRef]
- Lin, C.T.; Lee, C.G. Neural Fuzzy Systems: A Neuro-Fuzzy Synergism to Intelligent System; Prentice Hall PTR: Upper Saddler River, NJ, USA, 1996; ISBN 0-13-235169-2. [Google Scholar]
- Tappert, C.C. Frank Rosenblatt, the Father of Deep Learning. In Proceedings of the Student-Faculty Research Day, CSIS, New York, NY, USA, 8 May 2020. [Google Scholar]
- Ansari, H. Artificial Neural Network: Learn About Electronics (Learn Electronics). 2020. Available online: libgen.li/file.php?md5=a3245497addad0753ea49636a9777acc (accessed on 29 May 2022).
- Martín del Brío, B.; Sanz Molina, A. Redes Neuronales y Sistemas Difusos, 3rd ed.; Alfaomega-Ra-Ma, 2006. Available online: www.alfaomega.com.mx (accessed on 29 May 2022).
- Werbos, P. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Ph.D. Thesis, Harvard University, Cambridge, MA, USA, 1974. [Google Scholar]
- Werbos, P.J. Backpropagation through time: What it does and how to do it. Proc. IEEE 1990, 78, 1550–1560. [Google Scholar] [CrossRef] [Green Version]
- Skaar, S. (Ed.) A Comprehensive Guide to Neural Network Modeling; Nova Science Publishers, 2020. Available online: libgen.li/file.php?md5=cdd693db741785e5dd9455f5ea25bf97 (accessed on 29 May 2022).
- Guirguis-Blake, J.M.; Evans, C.V.; Webber, E.M.; Coppola, E.L.; Perdue, L.A.; Weyrich, M.S. Screening for Hypertension in Adults: An Updated Systematic Evidence Review for the U.S. Preventive Services Task Force. 2021. Available online: https://www.ncbi.nlm.nih.gov/books/NBK570233/table/ch1.tab1/?report=objectonl (accessed on 29 May 2022).
- Binu, D.; Rajakumar, B.R. Artificial Intelligence in Data Mining: Theories and Applications. 2021. Available online: libgen.li/file.php?md5=1078c53a7932d7e41c9ef2121f39fccd (accessed on 29 May 2022).
- Jamsa, K. Introduction to Data Mining and Analytics, Jones & Bartlett Learning LLC. 2021. Available online: libgen.li/file.php?md5=3fa2e2c8a352852d616de7dd39a030b7 (accessed on 29 May 2022).
- Kantardzic, M. Data Mining: Concepts, Models, Methods, and Algorithms, 3rd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2020; ISBN 9781119516040. [Google Scholar]
- Assaghir, Z.; Janbain, A.; Makki, S.; Kurdi, M.; Karam, R. Using neural network to predict the hypertension. Int. J. Sci. Eng. Dev. Res. 2017, 2, 2. [Google Scholar]
- Alrashedy, H.H.N.; Almansour, A.F.; Ibrahim, D.M.; Hammoudeh, M.A.A. BrainGAN: Brain MRI Image Generation and Classification Framework Using GAN Architectures and CNN Models. Sensors 2022, 22, 4297. [Google Scholar] [CrossRef] [PubMed]
- Al Mudawi, N.; Alazeb, A. A Model for Predicting Cervical Cancer Using Machine Learning Algorithms. Sensors 2022, 22, 4132. [Google Scholar] [CrossRef] [PubMed]
- Aladhadh, S.; Alsanea, M.; Aloraini, M.; Khan, T.; Habib, S.; Islam, M. An Effective Skin Cancer Classification Mechanism via Medical Vision Transformer. Sensors 2022, 22, 4008. [Google Scholar] [CrossRef] [PubMed]
- Corizzo, R.; Dauphin, Y.; Bellinger, C.; Zdravevski, E.; Japkowicz, N. Explainable image analysis for decision support in medical healthcare. In Proceedings of the 2021 IEEE International Conference on Big Data (Big Data), Orlando, FL, USA, 15–18 December 2021. [Google Scholar] [CrossRef]
- Bisgin, H.; Bera, T.; Ding, H.; Semey, H.G.; Wu, L.; Liu, Z.; Barnes, A.E.; Langley, D.A.; Pava-Ripoll, M.; Vyas, H.J.; et al. Comparing SVM and ANN based machine learning methods for species identification of food contaminating beetles. Sci. Rep. 2018, 9, 1–12. [Google Scholar] [CrossRef]
- Jijji, S.A.; Sandra, A.A.; Musa, M.Y. Ensemble Model for the Prediction of Hypertension using KNN and SVM Algorithms. Int. J. Comput. Appl. 2021, 975, 8887. [Google Scholar] [CrossRef]
- Chang, W.; Liu, Y.; Xiao, Y.; Yuan, X.; Xu, X.; Zhang, S.; Zhou, S. A Machine-Learning-Based Prediction Method for Hypertension Outcomes Based on Medical Data. Diagnostics 2019, 9, 178. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ramalingam, V.V.; Dandapath, A.; Raja, M.K. Heart disease prediction using machine learning techniques: A survey. Int. J. Eng. Technol. 2018, 7, 684–687. [Google Scholar] [CrossRef] [Green Version]
- Rosas-Peralta, M.; Borrayo-Sánchez, G. Nuevos criterios ACC/AHA en hipertensión arterial sistémica. Gac. Méd. Méx. 2018, 154, 633–637. [Google Scholar] [CrossRef]
Measurement Type | Type | Limits
---|---|---
Glucose | No diabetes | Before meals: 70–110 mg/dL; after meals: <140 mg/dL
Glucose | With diabetes | Before meals: 80–130 mg/dL; after meals: <180 mg/dL
Blood Pressure | Systolic (highest value) | 90 or lower: hypotension; 91–119: normal; 120–129: elevated; 130–139: stage 1 hypertension; 140 or higher: stage 2 hypertension; greater than 180: hypertensive crisis
Blood Pressure | Diastolic (lowest value) | 60 or lower: hypotension; 61–79: normal; less than 80 (with systolic 120–129): elevated; 80–89: stage 1 hypertension; 90 or greater: stage 2 hypertension; greater than 120: hypertensive crisis
Weight | | Variations in observation may be due to sex, age of the individual and many other factors.
Body Mass Index (Quetelet Index) | BMI = Weight (kg)/Height (m)² | Less than 18: low weight; 18–24.9: normal weight; 25–26.9: overweight; 27–40: varying degrees of obesity
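The body-mass-index entry above (weight in kilograms divided by the square of height in metres) and its cut-off values translate directly into a few lines of MATLAB; the weight and height below are illustrative values, not data from this study.

```matlab
% BMI (Quetelet index) and its category per the cut-offs in Table 1.
weight_kg = 78;  height_m = 1.72;              % illustrative values only
bmi = weight_kg / height_m^2;                  % BMI = weight (kg) / height (m)^2

if bmi < 18
    category = 'low weight';
elseif bmi < 25
    category = 'normal weight';
elseif bmi < 27
    category = 'overweight';
else
    category = 'obesity';
end
fprintf('BMI = %.1f (%s)\n', bmi, category);
```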
Age | DxTx (Glucose, mg/dL) | FC (Heart Rate) | FR (Respiratory Rate) | Temp (°C) | Systolic | Diastolic | Smoke (0 = no, 1 = yes) | BMI
---|---|---|---|---|---|---|---|---
20 | 118 | 75 | 21 | 37 | 122 | 86 | 1 | 34.6 |
21 | 174 | 84 | 15 | 36.7 | 94 | 76 | 0 | 24.7 |
22 | 124 | 86 | 18 | 36.7 | 112 | 85 | 0 | 30.7 |
22 | 160 | 88 | 18 | 36.9 | 111 | 78 | 1 | 23.4 |
19 | 112 | 58 | 17 | 36 | 112 | 75 | 0 | 22.7 |
23 | 125 | 63 | 13 | 37 | 115 | 82 | 0 | 28.8 |
21 | 127 | 78 | 15 | 35.6 | 135 | 87 | 0 | 25.3 |
18 | 103 | 77 | 18 | 36.4 | 130 | 107 | 1 | 19.4 |
24 | 147 | 79 | 17 | 36.2 | 116 | 78 | 0 | 19.7 |
Ranking | Expected Value | Simulated Value |
---|---|---|
Class 1 | 0.25 | 0.2819 |
Class 2 | 0.50 | 0.3721 |
Class 3 | 0.75 | 0.7439 |
Class 4 | 1 | 0.8682 |
No. Control | Sex (H = male, M = female) | Age | DxTx (Glucose) | FC (Heart Rate) | FR (Respiratory Rate) | Temp °C | TA (n0, s1) Blood Pressure | Systolic <120 mmHg | Diastolic <80 mmHg | Smoke | IMC
---|---|---|---|---|---|---|---|---|---|---|---
1 | H | 20 | 118 | 75 | 21 | 37 | 122/86 | 122 | 86 | Yes | 34.6
2 | M | 21 | 174 | 84 | 15 | 36.7 | 94/76 | 94 | 76 | No | 24.7
3 | H | 22 | 124 | 86 | 18 | 36.7 | 112/85 | 112 | 85 | No | 30.7
4 | M | 22 | 160 | 88 | 18 | 36.9 | 111/78 | 111 | 78 | Yes | 23.4
5 | H | 19 | 112 | 58 | 17 | 36 | 112/75 | 112 | 75 | No | 22.7
6 | H | 23 | 125 | 63 | 13 | 37 | 115/82 | 115 | 82 | No | 28.8
7 | M | 21 | 127 | 78 | 15 | 35.6 | 135/87 | 135 | 87 | No | 25.3
8 | H | 18 | 103 | 77 | 18 | 36.4 | 130/107 | 130 | 107 | Yes | 19.4
9 | M | 24 | 147 | 79 | 17 | 36.2 | 116/78 | 116 | 78 | No | 19.7
10 | H | 21 | 130 | 66 | 19 | 36.3 | 129/66 | 129 | 66 | Yes | 19.7
11 | H | 20 | 116 | 98 | 17 | 32.6 | 140/118 | 140 | 118 | No | 35.9
12 | H | 22 | 122 | 91 | >12 | 36.7 | 127/92 | 127 | 92 | No | 27.7
BLOOD PRESSURE CATEGORY | SYSTOLIC mm Hg (Upper Number) | DIASTOLIC mm Hg (Lower Number) | |
---|---|---|---|
NORMAL | LESS THAN 120 | AND | LESS THAN 80 |
ELEVATED | 120–129 | AND | LESS THAN 80 |
HIGH BLOOD PRESSURE (HYPERTENSION STAGE 1) | 130–139 | OR | 80–89
HIGH BLOOD PRESSURE (HYPERTENSION STAGE 2) | 140 OR HIGHER | OR | 90 OR HIGHER |
HYPERTENSIVE CRISIS (consult your doctor immediately) | HIGHER THAN 180 | AND/OR | HIGHER THAN 120
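The AND/OR logic of this table can be expressed by testing the categories from the most to the least severe, so that a reading falls into the highest category either pressure reaches. The function below is only an illustrative sketch of that rule, not the classifier used by the network.

```matlab
% Sketch: blood pressure category from a systolic/diastolic pair, following the table above.
function category = bp_category(sys, dia)
    if sys > 180 || dia > 120
        category = 'Hypertensive crisis';
    elseif sys >= 140 || dia >= 90
        category = 'Hypertension stage 2';
    elseif sys >= 130 || dia >= 80
        category = 'Hypertension stage 1';
    elseif sys >= 120                      % diastolic is already < 80 here
        category = 'Elevated';
    else
        category = 'Normal';
    end
end
```

For example, bp_category(135, 87) returns 'Hypertension stage 1', matching the 135/87 reading that appears in the dataset above.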
Sbp | Gender | Married | Smoke | Exercise | Age | Weight | Height | Overwt | Race | Alcohol | Trt | Bmi | Stress | Salt | Chldbear | Income | Educatn |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
133 | F | N | N | 3 | 60 | 159 | 56 | 3 | 1 | 2 | 0 | 35 | 2 | 2 | 2 | 2 | 2 |
115 | M | N | Y | 1 | 55 | 107 | 65 | 1 | 1 | 2 | 0 | 17 | 2 | 2 | 1 | 3 | 2 |
140 | M | N | Y | 1 | 18 | 130 | 59 | 2 | 1 | 1 | 0 | 26 | 3 | 2 | 1 | 1 | 3 |
132 | M | Y | N | 2 | 19 | 230 | 57 | 3 | 2 | 3 | 1 | 49 | 3 | 3 | 1 | 1 | 2 |
133 | M | N | N | 2 | 58 | 201 | 74 | 2 | 1 | 3 | 0 | 25 | 2 | 2 | 1 | 2 | 3 |
138 | F | N | N | 3 | 55 | 166 | 167 | 2 | 1 | 1 | 1 | 25 | 2 | 1 | 3 | 2 | 3 |
133 | F | Y | N | 1 | 22 | 188 | 66 | 3 | 1 | 3 | 1 | 30 | 3 | 1 | 3 | 1 | 1 |
67 | F | Y | N | 3 | 52 | 123 | 67 | 1 | 1 | 2 | 0 | 19 | 2 | 3 | 2 | 3 | 2 |
138 | M | Y | N | 1 | 46 | 106 | 73 | 1 | 1 | 3 | 1 | 13 | 2 | 2 | 1 | 2 | 1 |
Category | Blood Pressure | Systolic Pressure |
---|---|---|
Class_1 | Normal | Less than 120
Class_2 | Elevated | 120–129
Class_3 | Stage 1 Hypertension | 130–139
Class_4 | Stage 2 Hypertension | 140 or higher
Class_1 (Normal) | Age | DxTx | FC | FR | Temp °C | IMC | Systolic
---|---|---|---|---|---|---|---
Age | 1 | ||||||
DxTx | 0.042692 | 1 | |||||
FC | 0.098773 | 0.230647 | 1 | ||||
FR | 0.075499 | −0.03204 | 0.083564 | 1 | |||
Temp °C | −0.01297 | −0.03423 | 0.085965 | 0.077646 | 1 | ||
IMC | −0.02441 | 0.010178 | −0.00661 | 0.13376 | −0.02438 | 1 | |
Systolic | −0.08638 | 0.11035 | −0.14349 | 0.094649 | −0.01691 | 0.131297 | 1 |
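A correlation matrix like the one above can be reproduced with MATLAB's `corrcoef`. The sketch below assumes `Class_1` is the matrix built in the splitting step, with one column per variable in the same order as the table header.

```matlab
% Sketch of step 3: pairwise correlations for the Class_1 records.
% Assumes Class_1 columns are: Age, DxTx, FC, FR, Temp, IMC, Systolic.
vars = {'Age','DxTx','FC','FR','Temp','IMC','Systolic'};
R = corrcoef(Class_1);                         % 7-by-7 correlation matrix
T = array2table(R, 'VariableNames', vars, 'RowNames', vars);
disp(T)
```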
net = newff(minmax(inputs),[14 50 20 4],{'tansig','logsig','logsig','purelin'},'trainlm')

Class | Tests | Correct | Classification
---|---|---|---
Class_1 | 5 | 3 | 60%
Class_1 | 10 | 8 | 80%
Class_1 | 15 | 13 | 87%
Class_2 | 5 | 4 | 80%
Class_2 | 10 | 8 | 80%
Class_2 | 15 | 14 | 93%
Class_3 | 5 | 3 | 60%
Class_3 | 10 | 7 | 70%
Class_3 | 15 | 6 | 40%
Class_4 | 5 | 5 | 100%
Class_4 | 10 | 10 | 100%
Class_4 | 15 | 15 | 100%
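For context, the (now deprecated) `newff` call in the table header builds a four-layer 14-50-20-4 network with tansig/logsig/logsig/purelin transfer functions trained by Levenberg-Marquardt. The sketch below shows how such a network would be trained; the variables `inputs` and `targets`, the 4-row target coding and the stopping criteria are assumptions, not the authors' script.

```matlab
% Training sketch for the architecture above (deprecated newff syntax;
% feedforwardnet is the modern equivalent).
% 'inputs' is R-by-Q (one sample per column); 'targets' is assumed 4-by-Q to
% match the four purelin output neurons.
net = newff(minmax(inputs), [14 50 20 4], ...
            {'tansig','logsig','logsig','purelin'}, 'trainlm');
net.trainParam.epochs = 1000;                  % assumed stopping criteria
net.trainParam.goal   = 1e-4;
net = train(net, inputs, targets);             % Levenberg-Marquardt training
out = sim(net, inputs);                        % network responses for checking
```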
net = feedforwardnet([10 25 15 4],'trainlm'), with 'tansig' transfer functions in all layers

Class | Tests | Correct | Classification
---|---|---|---
Class_1 | 5 | 5 | 100%
Class_1 | 10 | 10 | 100%
Class_1 | 15 | 15 | 100%
Class_2 | 5 | 4 | 80%
Class_2 | 10 | 6 | 60%
Class_2 | 15 | 12 | 80%
Class_3 | 5 | 4 | 80%
Class_3 | 10 | 7 | 70%
Class_3 | 15 | 12 | 80%
Class_4 | 5 | 5 | 100%
Class_4 | 10 | 10 | 100%
Class_4 | 15 | 15 | 100%
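The architecture in this table uses `feedforwardnet` with 'tansig' transfer functions. In the current toolbox the hidden-layer sizes and training function are passed to `feedforwardnet`, while transfer functions are set layer by layer; the sketch below is an interpretation of that header (whether the output layer is also tansig is an assumption), again reusing the assumed `inputs`/`targets` variables.

```matlab
% Sketch of the architecture above: hidden layers [10 25 15 4], 'tansig'
% transfer functions, trained with Levenberg-Marquardt (trainlm).
net = feedforwardnet([10 25 15 4], 'trainlm');
for k = 1:numel(net.layers)                    % set every layer to tansig
    net.layers{k}.transferFcn = 'tansig';
end
net = train(net, inputs, targets);
out = net(inputs);                             % trained-network responses
```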
Machine Learning Method | Comments |
---|---|
Machine Learning SVM | This article mentions the use of SVM in combination with simple k, which is said to yield a lower-order error and to determine the tumour region by consolidating the inherent image structure progression [30], but it does not report the method's effectiveness or accuracy. |
The Support Vector Machine (SVM) algorithm can be used for classification and regression problems. However, SVMs are particularly popular for relatively complex, small to medium-sized classification datasets. In this algorithm, the data points are separated by a hyperplane, and the kernel determines the shape of that hyperplane. If we plot multiple variables on an ordinary scatter plot, in many cases that plot cannot separate two or more classes of data; the kernel of an SVM is the element that can map low-dimensional data into a higher-dimensional space [31]. The authors explain this only in a limited way, without indicating how it could be implemented. |
Cervical cancer can be diagnosed with the help of algorithms such as decision tree, logistic regression and support vector machine (SVM) [31]. | |
Several machine learning classification algorithms have been used in Predictive Model Selection (PMS), namely, support vector machine (SVM), decision tree classifier (DTC), random forest (RF), logistic regression (LR), gradient boosting (GB), XGBoost, adaptive boosting (AB) and k-nearest neighbour (KNN). The authors refer to them theoretically but do not support their efficiency in their implementation [31]. | |
The paper presented a skin cancer detection system using a support vector machine (SVM), which helps in early detection of skin cancer disease. They used traditional image processing and feature engineering methods for effective feature selection and support vector machine (SVM) algorithms for feature classification [32]. | |
Performance evaluation was carried out using four different classifiers: decision tree (DT), k-nearest neighbour (KNN), boosted decision tree (BT) and SVM. The classification was performed using the relevance vector machine and SVM classifier, which achieved 92.4% [32]. The authors make a comparison with other classifiers. |
Naive Bayes machine learning | This type of machine learning is not mentioned in any of the cited and suggested articles. |
Machine Learning ANN | The authors conducted a survey-based study on cervical cancer detection, including a performance analysis to determine the accuracy of several distinctive types of artificial neural network (ANN) architecture, where the ANN was used to identify cancerous, normal and abnormal cells; however, this use of ANNs is presented only theoretically [31].
Machine Learning Method | Comments |
---|---|
Machine Learning SVM | SVM models are characterized by processing both linear and non-linear data. The model aims to draw decision boundaries between data points of different classes and separate them with the maximum margin [34]. |
SVM slightly outperforms ANN in recognition using one dataset. The exact reason for this improvement is difficult to pinpoint and could simply be due to better parameter selection or the diverse and non-linear nature of the dataset, or both. It could also be due to the fact that SVM converges to a global minimum and allows for better noise tolerance. | |
The support vector machine is a very popular supervised machine learning technique (with a predefined target variable) that can be used both as a classifier and as a predictor [34]. |
https://www.ijcaonline.org/archives/volume183/number43/jijji-2021-ijca-921837.pdf (accessed on 29 May 2022). [35] | |
SVM achieved the highest accuracy (0.9985). This part presents three machine learning algorithms (KNN, Bayesian and SVM) applied to the same objective [35]. |
https://mdpi-res.com/d_attachment/diagnostics/diagnostics-09-00178/article_deploy/diagnostics-09-00178.pdf?version=1573124691 (accessed on 29 May 2022). [36] | |
- SVM can achieve better generalization ability in small sample classification tasks, and has been widely used in medicine. | |
- Theoretically, SVM can achieve optimal classification. | |
- SVM can be well applied to pattern recognition, time series prediction and regression estimation, among others [36]. | |
Naive Bayes machine learning | https://iopscience.iop.org/article/10.1088/1757-899X/1022/1/012072/pdf (accessed on 29 May 2022). [37] |
Naive Bayes is a simple but effective classification technique based on Bayes’ Theorem. It assumes independence between predictors, i.e., the attributes or features must be uncorrelated or unrelated to each other. Even if there is dependence, all these characteristics or attributes contribute independently to the likelihood. | |
- In ensemble modeling, two or more related but different analytical models are used and their results are combined into a single score. An ensemble of SVM, KNN and ANN has been used to achieve an accuracy of 94.12%. The majority-vote-based model demonstrated by Saba Bashir et al. [26], composed of Naïve Bayes, Decision Tree and Support Vector classifiers, gave an accuracy of 82%, a sensitivity of 74% and a specificity of 93% for the ICU heart disease dataset. An ensemble model consisting of Gini index, SVM and Naïve Bayes classifiers provided 98% accuracy in predicting syncope disease [37]. |
Machine Learning ANN | https://www.nature.com/articles/s41598-018-24926-7.pdf?origin=ppub (accessed on 29 May 2022). |
A comparison is made between the SVM and ANN method implemented in pattern recognition, specifically in the detection of insects contaminating food [34]. |