Automatic Evaluation of Heart Condition According to the Sounds Emitted and Implementing Six Classification Methods
Abstract
1. Introduction
2. Materials and Methods
2.1. Database Acquisition
2.2. Database Pre-Processing
2.3. Feature Extraction
2.4. Classification Methods
- k-Nearest Neighbors: This method classifies unlabeled examples according to the classes of their nearest labeled neighbors. The letter k specifies the number of closest neighbors considered; a common starting value is an odd number approximately equal to the square root of the number of training examples, odd so as to eliminate the possibility of ending with a tie [40,41,42,43]. The model was implemented with the class library and the knn function. Seven values of k were tested: 1, 5, 11, 13, 15, 21 and 27.
- Naive Bayes: This method uses training data to calculate the observed probability of each outcome given the evidence provided by the feature values. When the classifier is later applied to unlabeled data, it uses those observed probabilities to predict the most probable class for the new features. The model was implemented with the e1071 library and the naiveBayes function. Two configurations were tested, one with a Laplace estimator and one without it. This estimator adds a small number to each value in the frequency table to guarantee that every feature has a non-zero probability for each class.
- Decision Trees: This is a powerful classifier that uses a tree structure to model the relationship between features and potential outcomes: a branching decision structure pipes examples toward a final class prediction. The C50 library was used to implement the decision tree model. Three configurations were tested: one with unrestricted branch growth, one with post-pruning of the branches to reduce the size of the tree, and one with an error cost in the confusion matrix to penalize false negatives.
- Logistic Regression: This method studies the relationship between a categorical dependent variable and a set of independent variables. It is so named because the dependent variable is binary, taking only two values such as 0/1 or "yes"/"no". The technique can use the one-versus-rest (OvR) scheme to predict the probability of a categorical dependent variable [44,45,46]. The glm function for generalized linear models was used with a logistic regression set-up.
- Support Vector Machine: This can be thought of as a surface that defines a boundary between data points representing examples drawn in multidimensional space according to their feature values. The goal of an SVM is to create a flat boundary, called a hyperplane, that leads to fairly homogeneous partitions of the data on either side [47,48,49,50,51]. The model was implemented with the kernlab library and the ksvm function, and four kernels were compared to capture different perspectives of the data distribution: linear, radial basis, polynomial and hyperbolic tangent (sigmoid).
- Artificial Neural Networks: This is an information processing method inspired by the way the brain processes information. It models the interconnections of neurons in the brain using artificial neurons, known as nodes, that relate an input signal to an output signal. Each node applies an activation function that thresholds the node's value toward one of the possible outputs [52,53,54,55,56]. The neuralnet library was used to implement the model. Three architectures were configured: an ANN with one hidden layer of one neuron, an ANN with one hidden layer of seven neurons, and an ANN with two hidden layers of twelve neurons in the first layer and four in the second. The three topologies had the same number of input neurons (fifty-two) and output neurons (two), and all three used the same linear activation function.
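The six models above were implemented in R (the class, e1071, C50, glm, kernlab and neuralnet libraries). As an illustrative sketch only, not the authors' code, an equivalent comparison can be set up in Python with scikit-learn; the stand-in data, the 52-feature input size (mirroring the ANN input layer) and all hyperparameters shown here are assumptions:

```python
# Sketch of the six-classifier comparison using scikit-learn on synthetic
# stand-in data. Parameters are illustrative, not the study's settings.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Binary problem with 52 features, matching the ANN input layer size.
X, y = make_classification(n_samples=300, n_features=52, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    # Odd k near the square root of the 210 training examples.
    "k-NN (k=15)": KNeighborsClassifier(n_neighbors=15),
    "Naive Bayes": GaussianNB(),
    # max_depth stands in for post-pruning as a size limit on the tree.
    "Decision tree (depth-limited)": DecisionTreeClassifier(max_depth=5, random_state=0),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "SVM (radial basis kernel)": SVC(kernel="rbf"),
    # Two hidden layers of 12 and 4 neurons, as in the third ANN topology.
    "ANN (12, 4)": MLPClassifier(hidden_layer_sizes=(12, 4), max_iter=2000,
                                 random_state=0),
}

results = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in results.items():
    print(f"{name}: accuracy = {acc:.3f}")
```

Holding the train/test split fixed across all six models, as here, keeps the accuracy comparison fair.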
2.5. Evaluation Metrics
- ROC curve (Receiver Operating Characteristic): provides a global measure of diagnostic accuracy, independent of the cut-off point and of prevalence. It is obtained by plotting sensitivity (true-positive rate) on the ordinate axis against 1 − specificity (false-positive rate) on the abscissa axis, for different cut-off points applied to the quantitative result of a test [67,68].
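The metrics reported in the results tables (accuracy, specificity, sensitivity, ROC, precision, F1-score and their mean, the EM Mean column) can all be derived from a confusion matrix plus the classifier's scores. A minimal sketch with scikit-learn, using invented labels and scores rather than the study's data:

```python
# Computing the seven reported evaluation metrics from predictions.
# y_true, y_pred and y_score are illustrative, not the study's data.
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

y_true  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred  = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]
y_score = [0.9, 0.2, 0.8, 0.4, 0.3, 0.6, 0.7, 0.1, 0.95, 0.35]  # class-1 scores

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
metrics = {
    "Accuracy":    accuracy_score(y_true, y_pred),
    "Specificity": tn / (tn + fp),                 # true-negative rate
    "Sensitivity": recall_score(y_true, y_pred),   # true-positive rate
    "ROC":         roc_auc_score(y_true, y_score), # area under the ROC curve
    "Precision":   precision_score(y_true, y_pred),
    "F1-Score":    f1_score(y_true, y_pred),
}
# The per-classifier summary: the mean of the six evaluation metrics.
metrics["EM Mean"] = sum(metrics.values()) / len(metrics)
for name, value in metrics.items():
    print(f"{name}: {value:.4f}")
```

Note that the ROC area is computed from the continuous scores, while the other five metrics depend on the thresholded predictions.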
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- WHO. The Top 10 Causes of Death. 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/the-top-10-causes-of-death (accessed on 25 July 2018).
- INEGI. Características De Las Defunciones Registradas en México Durante 2018. Available online: https://www.inegi.org.mx/contenidos/saladeprensa/boletines/2019/EstSociodemo/DefuncionesRegistradas2019.pdf (accessed on 25 January 2020).
- WHO. Cardiovascular Diseases (CVDs). 2017. Available online: https://www.who.int/en/news-room/fact-sheets/detail/cardiovascular-diseases-(cvds) (accessed on 25 July 2018).
- Montinari, M.R.; Minelli, G. The first 200 years of cardiac auscultation and future perspectives. J. Multidiscip. Healthc. 2019, 12, 183–189. [Google Scholar] [CrossRef] [Green Version]
- Fakoya, F.; du Plessis, M.; Gbenimacho, I. Ultrasound and stethoscope as tools in medical education and practice: Considerations for the archives. Adv. Med Educ. Pract. 2016, 7, 381–387. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Mohd Noor, A.; Shadi, M. The heart auscultation: From sound to graphical. Arpn J. Eng. Appl. Sci. 2014, 9, 1924–1929. [Google Scholar]
- Gaskin, P.R.A.; Owens, S.E.; Talner, N.S.; Sanders, S.P.; Li, J.S. Clinical Auscultation Skills in Pediatric Residents. Pediatrics 2000, 105, 1184–1187. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Mahnke, C.; Nowalk, A.; Hofkosh, D.; Zuberbuhler, J.; Law, Y. Comparison of Two Educational Interventions on Pediatric Resident Auscultation Skills. Pediatrics 2004, 113, 1331–1335. [Google Scholar] [CrossRef] [PubMed]
- Mangione, S.; Nieman, L.Z. Cardiac Auscultatory Skills of Internal Medicine and Family Practice Trainees: A Comparison of Diagnostic Proficiency. JAMA 1997, 278, 717–722. [Google Scholar] [CrossRef] [PubMed]
- Leng, S.; Tan, R.S.; Chai, K.T.C.; Wang, C.; Ghista, D.; Zhong, L. The electronic stethoscope. Biomed. Eng. Online 2015, 14, 1–37. [Google Scholar] [CrossRef] [Green Version]
- Swarup, S.; Makaryus, A. Digital stethoscope: Technology update. Med. Dev. Evid. Res. 2018, 11, 29–36. [Google Scholar] [CrossRef] [Green Version]
- Gillman, L.; Kirkpatrick, A. Portable bedside ultrasound: The visual stethoscope of the 21 st century. Scand. J. Trauma Resusc. Emerg. Med. 2012, 20, 18. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Massin, M.; Dessy, H. Delayed recognition of congenital heart disease. Postgrad. Med. J. 2006, 82, 468–470. [Google Scholar] [CrossRef] [Green Version]
- Quinn, G.R.; Ranum, D.; Song, E.; Linets, M.; Keohane, C.; Riah, H.; Greenberg, P. Missed diagnosis of cardiovascular disease in outpatient general medicine: Insights from malpractice claims data. Jt. Commun. J. Qual. Patient Saf. 2017, 43, 508–516. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wacker-Gussmann, A.; Ehringer-Schetitska, D.; Herceg, V.; Hidvégi, E.; Jakab, A.; Petropoulos, A.; Jokinen, E.; Fritsch, P.; Oberhoffer, R. Prevention of delayed diagnosis in congenital heart disease. Cardiol. Young 2019, 29, 1–2. [Google Scholar] [CrossRef] [Green Version]
- Galtrey, C.M.; Levee, V.; Arevalo, J.; Wren, D. Long QT syndrome masquerading as epilepsy. Pract. Neurol. 2019, 19, 56–61. [Google Scholar] [CrossRef] [PubMed]
- Brown, K.L.; Ridout, D.A.; Hoskote, A.; Verhulst, L.; Ricci, M.; Bull, C. Delayed diagnosis of congenital heart disease worsens preoperative condition and outcome of surgery in neonates. Heart 2006, 92, 1298–1302. [Google Scholar] [CrossRef] [PubMed]
- Bishop, E.; Brown, E.E.; Fajardo, J.; Barouch, L.A.; Judge, D.P.; Halushka, M.K. Seven factors predict a delayed diagnosis of cardiac amyloidosis. Amyloid 2018, 25, 174–179. [Google Scholar] [CrossRef]
- Mahnke, C. Automated heartsound analysis/Computer-aided auscultation: A cardiologist’s perspective and suggestions for future development. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 2–6 September 2009; pp. 3115–3118. [Google Scholar]
- Liu, C.; Springer, D.; Li, Q.; Moody, B.; Juan, R.; Chorro, F.; Castells Ramon, F.; Roig, J.; Silva, I.; Johnson, A.; et al. An open access database for the evaluation of heart sound algorithms. Physiol. Meas. 2016, 37, 2181–2213. [Google Scholar] [CrossRef] [PubMed]
- Institute of Medicine and National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care; The National Academies Press: Washington, DC, USA, 2015. [Google Scholar]
- Schiff, G.; Hasan, O.; Kim, S.; Abrams, R.; Cosby, K.; Lambert, B.; Elstein, A.; Hasler, S.; Kabongo, M.; Krosnjar, N.; et al. Diagnostic Error in Medicine: Analysis of 583 Physician-Reported Errors. Arch. Intern. Med. 2009, 169, 1881–1887. [Google Scholar] [CrossRef] [PubMed]
- Loh, B.; Then, P. Deep learning for cardiac computer-aided diagnosis: Benefits, issues & solutions. mHealth 2017, 3, 45. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Vicnesh, J.; Oh, S.L.; Koh, J.E.W.; Ciaccio, E.; Chua, K.; Tan, R.S.; Acharya, U.R. Computer-aided diagnosis of congestive heart failure using ECG signals—A review. Phys. Med. 2019, 62, 95–104. [Google Scholar] [CrossRef] [Green Version]
- Li, S.; Li, F.; Tang, S.; Xiong, W. A Review of Computer-Aided Heart Sound Detection Techniques. Biomed Res. Int. 2020, 2020, 1–10. [Google Scholar] [CrossRef]
- Mandal, S.; Martis, R.; Mandana, K.; Acharya, U.R.; Chatterjee, J.; Ray, A. Practice of Cardiac Auscultation: Clinical perspectives and its implications on computer aided diagnosis. BioRxiv 2014, 013334. [Google Scholar] [CrossRef]
- Zuhlke, L.; Myer, L.; Mayosi, B. The promise of computer-assisted auscultation in screening for structural heart disease and clinical teaching. Cardiovasc. J. Afr. 2012, 23, 405–408. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Watrous, R.; Thompson, W.; Ackerman, S. The Impact of Computer-assisted Auscultation on Physician Referrals of Asymptomatic Patients with Heart Murmurs. Clin. Cardiol. 2008, 31, 79–83. [Google Scholar] [CrossRef] [PubMed]
- Lee, C.; Rankin, K.; Zuo, K.; Mackie, A. Computer-aided auscultation of murmurs in children: Evaluation of commercially available software. Cardiol. Young 2016, 1, 1–6. [Google Scholar] [CrossRef] [PubMed]
- Lai, L.S.; Redington, A.N.; Reinisch, A.J.; Unterberger, M.J.; Schriefl, A.J. Computerized Automatic Diagnosis of Innocent and Pathologic Murmurs in Pediatrics: A Pilot Study. Congenit. Heart Dis. 2016, 11, 386–395. [Google Scholar] [CrossRef] [PubMed]
- Mandal, S.; Basak, K.; Mandana, K.M.; Ray, A.K.; Chatterjee, J.; Mahadevappa, M. Development of Cardiac Prescreening Device for Rural Population Using Ultralow-Power Embedded System. IEEE Trans. Biomed. Eng. 2011, 58, 745–749. [Google Scholar] [CrossRef] [PubMed]
- Iwamoto, J.; Ogawa, H.; Maki, H.; Yonezawa, Y.; Hahn, A.; Caldwell, W. A mobile phone-based ECG and heart sound monitoring system-biomed. Biomed. Sci. Instrum. 2011, 47, 160–164. [Google Scholar]
- Koekemoer, H.L.; Scheffer, C. Heart sound and electrocardiogram recording devices for telemedicine environments. In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4867–4870. [Google Scholar]
- Bentley, P.; Nordehn, G.; Coimbra, M.; Mannor, S.; Getz, R. Classifying Heart Sounds Challenge. 2012. Available online: http://www.peterjbentley.com/heartchallenge/#taskoverview (accessed on 25 July 2018).
- Feraru, S.M.; Zbancioc, M.D. Emotion recognition in Romanian language using LPC features. In Proceedings of the 2013 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 21–23 November 2013; pp. 1–4. [Google Scholar]
- Wang, L.; Chen, Z.; Yin, F. A novel hierarchical decomposition vector quantization method for high-order LPC parameters. IEEE/ACM Trans. Audio Speech Lang. Process. 2015, 23, 212–221. [Google Scholar] [CrossRef]
- Mascorro, G.A.M.; Torres, G.A. Reconocimiento de voz Basado en MFCC, SBC y Espectrogramas; INGENIUS: Quito, Ecuador, 2013; pp. 12–20. [Google Scholar]
- Bezoui, M.; Elmoutaouakkil, A.; Beni-hssane, A. Feature extraction of some Quranic recitation using mel-frequency cepstral coeficients (MFCC). In Proceedings of the 2016 5th International Conference on Multimedia Computing and Systems (ICMCS), Marrakesh, Morocco, 29 September–1 October 2016; pp. 127–131. [Google Scholar]
- Reig Albiñana, D. Implementación de Algoritmos para la Extracción de Patrones Característicos en Sistemas de Reconocimiento De Voz en Matlab. Ph.D. Thesis, Universitat Politècnica de València, Valencia, Spain, 2015. [Google Scholar]
- Anava, O.; Levy, K. Chapter k-Nearest Neighbors: From Global to Local. In Advances in Neural Information Processing Systems 29; Curran Associates Inc.: Barcelona, Spain, 2016; pp. 4923–4931. [Google Scholar]
- Cunningham, P.; Delany, S. k-Nearest neighbour classifiers. arXiv 2007, arXiv:2004.04523. [Google Scholar]
- Novakovic, J.; Veljovic, A.; Ilic, S.; Papic, M. Experimental study of using the k-nearest neighbour classifier with filter methods. In Proceedings of the Conference: Computer Science and Technology, Varna, Bulgaria, 7–10 September 2016; pp. 90–99. [Google Scholar]
- Sun, S.; Huang, R. An adaptive k-nearest neighbor algorithm. In Proceedings of the 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery, Yantai, China, 10–12 August 2010; pp. 91–94. [Google Scholar]
- Morgan, S.P.; Teachman, J.D. Logistic Regression: Description, Examples, and Comparisons. J. Marriage Fam. 1988, 50, 929–936. [Google Scholar] [CrossRef]
- Park, H. An Introduction to Logistic Regression: From Basic Concepts to Interpretation with Particular Attention to Nursing Domain. J. Korean Acad. Nurs. 2013, 43, 154–164. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Peng, J.; Lee, K.; Ingersoll, G. An Introduction to Logistic Regression Analysis and Reporting. J. Educ. Res. 2002, 96, 3–14. [Google Scholar] [CrossRef]
- Evgeniou, T.; Pontil, M. Machine Learning and Its Applications. In Chapter Support Vector Machines: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2001; pp. 249–257. [Google Scholar]
- Hearst, M.; Dumais, S.; Osman, E.; Platt, J.; Scholkopf, B. Support vector machines. Intell. Syst. Their Appl. IEEE 1998, 13, 18–28. [Google Scholar] [CrossRef] [Green Version]
- Srivastava, D.; Bhambhu, L. Data classification using support vector machine. J. Theor. Appl. Inf. Technol. 2010, 12, 1–7. [Google Scholar]
- Tong, S.; Koller, D. Support Vector Machine Active Learning with Applications To Text Classification. J. Mach. Learn. Res. 2001, 2, 45–66. [Google Scholar]
- Zhang, Y. Support Vector Machine Classification Algorithm and Its Application. In Proceedings of the International Conference on Information Computing and Applications, Chengde, China, 14–16 September 2012; pp. 179–186. [Google Scholar]
- Minli, Z.; Shanshan, Q. Research on the Application of Artificial Neural Networks in Tender Offer for Construction Projects. Phys. Procedia 2012, 24, 1781–1788. [Google Scholar] [CrossRef] [Green Version]
- Mishra, M.; Srivastava, M. A view of Artificial Neural Network. In Proceedings of the International Conference on Advances in Engineering Technology Research (ICAETR), Kanpur, India, 1–2 August 2014; pp. 1–3. [Google Scholar]
- Parisi, G.; Kemker, R.; Part, J.; Kanan, C.; Wermter, S. Continual lifelong learning with neural networks: A review. Neural Netw. 2019, 113, 54–71. [Google Scholar] [CrossRef]
- Sharma, V.; Rai, S.; Dev, A. A Comprehensive Study of Artificial Neural Networks. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2012, 2, 278–284. [Google Scholar]
- Maind, B.; Wankar, P. Research Paper on Basic of Artificial Neural Network. 2014. Available online: https://www.semanticscholar.org (accessed on 5 May 2020).
- Hossin, M.; Sulaiman, M.; Wirza, R. Improving Accuracy Metric with Precision and Recall Metrics for Optimizing Stochastic Classifier. In Proceedings of the 3rd International Conference on Computing and Informatics (ICOCI 2011), Bandung, Indonesia, 8–9 June 2011; pp. 105–110. [Google Scholar]
- McNee, S.; Riedl, J.; Konstan, J. Being accurate is not enough: How accuracy metrics have hurt recommender systems. In Proceedings of the CHI’06 Extended Abstracts on Human Factors in Computing Systems, Montreal, QC, Canada, 21–25 April 2006; pp. 1097–1101. [Google Scholar]
- Jonnalagadda, S. Sensitivity Analysis of Performance Metrics. In Proceedings of the 3rd Annual Software Testing Conference, Bangalore, India, 11–13 November 2001. [Google Scholar]
- Saha, M.; Ghosh, R.; Goswami, B. Robustness and Sensitivity Metrics for Tuning the Extended Kalman Filter. IEEE Trans. Instrum. Meas. 2014, 63, 964–971. [Google Scholar] [CrossRef]
- Zhu, W.; Zeng, N.; Wang, N. Sensitivity, Specificity, Accuracy, Associated Confidence Interval and ROC Analysis with Practical SAS Implementations. In Proceedings of the NESUG Proceedings: Health Care and Life Sciences, Baltimore, MD, USA, 14–17 November 2010. [Google Scholar]
- Flach, P.; Kull, M. Precision-Recall-Gain Curves: PR Analysis Done Right. In Advances in Neural Information Processing Systems 28; Cortes, C., Lawrence, N.D., Lee, D.D., Sugiyama, M., Garnett, R., Eds.; Curran Associates Inc.: Montreal, QC, Canada, 2015; pp. 838–846. [Google Scholar]
- Powers, D.; Ailab. Evaluation: From precision, recall and F-measure to ROC, informedness, markedness & correlation. J. Mach. Learn. Technol. 2011, 2, 2229–3981. [Google Scholar]
- Ting, K. Precision and Recall. In Encyclopedia of Machine Learning; Sammut, C., Webb, G.I., Eds.; Springer: Boston, MA, USA, 2010; p. 781. [Google Scholar]
- Sokolova, M.; Japkowicz, N.; Szpakowicz, S. Beyond Accuracy, F-Score and ROC: A Family of Discriminant Measures for Performance Evaluation. Adv. Artif. Intell. Lect. Notes Comput. Sci. 2006, 4304, 1015–1021. [Google Scholar]
- Lipton, Z.; Elkan, C.; Narayanaswamy, B. Thresholding Classifiers to Maximize F1 Score. 2014. Available online: https://www.researchgate.net (accessed on 1 April 2020).
- Hajian-Tilaki, K. Receiver Operating Characteristic (ROC) Curve Analysis for Medical Diagnostic Test Evaluation. Casp. J. Intern. Med. 2013, 4, 627–635. [Google Scholar]
- Bradley, A. The use of the area under the ROC curve in the evaluation of machine learning algorithms. Pattern Recognit. 1997, 30, 1145–1159. [Google Scholar] [CrossRef] [Green Version]
- Sidey-Gibbons, J.A.; Sidey-Gibbons, C.J. Machine learning in medicine: A practical introduction. BMC Med Res. Methodol. 2019, 19, 1–18. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ghani, M.U.; Alam, T.M.; Jaskani, F.H. Comparison of classification models for early prediction of breast cancer. In Proceedings of the 2019 International Conference on Innovative Computing (ICIC), Lahore, Pakistan, 23–24 September 2019; pp. 1–6. [Google Scholar]
- Masood, F.; Farzana, M.; Nesathurai, S.; Abdullah, H.A. Comparison study of classification methods of intramuscular electromyography data for non-human primate model of traumatic spinal cord injury. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2020, 234, 955–965. [Google Scholar] [CrossRef]
- Galván-Tejada, C.E.; Villagrana-Bañuelos, K.E.; Zanella-Calzada, L.A.; Moreno-Báez, A.; Luna-García, H.; Celaya-Padilla, J.M.; Galván-Tejada, J.I.; Gamboa-Rosales, H. Univariate Analysis of Short-Chain Fatty Acids Related to Sudden Infant Death Syndrome. Diagnostics 2020, 10, 896. [Google Scholar] [CrossRef]
- Bayrak, E.A.; Kırcı, P.; Ensari, T. Comparison of machine learning methods for breast cancer diagnosis. In Proceedings of the 2019 Scientific Meeting on Electrical-Electronics & Biomedical Engineering and Computer Science (EBBT), Istanbul, Turkey, 24–26 April 2019; pp. 1–3. [Google Scholar]
- Wang, H.; Zhou, Z.; Li, Y.; Chen, Z.; Lu, P.; Wang, W.; Liu, W.; Yu, L. Comparison of machine learning methods for classifying mediastinal lymph node metastasis of non-small cell lung cancer from 18 F-FDG PET/CT images. EJNMMI Res. 2017, 7, 1–11. [Google Scholar] [CrossRef]
- Yang, G.; Raschke, F.; Barrick, T.R.; Howe, F.A. Classification of brain tumour 1 h mr spectra: Extracting features by metabolite quantification or nonlinear manifold learning? In Proceedings of the 2014 IEEE 11th International Symposium on Biomedical Imaging (ISBI), Beijing, China, 29 April–2 May 2014; pp. 1039–1042. [Google Scholar]
- Yang, G.; Raschke, F.; Barrick, T.R.; Howe, F.A. Manifold Learning in MR spectroscopy using nonlinear dimensionality reduction and unsupervised clustering. Magn. Reson. Med. 2015, 74, 868–878. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gomes, E.; Pereira, E. Classifying heart sounds using peak location for segmentation and feature construction. In Proceedings of the Workshop Classifying Heart Sounds, La Palma, Canary Islands, 24 April 2012. [Google Scholar]
- Deng, Y. A Robust Heart Sound Segmentation and Classification Algorithm using Wavelet Decomposition and Spectrogram. In Proceedings of the Workshop Classifying Heart Sounds, La Palma, Canary Islands, 24 April 2012. [Google Scholar]
- Kitzes, J.; Turek, D.; Deniz, F. The Practice of Reproducible Research: Case Studies and Lessons from the Data-Intensive Sciences; University of California Press: Berkeley, CA, USA, 2017. [Google Scholar]
Classification Method | Accuracy | Specificity | Sensitivity | ROC | Precision | F1-Score | EM Mean
---|---|---|---|---|---|---|---
k-NN, k = 1 | 0.6585 | 0.6667 | 0.6522 | 0.6594 | 0.7143 | 0.6818 | 0.6721
k-NN, k = 5 | 0.5610 | 0.5556 | 0.5652 | 0.5604 | 0.6190 | 0.5909 | 0.5753
k-NN, k = 11 | 0.6585 | 0.6111 | 0.6957 | 0.6534 | 0.6957 | 0.6956 | 0.6683
k-NN, k = 13 | 0.6098 | 0.5556 | 0.6522 | 0.6039 | 0.6522 | 0.6521 | 0.6209
k-NN, k = 15 | 0.6098 | 0.5556 | 0.6522 | 0.6039 | 0.6522 | 0.6521 | 0.6209
k-NN, k = 21 | 0.6098 | 0.5000 | 0.6957 | 0.5978 | 0.6400 | 0.6666 | 0.6183
k-NN, k = 27 | 0.7073 | 0.5000 | 0.8696 | 0.5978 | 0.6897 | 0.7692 | 0.6889
NB w/o Laplace | 0.5897 | 0.8667 | 0.4167 | 0.6417 | 0.8333 | 0.5555 | 0.6506
NB with Laplace | 0.5897 | 0.8667 | 0.4167 | 0.6417 | 0.8333 | 0.5555 | 0.6506
DT w/o pruning | 0.6383 | 0.8261 | 0.4583 | 0.6422 | 0.7333 | 0.5641 | 0.6437
DT with pruning | 0.6170 | 0.6364 | 0.6000 | 0.6182 | 0.6522 | 0.6250 | 0.6248
DT with cost | 0.6383 | 0.3333 | 0.8846 | 0.6090 | 0.6216 | 0.7301 | 0.6361
LR | 0.6327 | 0.7222 | 0.5806 | 0.7204 | 0.7826 | 0.6666 | 0.6841
SVM linear | 0.7073 | 0.6842 | 0.7273 | 0.7703 | 0.7273 | 0.7272 | 0.7293
SVM radial | 0.6829 | 0.5789 | 0.7727 | 0.7344 | 0.6800 | 0.7234 | 0.6953
SVM polynomial | 0.7073 | 0.6316 | 0.7727 | 0.7536 | 0.7083 | 0.7111 | 0.7141
SVM hyperbolic tangent | 0.6341 | 0.6316 | 0.6364 | 0.6053 | 0.6667 | 0.6511 | 0.6375
ANN 1HL 1N | 0.5854 | 0.1053 | 1.0000 | 0.5526 | 0.5641 | 0.7213 | 0.5881
ANN 1HL 7N | 0.5610 | 0.8421 | 0.3182 | 0.6352 | 0.7000 | 0.4375 | 0.5881
ANN 2HL 12N 4N | 0.6829 | 0.7895 | 0.5909 | 0.6902 | 0.7647 | 0.6666 | 0.6974
Classification Method | Accuracy | Specificity | Sensitivity | ROC | Precision | F1-Score | EM Mean
---|---|---|---|---|---|---|---
k-NN, k = 1 | 0.6531 | 0.5833 | 0.7200 | 0.6517 | 0.6429 | 0.6792 | 0.6550
k-NN, k = 5 | 0.6327 | 0.5417 | 0.7200 | 0.6308 | 0.6207 | 0.6666 | 0.6354
k-NN, k = 11 | 0.6327 | 0.5000 | 0.7600 | 0.6300 | 0.6129 | 0.6785 | 0.6356
k-NN, k = 13 | 0.6122 | 0.5417 | 0.6800 | 0.6108 | 0.6071 | 0.6415 | 0.6155
k-NN, k = 15 | 0.6122 | 0.5417 | 0.6800 | 0.6108 | 0.6071 | 0.6415 | 0.6155
k-NN, k = 21 | 0.6531 | 0.4167 | 0.8800 | 0.6483 | 0.6111 | 0.7213 | 0.6550
k-NN, k = 27 | 0.6531 | 0.3750 | 0.9200 | 0.6475 | 0.6053 | 0.7301 | 0.6551
NB w/o Laplace | 0.6923 | 0.8000 | 0.5789 | 0.6895 | 0.7333 | 0.6470 | 0.6901
NB with Laplace | 0.6923 | 0.8000 | 0.5789 | 0.6895 | 0.7333 | 0.6470 | 0.6901
DT w/o pruning | 0.6383 | 0.8261 | 0.4583 | 0.6422 | 0.7333 | 0.5641 | 0.6437
DT with pruning | 0.6170 | 0.6364 | 0.6000 | 0.6182 | 0.6522 | 0.6250 | 0.6248
DT with cost | 0.6383 | 0.3333 | 0.8846 | 0.6090 | 0.6216 | 0.7301 | 0.6361
LR | 0.7317 | 0.7083 | 0.7647 | 0.8407 | 0.6500 | 0.7027 | 0.7330
SVM linear | 0.6585 | 0.4211 | 0.8636 | 0.7512 | 0.6522 | 0.7307 | 0.6795
SVM radial | 0.7073 | 0.4737 | 0.9091 | 0.7871 | 0.6667 | 0.7692 | 0.7188
SVM polynomial | 0.6829 | 0.4737 | 0.8636 | 0.7512 | 0.6552 | 0.7450 | 0.6952
SVM hyperbolic tangent | 0.6098 | 0.7368 | 0.5000 | 0.6364 | 0.6875 | 0.5789 | 0.6249
ANN 1HL 1N | 0.7073 | 0.5333 | 0.8077 | 0.6308 | 0.7500 | 0.7777 | 0.7011
ANN 1HL 7N | 0.5854 | 0.6667 | 0.5385 | 0.6308 | 0.7368 | 0.6222 | 0.6300
ANN 2HL 12N 4N | 0.6098 | 0.5333 | 0.6538 | 0.6821 | 0.7083 | 0.6800 | 0.6445
Classification Method | Accuracy | Specificity | Sensitivity | ROC | Precision | F1-Score | EM Mean
---|---|---|---|---|---|---|---
k-NN, k = 1 | 0.4694 | 0.5600 | 0.3750 | 0.5276 | 0.4828 | 0.5660 | 0.4968
k-NN, k = 5 | 0.6341 | 0.6250 | 0.6400 | 0.6325 | 0.7273 | 0.6808 | 0.6566
k-NN, k = 11 | 0.6341 | 0.6250 | 0.6400 | 0.6325 | 0.7273 | 0.6808 | 0.6566
k-NN, k = 13 | 0.6341 | 0.6250 | 0.6400 | 0.6325 | 0.7273 | 0.6808 | 0.6566
k-NN, k = 15 | 0.6531 | 0.5217 | 0.7692 | 0.6455 | 0.6452 | 0.7017 | 0.6560
k-NN, k = 21 | 0.6098 | 0.5000 | 0.6800 | 0.5900 | 0.6800 | 0.6800 | 0.6233
k-NN, k = 27 | 0.6341 | 0.5625 | 0.6800 | 0.6212 | 0.7083 | 0.6938 | 0.6499
NB w/o Laplace | 0.6667 | 0.7727 | 0.5294 | 0.6511 | 0.6429 | 0.5806 | 0.6405
NB with Laplace | 0.6667 | 0.7727 | 0.5294 | 0.6511 | 0.6429 | 0.5806 | 0.6405
DT w/o pruning | 0.6383 | 0.8261 | 0.4583 | 0.6422 | 0.7333 | 0.5641 | 0.6437
DT with pruning | 0.6170 | 0.6364 | 0.6000 | 0.6182 | 0.6522 | 0.6250 | 0.6248
DT with cost | 0.6383 | 0.3333 | 0.8846 | 0.6090 | 0.6216 | 0.7301 | 0.6361
LR | 0.8049 | 0.7500 | 0.8571 | 0.8405 | 0.7826 | 0.8181 | 0.8088
SVM linear | 0.7073 | 0.6364 | 0.7895 | 0.6962 | 0.6522 | 0.7142 | 0.6993
SVM radial | 0.6585 | 0.5455 | 0.7895 | 0.6986 | 0.6000 | 0.6818 | 0.6623
SVM polynomial | 0.7073 | 0.6364 | 0.7895 | 0.6962 | 0.6522 | 0.7142 | 0.6993
SVM hyperbolic tangent | 0.4146 | 0.2273 | 0.6316 | 0.5646 | 0.4138 | 0.5000 | 0.4586
ANN 1HL 1N | 0.6341 | 0.7778 | 0.5217 | 0.7029 | 0.7500 | 0.6153 | 0.6669
ANN 1HL 7N | 0.6341 | 0.6667 | 0.6154 | 0.6308 | 0.7619 | 0.6808 | 0.6700
ANN 2HL 12N 4N | 0.7073 | 0.8000 | 0.6538 | 0.7513 | 0.8500 | 0.7391 | 0.7502
Metric | LR | DT w/o pruning | ANN 2HL 12N 4N | J48 | MLB | N. BPM
---|---|---|---|---|---|---
Precision | 0.7826 | 0.7333 | 0.8500 | 0.4566 | 0.5566 | 0.4377

Metric | LR | DT w/o pruning | ANN 2HL 12N 4N | J48 | MLB | N. BPM
---|---|---|---|---|---|---
Sensitivity | 0.8571 | 0.4583 | 0.6538 | 0.2200 | 0.1900 | 0.5085
Specificity | 0.7500 | 0.8261 | 0.8000 | 0.8200 | 0.8400 | 0.5882
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Soto-Murillo, M.A.; Galván-Tejada, J.I.; Galván-Tejada, C.E.; Celaya-Padilla, J.M.; Luna-García, H.; Magallanes-Quintanar, R.; Gutiérrez-García, T.A.; Gamboa-Rosales, H. Automatic Evaluation of Heart Condition According to the Sounds Emitted and Implementing Six Classification Methods. Healthcare 2021, 9, 317. https://doi.org/10.3390/healthcare9030317