Deep Feature Extraction Based on AE Signals for the Characterization of Loaded Carbon Fiber Epoxy and Glass Fiber Epoxy Composites
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Setup
2.2. Feature Extraction
2.2.1. Standard Features
- c1: Peak amplitude (µV), the linear peak amplitude of the burst signal;
- c2: Burst signal duration (µs), the elapsed time from the first to the last threshold crossing of the burst signal;
- c3: Burst signal energy (µV²·s), the integral of the squared AE signal over time, according to EN 1330-9;
- c4: Burst signal rise time (µs), the elapsed time from the first threshold crossing to the burst signal maximum amplitude;
- c5: Frequency centroid (Hz), the frequency at which the spectrum has its center of gravity (considering only the frequencies and corresponding amplitudes between 0 and 1000 kHz);
- c6: Frequency of the maximum amplitude of the spectrum (Hz);
- c7: Frequency of the maximum amplitude of the continuous wavelet transform (using the complex Morlet wavelet) (Hz);
- c8: Partial power of the frequency spectrum between 0 and 200 kHz (/);
- c9: Partial power of the frequency spectrum between 200 and 400 kHz (/);
- c10: Partial power of the frequency spectrum between 400 and 1000 kHz (/).
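The standard features above can be sketched in code as follows. This is an illustrative implementation, not the authors' code: the function name, parameter names, and the use of a simple rFFT spectrum are our assumptions, and c7 is omitted because the complex-Morlet CWT would require a wavelet library.

```python
import numpy as np

def extract_standard_features(x, fs, threshold):
    """Sketch of the standard AE burst features c1-c10 defined above.

    x         : 1-D burst signal (µV)
    fs        : sampling frequency (Hz)
    threshold : detection threshold (µV)
    Note: c7 (CWT peak frequency) is omitted; it needs a wavelet library.
    """
    t = np.arange(len(x)) / fs
    idx = np.flatnonzero(np.abs(x) >= threshold)   # threshold crossings
    first, last = idx[0], idx[-1]

    c1 = float(np.max(np.abs(x)))                  # peak amplitude (µV)
    c2 = (t[last] - t[first]) * 1e6                # duration (µs)
    c3 = float(np.sum(x**2) / fs)                  # energy (µV²·s)
    peak = int(np.argmax(np.abs(x)))
    c4 = (t[peak] - t[first]) * 1e6                # rise time (µs)

    # Amplitude spectrum restricted to the 0-1000 kHz range
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    A = np.abs(np.fft.rfft(x))
    band = f <= 1_000_000
    f, A = f[band], A[band]

    c5 = float(np.sum(f * A) / np.sum(A))          # frequency centroid (Hz)
    c6 = float(f[np.argmax(A)])                    # peak frequency (Hz)

    # Partial powers as fractions of the total spectral power (0-1000 kHz)
    P = A**2
    total = np.sum(P)
    c8 = float(np.sum(P[f < 200e3]) / total)
    c9 = float(np.sum(P[(f >= 200e3) & (f < 400e3)]) / total)
    c10 = float(np.sum(P[f >= 400e3]) / total)
    return {"c1": c1, "c2": c2, "c3": c3, "c4": c4, "c5": c5,
            "c6": c6, "c8": c8, "c9": c9, "c10": c10}
```

For example, a 300 kHz sinusoidal burst sampled at 2 MHz yields a peak frequency c6 of 300 kHz and concentrates nearly all partial power in c9 (200–400 kHz).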
2.2.2. Convolutional Autoencoder and Deep Features
- s1: Number of filters in layers C+P (1) and TC+U (3);
- s2: Number of filters in layers C+P (2) and TC+U (2);
- s3: Number of filters in layers C+P (3) and TC+U (1);
- s4: Number of neurons in fully connected layer FCL 2;
- s5: Number of neurons in fully connected layer FCL 3;
- s6: Number of training epochs;
- s7: Batch size of training samples.
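In the results tables, each CAE configuration is written as a hyphen-separated string such as 16-32-14-256-6-150-256. Reading these digits as the hyperparameters s1–s7 in order is our interpretation of the notation; a minimal helper under that assumption:

```python
def parse_cae_config(arch):
    """Map an architecture string from the results tables, e.g.
    '16-32-14-256-6-150-256', onto the CAE hyperparameters s1-s7
    listed above (assumed to appear in order s1, s2, ..., s7)."""
    keys = ["s1", "s2", "s3",  # filters in C+P (1..3) / TC+U (3..1)
            "s4", "s5",        # neurons in FCL 2 and FCL 3 (= no. of deep features)
            "s6", "s7"]        # training epochs, batch size
    values = [int(v) for v in arch.split("-")]
    if len(values) != len(keys):
        raise ValueError(f"expected {len(keys)} fields, got {len(values)}")
    return dict(zip(keys, values))
```

Under this reading, the best-performing configuration 16-32-14-256-6-150-256 uses 16, 32, and 14 filters in the three convolutional stages, 256 neurons in FCL 2, 6 deep features (FCL 3), 150 epochs, and a batch size of 256.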
2.3. Classification Methods
2.3.1. Decision Trees
2.3.2. Discriminant Analysis
2.4. Research Framework
2.4.1. Objectives
2.4.2. Evaluation Procedure
- A maximum of 6 selected features was allowed as input to the classifiers.
- Deep feature extraction was performed with various CAE configurations containing 4 or 6 neurons in the third fully connected layer (FCL 3), thus providing 4 or 6 deep features.
- For the DT classifier, the number of splits was limited to 15. Without this limitation, even higher accuracy (e.g., 95%) can be obtained, but the results are prone to overfitting.
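The split limit above can be sketched as follows, assuming scikit-learn as the DT implementation (the library is not named in the source, and the data here is a hypothetical stand-in). In a binary tree, each split adds one leaf, so at most 15 splits corresponds to at most 16 leaf nodes:

```python
# Sketch: a decision tree limited to 15 splits, as in the evaluation procedure.
# scikit-learn and the synthetic data are our assumptions, not the authors' setup.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Hypothetical stand-in: 6 selected features, binary CFE/GFE-style labels.
X = rng.normal(size=(400, 6))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# max_leaf_nodes=16 caps a binary tree at 15 internal (split) nodes.
dt = DecisionTreeClassifier(max_leaf_nodes=16, random_state=0)
dt.fit(X, y)
n_splits = dt.tree_.node_count - dt.tree_.n_leaves  # internal nodes = splits
```

This constraint trades a little training accuracy for a tree small enough to generalize and to inspect.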
2.4.3. Feature Selection
3. Results
- The first column presents the architecture of the deep autoencoder applied to generate the corresponding deep features.
- The second and third columns present the selected features, listed in order of importance, as chosen by the DT and DA classifiers.
- The last two columns report the classification accuracy obtained by the DT and DA methods.
Discussion
4. Conclusions
- The analysis of the acquired AE signals based on standard features, using decision tree (DT) and discriminant analysis (DA) classification methods, yields classification accuracies of around 80% when classifying the signals according to the source specimen (CFE or GFE).
- The classification accuracy increases significantly if deep features, extracted by the proposed convolutional autoencoder, are used instead of standard features. In this case, a classification accuracy of 91.3% was obtained by both classifiers. Comparative analysis also indicates good consistency of deep feature selection between the two classifiers.
- Using both feature sets (standard and deep) yields only a slight improvement in accuracy compared with deep features alone. This indicates that the deep features already capture the information relevant to the defined classification task.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Deep autoencoder architecture | Selected features (DT) | Selected features (DA) | DT accuracy (%) | DA accuracy (%) |
|---|---|---|---|---|
| Not relevant | c1, c6, c4, c7, c9, c10 | c9, c10, c5, c8 | 82.4 | 79.1 |
| Deep autoencoder architecture | Selected features (DT) | Selected features (DA) | DT accuracy (%) | DA accuracy (%) |
|---|---|---|---|---|
| 12-24-10-256-6-150-256 | d6, d3, d1, d5, d2, d4 | d6, d3, d1, d2 | 88.5 | 89.3 |
| 16-32-14-256-4-150-256 | d3, d1, d2, d4 | d3, d1, d1, d2 | 83.2 | 79.8 |
| 16-32-14-256-6-150-256 | d2, d1, d4, d5 | d2, d1, d4, d5, d6, d3 | 91.3 | 91.3 |
| 3-6-11-96-4-75-256 | d3, d2, d4, d1 | d3, d2, d4, d5, d6, d3 | 84.8 | 80.3 |
| 3-6-12-128-4-75-256 | d4, d1, d2, d3 | d4, d1, d2, d3, d6, d3 | 81.7 | 81.6 |
| 6-10-10-96-4-75-256 | d4, d1, d3, d2 | d1, d4, d3, d3, d6, d3 | 84.4 | 82.1 |
| 6-12-12-128-4-150-512 | d4, d1, d2, d3 | d4, d1, d2, d3, d6, d3 | 89.6 | 89.7 |
| 8-16-10-96-4-75-256 | d1, d3, d4, d2 | d1, d4, d3, d3, d6, d3 | 82.7 | 80.3 |
| 8-16-12-128-4-150-512 | d1, d2, d3, d4 | d1, d2, d3, d4, d6, d3 | 86.4 | 87.2 |
| Deep autoencoder architecture | Selected features (DT) | Selected features (DA) | DT accuracy (%) | DA accuracy (%) |
|---|---|---|---|---|
| 12-24-10-256-6-150-256 | d6, d3, c4, c1, c6, c3 | d6, d3, d1, c5, c10, c8 | 89.0 | 91.0 |
| 16-32-14-256-4-150-256 | d3, c5, c4, c1, c10, d1 | d3, c10, c5, d1, c9, c8 | 86.4 | 85.3 |
| 16-32-14-256-6-150-256 | d2, d1, d4, c4, c1, c3 | d2, d1, d4, c10, c5, c8 | 91.4 | 91.8 |
| 3-6-11-96-4-75-256 | d3, c5, c4, c6, c1, d4 | d3, c5, c10, c9, c8, d1 | 85.9 | 87.6 |
| 3-6-12-128-4-75-256 | d4, c6, c5, d2, c4, d1 | d4, c5, c10, d2, c8, d1 | 85.5 | 87.9 |
| 6-10-10-96-4-75-256 | d4, d1, c5, c1, d3, c4 | c9, d3, c8, c10, c5, d4 | 84.4 | 86.4 |
| 6-12-12-128-4-150-512 | d4, d1, c6, c5, c4, c1 | d4, d1, c5, c10, d2, c8 | 89.5 | 91.5 |
| 8-16-10-96-4-75-256 | d1, c4, c1, d3, c6, c3 | d1, d4, c10, c5, d3, c8 | 85.0 | 87.4 |
| 8-16-12-128-4-150-512 | d1, c6, c4, c1, c10, c5 | d1, c5, c10, d4, d3, c8 | 88.1 | 88.3 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Potočnik, P.; Misson, M.; Šturm, R.; Govekar, E.; Kek, T. Deep Feature Extraction Based on AE Signals for the Characterization of Loaded Carbon Fiber Epoxy and Glass Fiber Epoxy Composites. Appl. Sci. 2022, 12, 1867. https://doi.org/10.3390/app12041867