A Novel Neural Network with the Ability to Express the Extreme Points Distribution Features of Higher Derivatives of Physical Processes
Abstract
1. Introduction
2. Analysis of Noise’s Influence on Higher Derivatives and Extreme Points Distribution of Typical Process Functions
2.1. Noise’s Influence on the Higher Derivatives of the Measured Signal
2.2. Extreme Points Distribution of Typical Process Functions
3. The Extreme-Points-Distribution-Based Neural Network (EDNN)
3.1. The Architecture of the Extreme-Points-Distribution-Based Neural Network (EDNN)
3.2. Recursive Automatic Differentiation
3.2.1. Derivatives of Hidden Layers
3.2.2. Derivatives of Output Layers
3.3. Extreme Points Distribution Feature Layer
3.4. Loss Function Containing Extreme Points Distribution Feature Errors
4. Application of the EDNN in Denoising
4.1. Realization of EDNN
4.2. Denoising of Second-Order Damped Free Oscillation Signal via EDNN
4.2.1. Acquisition of Second-Order Damped Free Oscillation Signal
4.2.2. Comparative Research of EDNN with Shallow Neural Network on Denoising of Second-Order Damped Free Oscillation Signal
4.3. Denoising of the Cylinder Pressure Signal with EDNN
5. Conclusions
- 1. The error’s deviation effect on the higher derivatives of the measured signal was analyzed, and a possible way of applying the extreme points distribution as a constraint on data fitting was studied.
- 2. The extreme points distribution pattern was adopted as a constraint and the EDNN was established. The EDNN consists of an input layer, hidden layers, an output layer, an automatic differentiation layer, and an extreme points distribution feature layer. A recursive formulation was established for calculating the derivatives, and a novel loss function embedding the extreme points feature error was proposed (a minimal illustrative sketch follows this list).
- 3. The effectiveness of the EDNN for signal denoising was verified. The proposed EDNN was applied to reduce the noise in a second-order damped free oscillation signal and in an internal combustion engine cylinder pressure signal. Compared with shallow neural networks and smoothing splines, the EDNN obtained higher derivatives consistent with the real physical process without requiring a detailed mathematical model. Data fitting whose higher derivatives conform to the trends of the real physical process can therefore be realized with the EDNN, which provides a novel approach for analyzing and understanding physical processes through their higher derivatives.
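To make items 2 and 3 concrete, the sketch below shows one way the core ideas could be wired together in PyTorch: a small fully connected network is fitted to a noisy damped oscillation, its first and second derivatives are obtained by recursive automatic differentiation, and a soft penalty on spurious sign changes of the first derivative stands in for the extreme points distribution feature error. This is a minimal illustration under stated assumptions, not the authors' implementation: the network sizes, the toy signal, the soft sign-change surrogate in `extreme_point_penalty`, the expected extremum count `n_expected`, and the weight `lam` are all choices made for the example.

```python
import torch

def higher_derivatives(model, x, order):
    """Return [y, y', y'', ...] up to `order` via repeated automatic differentiation."""
    x = x.requires_grad_(True)
    outs = [model(x)]
    for _ in range(order):
        grad = torch.autograd.grad(outs[-1].sum(), x, create_graph=True)[0]
        outs.append(grad)
    return outs

def extreme_point_penalty(deriv, n_expected):
    """Soft surrogate (assumption, not the paper's definition): count sign changes of
    `deriv` with a smooth sign function and penalize any excess over `n_expected`."""
    s = torch.tanh(50.0 * deriv).squeeze()
    soft_changes = 0.5 * torch.abs(s[1:] - s[:-1])  # roughly 1 per sign change
    return torch.relu(soft_changes.sum() - n_expected)

# Toy data: a noisy second-order damped free oscillation, as in the paper's first test case.
t = torch.linspace(0.0, 2.0, 200).unsqueeze(1)
y_clean = torch.exp(-1.5 * t) * torch.cos(8.0 * t)
y_noisy = y_clean + 0.02 * torch.randn_like(y_clean)

model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-3          # weight of the extreme-points term (assumed)
n_expected = 6      # expected number of extrema on [0, 2] for this toy signal (assumed)

for step in range(2000):
    optimizer.zero_grad()
    # recursive automatic differentiation of the fitted curve
    y_hat, dy, d2y = higher_derivatives(model, t.clone(), order=2)
    data_loss = torch.mean((y_hat - y_noisy) ** 2)
    # d2y (and higher orders) could be constrained in the same way
    loss = data_loss + lam * extreme_point_penalty(dy, n_expected)
    loss.backward()
    optimizer.step()
```

In the EDNN itself the feature layer and loss are defined as described in Sections 3.3 and 3.4; the sketch only illustrates how higher derivatives obtained by automatic differentiation can enter the loss alongside the data-fitting term.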
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| | 1st Derivative | 2nd Derivative | 3rd Derivative | 4th Derivative |
|---|---|---|---|---|
| SNN_18 neural network | 0.016 | 0.215 | 2.642 | 28.692 |
| EDNN neural network | 0.010 | 0.074 | 0.751 | 8.414 |
Wang, X.; Ma, F.; Gao, Y.; Liang, J.; Zhou, C. A Novel Neural Network with the Ability to Express the Extreme Points Distribution Features of Higher Derivatives of Physical Processes. Appl. Sci. 2023, 13, 6662. https://doi.org/10.3390/app13116662