A Source Identification Problem in Magnetics Solved by Means of Deep Learning Methods
Abstract
1. Introduction
1.1. Overview
1.2. Literature Review
1.3. Motivation
2. Deep Learning Models for Field Reconstruction and Source Identification
2.1. Conditional Variational Autoencoder
2.2. Deep Learning-Based Source Identification
2.2.1. Approach 1: Conditional Variational Autoencoder + Convolutional Neural Network
2.2.2. Approach 2: Conditional Encoder + Artificial Neural Network
2.2.3. Approach 3: Convolutional Neural Network
3. Case Study Description
3.1. Database Generation: Direct Problem Description
- G1: turns with the same radius (solenoid-like geometry).
- G2: turns with an increasing radius along z (Δr = 1 mm).
- G3: turns with a decreasing radius along z (Δr = 1 mm); the three radius profiles are sketched below.
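To make the parametrisation of the three families concrete, the following minimal Python sketch generates the ten turn radii for G1, G2 and G3. The number of turns (10) matches the ten radii identified by the networks, while the base radius value `r0` used here is a placeholder, not a value from the paper.

```python
import numpy as np

def turn_radii(geometry: str, n_turns: int = 10, r0: float = 0.05, dr: float = 1e-3) -> np.ndarray:
    """Radius of each turn (metres) for the three geometry families.

    geometry : 'G1' (constant radius), 'G2' (radius increasing along z),
               'G3' (radius decreasing along z).
    r0       : radius of the first turn (placeholder value, not from the paper).
    dr       : radius step between consecutive turns (1 mm in the paper).
    """
    k = np.arange(n_turns)
    if geometry == "G1":
        return np.full(n_turns, r0)   # solenoid-like: all turns equal
    if geometry == "G2":
        return r0 + k * dr            # radius grows along z
    if geometry == "G3":
        return r0 - k * dr            # radius shrinks along z
    raise ValueError(f"unknown geometry family: {geometry}")

# Example: the three radius profiles used to populate the database
for g in ("G1", "G2", "G3"):
    print(g, turn_radii(g))
```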
3.2. Database Generation: Image Processing
3.3. Metric Definitions for the Comparison between the Approaches
4. Results
4.1. Approach 1: CVAE + CNN
4.2. Approach 2: Encoder + ANN
4.3. Approach 3: CNN
5. Discussion and Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Layers | Activations |
---|---|
Image-based input (size 30 × 84) & Label (size 30 × 84) | 30 × 84 × 2 |
2D Convolution (size 6 × 16, padd. same, stride 1) | 30 × 84 × 16 |
Batch Norm. | 30 × 84 × 16 |
ReLU act. fun. | 30 × 84 × 16 |
Max Pooling 2D (stride 2 × 2) | 15 × 42 × 16 |
2D Convolution (size 3 × 32, padd. same, stride 1) | 15 × 42 × 32 |
Batch Norm. | 15 × 42 × 32 |
ReLU act. fun. | 15 × 42 × 32 |
Max Pooling 2D (stride 2 × 2) | 7 × 21 × 32 |
2D Convolution (size 3 × 64, padd. same, stride 1) | 7 × 21 × 64 |
Batch Norm. | 7 × 21 × 64 |
ReLU act. fun. | 7 × 21 × 64 |
Fully connected layer (20 outputs) | 20 × 1 |
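For illustration, the encoder layer stack tabulated above can be rendered as a PyTorch module. This is a sketch under stated assumptions: the original implementation framework is not given here, "size 6 × 16" is read as a 6 × 6 kernel with 16 filters (and similarly for the other convolutions), and the 20 outputs are assumed to be the 10 latent means and 10 log-variances of the CVAE.

```python
import torch
import torch.nn as nn

class EncoderCNN(nn.Module):
    """Sketch of the tabulated encoder: the 30x84 field map is stacked with a
    30x84 label map (2 input channels); the output is a 20-element vector
    (assumed: 10 latent means + 10 log-variances)."""

    def __init__(self, latent_dim: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=6, padding="same"),   # 30x84x16
            nn.BatchNorm2d(16),
            nn.ReLU(),
            nn.MaxPool2d(2),                                   # 15x42x16
            nn.Conv2d(16, 32, kernel_size=3, padding="same"),  # 15x42x32
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),                                   # 7x21x32
            nn.Conv2d(32, 64, kernel_size=3, padding="same"),  # 7x21x64
            nn.BatchNorm2d(64),
            nn.ReLU(),
        )
        self.head = nn.Linear(7 * 21 * 64, 2 * latent_dim)     # 20 outputs

    def forward(self, x):                                      # x: (N, 2, 30, 84)
        return self.head(self.features(x).flatten(1))

# Shape check on a random batch
print(EncoderCNN()(torch.randn(4, 2, 30, 84)).shape)  # torch.Size([4, 20])
```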
Layers | Activations |
---|---|
Image-based input (size 1 × 1 × 10) & Labels (size 1 × 1 × 2520) | 1 × 1 × 2530 |
Transp. Conv. 2D (size 3 × 420, stride 2 × 3, cropp. same) | 2 × 3 × 420 |
ReLU act. fun. | 2 × 3 × 420 |
Transp. Conv. 2D (size 5 × 35, stride 3 × 4, cropp. same) | 6 × 12 × 35 |
ReLU act. fun. | 6 × 12 × 35 |
Transp. Conv. 2D (size 10 × 16, stride 5 × 7, cropp. same) | 30 × 84 × 16 |
ReLU act. fun. | 30 × 84 × 16 |
Transp. Conv. 2D (size 3 × 8, stride 1 × 1, cropp. same) | 30 × 84 × 8 |
ReLU act. fun. | 30 × 84 × 8 |
Transp. Conv. 2D (size 3 × 1, stride 1 × 1, cropp. same) | 30 × 84 × 1 |
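The transposed-convolution decoder above uses "cropping same", i.e. each stage multiplies the spatial size by its stride. A PyTorch-style sketch that reproduces the tabulated activation sizes is given below; the explicit padding/output_padding values are one consistent choice made for illustration, not the authors' settings, and the 2520-element label is assumed to enter as extra channels of the 1 × 1 latent map.

```python
import torch
import torch.nn as nn

class DecoderCNN(nn.Module):
    """Sketch of the tabulated decoder: a 1x1x(10+2520) latent/label tensor is
    upsampled to a 30x84 field map. padding/output_padding are chosen so that
    each stage reproduces the activation sizes reported in the table."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            # 1x1x2530 -> 2x3x420
            nn.ConvTranspose2d(2530, 420, kernel_size=3, stride=(2, 3),
                               padding=(1, 0), output_padding=(1, 0)),
            nn.ReLU(),
            # 2x3x420 -> 6x12x35
            nn.ConvTranspose2d(420, 35, kernel_size=5, stride=(3, 4),
                               padding=(1, 1), output_padding=(0, 1)),
            nn.ReLU(),
            # 6x12x35 -> 30x84x16
            nn.ConvTranspose2d(35, 16, kernel_size=10, stride=(5, 7),
                               padding=(3, 2), output_padding=(1, 1)),
            nn.ReLU(),
            # 30x84x16 -> 30x84x8
            nn.ConvTranspose2d(16, 8, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
            # 30x84x8 -> 30x84x1 (no activation on the output map)
            nn.ConvTranspose2d(8, 1, kernel_size=3, stride=1, padding=1),
        )

    def forward(self, z):            # z: (N, 2530, 1, 1)
        return self.net(z)

print(DecoderCNN()(torch.randn(2, 2530, 1, 1)).shape)  # torch.Size([2, 1, 30, 84])
```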
Layers | Activations |
---|---|
Image-based input (size 30 × 84) | 30 × 84 × 1 |
2D Convolution (size 3 × 8, padd. same, stride 1) | 30 × 84 × 8 |
Batch Norm. | 30 × 84 × 8 |
ReLU act. fun. | 30 × 84 × 8 |
Aver. Pool. 2D (stride 2 × 2) | 15 × 42 × 8 |
2D Convolution (size 3 × 16, padd. same, stride 1) | 15 × 42 × 16 |
Batch Norm. | 15 × 42 × 16 |
ReLU act. fun. | 15 × 42 × 16 |
Aver. Pool. 2D (stride 2 × 2) | 7 × 21 × 16 |
2D Convolution (size 3 × 32, padd. same, stride 1) | 7 × 21 × 32 |
Batch Norm. | 7 × 21 × 32 |
ReLU act. fun. | 7 × 21 × 32 |
Aver. Pool. 2D (stride 2 × 2) | 3 × 10 × 32 |
2D Convolution (size 3 × 64, padd. same, stride 1) | 3 × 10 × 64 |
Batch Norm. | 3 × 10 × 64 |
ReLU act. fun. | 3 × 10 × 64 |
Aver. Pool. 2D (stride 2 × 2) | 1 × 5 × 64 |
2D Convolution (size 3 × 128, padd. same, stride 1) | 1 × 5 × 128 |
Batch Norm. | 1 × 5 × 128 |
ReLU act. fun. | 1 × 5 × 128 |
Dropout (20% probability) | 1 × 5 × 128 |
Fully connected layer (10 outputs) | 1 × 1 × 10 |
Regression layer | 1 × 1 × 10 |
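The regression CNN tabulated above maps a single-channel 30 × 84 field image to the 10 turn radii. A PyTorch-style sketch follows, under the same assumptions as before ("size 3 × 8" read as a 3 × 3 kernel with 8 filters, and so on); the nearly identical architecture reported further below differs only in the dropout probability (30% instead of 20%), which is exposed here as a parameter. The final "regression layer" corresponds to the training loss (e.g. mean squared error) rather than to an extra trainable layer.

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    """Conv(3x3, 'same') -> BatchNorm -> ReLU -> 2x2 average pooling."""
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, padding="same"),
        nn.BatchNorm2d(c_out),
        nn.ReLU(),
        nn.AvgPool2d(kernel_size=2, stride=2),
    )

class RadiiRegressionCNN(nn.Module):
    """Sketch of the tabulated regression CNN: 30x84 field image -> 10 radii."""

    def __init__(self, dropout: float = 0.2):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(1, 8),      # 30x84 -> 15x42
            conv_block(8, 16),     # 15x42 -> 7x21
            conv_block(16, 32),    # 7x21  -> 3x10
            conv_block(32, 64),    # 3x10  -> 1x5
            nn.Conv2d(64, 128, kernel_size=3, padding="same"),  # 1x5x128
            nn.BatchNorm2d(128),
            nn.ReLU(),
            nn.Dropout(dropout),
        )
        self.head = nn.Linear(1 * 5 * 128, 10)   # regression output: 10 radii

    def forward(self, x):          # x: (N, 1, 30, 84)
        return self.head(self.features(x).flatten(1))

print(RadiiRegressionCNN()(torch.randn(4, 1, 30, 84)).shape)  # torch.Size([4, 10])
```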
Metric | Output #1 Reconstructed Field | Output #2 10 Radii |
---|---|---|
MAPE | 4.24% | 3.30% |
NRMSE | 4.14% | 3.18% |
NMAE | 1.70% | 1.27% |
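The MAPE, NRMSE and NMAE figures reported in these tables can be reproduced with minimal Python functions such as the ones below. The exact normalisation adopted in the paper (e.g. by the range or by the mean of the true values) is not restated here, so the range-based forms are an assumption for illustration only.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def nrmse(y_true, y_pred):
    """RMS error normalised by the range of the true values, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rmse / (y_true.max() - y_true.min())

def nmae(y_true, y_pred):
    """Mean absolute error normalised by the range of the true values, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs(y_true - y_pred)) / (y_true.max() - y_true.min())

# Example on dummy radii (metres)
r_true = np.linspace(0.050, 0.059, 10)
r_pred = r_true * (1 + 0.02 * np.random.default_rng(0).standard_normal(10))
print(mape(r_true, r_pred), nrmse(r_true, r_pred), nmae(r_true, r_pred))
```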
Metric | Output #1 Reconstructed Field | Output #2 10 Radii |
---|---|---|
MAPE | N.A. | 2.28% |
NRMSE | N.A. | 1.98% |
NMAE | N.A. | 0.78% |
Layers | Activations |
---|---|
Image-based input (size 30 × 84) | 30 × 84 × 1 |
2D Convolution (size 3 × 8, padd. same, stride 1) | 30 × 84 × 8 |
Batch Norm. | 30 × 84 × 8 |
ReLU act. fun. | 30 × 84 × 8 |
Aver. Pool. 2D (2 × 2, stride 2) | 15 × 42 × 8 |
2D Convolution (size 3 × 16, padd. same, stride 1) | 15 × 42 × 16 |
Batch Norm. | 15 × 42 × 16 |
ReLU act. fun. | 15 × 42 × 16 |
Aver. Pool. 2D (2 × 2, stride 2) | 7 × 21 × 16 |
2D Convolution (size 3 × 32, padd. same, stride 1) | 7 × 21 × 32 |
Batch Norm. | 7 × 21 × 32 |
ReLU act. fun. | 7 × 21 × 32 |
Aver. Pool. 2D (2 × 2, stride 2) | 3 × 10 × 32 |
2D Convolution (size 3 × 64, padd. same, stride 1) | 3 × 10 × 64 |
Batch Norm. | 3 × 10 × 64 |
ReLU act. fun. | 3 × 10 × 64 |
Aver. Pool. 2D (2 × 2, stride 2) | 1 × 5 × 64 |
2D Convolution (size 3 × 128, padd. same, stride 1) | 1 × 5 × 128 |
Batch Norm. | 1 × 5 × 128 |
ReLU act. fun. | 1 × 5 × 128 |
Dropout (30% probability) | 1 × 5 × 128 |
Fully connected layer (10 outputs) | 1 × 1 × 10 |
Regression layer | 1 × 1 × 10 |
Metric | Output #1 Reconstructed Field | Output #2 10 Radii |
---|---|---|
MAPE | N.A. | 2.54% |
NRMSE | N.A. | 3.28% |
NMAE | N.A. | 1.28% |