Recognition of Cosmic Ray Images Obtained from CMOS Sensors Used in Mobile Phones by Approximation of Uncertain Class Assignment with Deep Convolutional Neural Network
Abstract
1. Introduction
1.1. State of the Art
1.2. Study Motivation
2. Materials and Methods
2.1. Problem Formulation
2.2. Approximation of Uncertain Class Assignment with Deep Convolutional Neural Network
2.3. Image Data Set
- Selection of a subset of trustworthy devices operating in controlled conditions;
- Taking an image sample from these trustworthy devices that contains all morphologies of interest;
- Assigning the dataset elements to one of four classes by a majority vote of five judges, while retaining the number of votes cast for each class; the sketch after this list illustrates how such vote counts can be turned into soft labels.
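The snippet below is a minimal sketch, not the authors' exact implementation: it shows how retained vote counts could yield both a hard majority label and a soft, vote-fraction label that a network trained with a mean-squared-error loss can be asked to approximate. The `votes` vector and the class ordering are hypothetical.

```python
import numpy as np

# Hypothetical vote counts cast by the 5 judges for a single image,
# ordered as [spots, tracks, worms, artefacts].
votes = np.array([4, 1, 0, 0])

# Hard label: the class that received the majority of votes.
hard_label = int(np.argmax(votes))

# Soft label: the fraction of votes per class, retaining the judges'
# uncertainty about the assignment.
soft_label = votes / votes.sum()

print(hard_label)   # 0
print(soft_label)   # [0.8 0.2 0.  0. ]
```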
2.4. Image Preprocessing
- Let I be an input image in the RGB color scale (Figure 3A). First, the image is converted to gray scale; the gray value is computed as a weighted linear combination of the RGB channel values using a standard OpenCV 4.2.0.32 function (see the source code for details). The gray-scale image is then binarized with Otsu's threshold selection method [45] (Figure 3B).
- The binary image is dilated and then opened using image morphology operations [46] with an elliptical kernel with a diameter of 5 pixels. After this operation, the borders of the objects detected by the Otsu algorithm are slightly enlarged and nearby objects are merged. Opening also removes small holes in the regions (Figure 3C).
- The final image is generated by extracting from the gray-scale image only those pixels that lie within the non-zero region of the binary mask; all remaining pixels are set to zero (Figure 3D). A minimal code sketch of this pipeline is given after the list.
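The following Python sketch reproduces the steps above with the standard `cv2` API; the threshold flags, kernel construction, and the masking call are assumptions consistent with the description, and the example file name is hypothetical.

```python
import cv2

def preprocess(image_bgr):
    # Convert the input to gray scale (weighted linear combination of channels).
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Binarize with Otsu's automatic threshold selection.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Dilate and then open with an elliptical kernel of 5-pixel diameter:
    # object borders grow slightly, nearby objects merge, small holes vanish.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.dilate(binary, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Keep only the gray-scale pixels inside the non-zero region of the mask.
    return cv2.bitwise_and(gray, gray, mask=mask)


# Example usage (hypothetical file name):
# result = preprocess(cv2.imread("hit_candidate.png"))
```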
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Winter, M.; Bourbeau, J.; Bravo, S.; Campos, F.; Meehan, M.; Peacock, J.; Ruggles, T.; Schneider, C.; Simons, A.L.; Vandenbroucke, J. Particle identification in camera image sensors using computer vision. Astropart. Phys. 2019, 104, 42–53. [Google Scholar] [CrossRef] [Green Version]
- Groom, D. Cosmic Rays and Other Nonsense in Astronomical CCD Imagers. In Scientific Detectors for Astronomy; Amico, P., Beletic, J.W., Beletic, J.E., Eds.; Astrophysics and Space Science Library; Springer: Dordrecht, The Netherlands, 2004; Volume 300. [Google Scholar] [CrossRef]
- Vandenbroucke, J.; BenZvi, S.; Bravo, S.; Jensen, K.; Karn, P.; Meehan, M.; Peacock, J.; Plewa, M.; Ruggles, T.; Santander, M.; et al. Measurement of cosmic-ray muons with the Distributed Electronic Cosmic-ray Observatory, a network of smartphones. J. Instrum. 2016, 11. [Google Scholar] [CrossRef] [Green Version]
- Shea, M.A.; Smart, D.F. Cosmic Ray Implications for Human Health. In Cosmic Rays and Earth; Bieber, J.W., Eroshenko, E., Evenson, P., Flückiger, E.O., Kallenbach, R., Eds.; Springer: Amsterdam, The Netherlands, 2000; pp. 187–205. [Google Scholar]
- Kim, J.; Lee, J.; Han, J.; Meyyappan, M. Caution: Abnormal Variability Due to Terrestrial Cosmic Rays in Scaled-Down FinFETs. IEEE Trans. Electron Devices 2019, 66, 1887–1891. [Google Scholar] [CrossRef]
- Höeffgen, S.K.; Metzger, S.; Steffens, M. Investigating the Effects of Cosmic Rays on Space Electronics. Front. Phys. 2020, 8, 318. [Google Scholar] [CrossRef]
- Foppiano, A.J.; Ovalle, E.M.; Bataille, K.; Stepanova, M. Ionospheric evidence of the May 1960 earthquake Concepción? Geofís. Int. 2008, 47, 179–183. [Google Scholar]
- Romanova, N.V.; Pilipenko, V.A.; Stepanova, M.V. On the magnetic precursor of the Chilean earthquake of February 27, 2010. Geomagn. Aeron. 2015, 55, 219–222. [Google Scholar] [CrossRef]
- He, L.; Heki, K. Three-Dimensional Tomography of Ionospheric Anomalies Immediately Before the 2015 Illapel Earthquake, Central Chile. J. Geophys. Res. (Space Phys.) 2018, 123, 4015–4025. [Google Scholar] [CrossRef] [Green Version]
- Pierre Auger Collaboration. The Pierre Auger Cosmic Ray Observatory. Nucl. Instrum. Methods Phys. Res. Sect. A 2015, 768, 172–213. [Google Scholar] [CrossRef]
- Ahlers, M.; Halzen, F. Opening a new window onto the universe with IceCube. Prog. Part. Nucl. Phys. 2018, 102, 73–88. [Google Scholar] [CrossRef] [Green Version]
- Spieler, H. Semiconductor Detector Systems; Clarendon Press: Oxford, UK, 2005. [Google Scholar] [CrossRef] [Green Version]
- Szumlak, T. Silicon detectors for the LHC Phase-II upgrade and beyond RD50 Status report. Nucl. Instrum. Methods Phys. Res. Sect. A 2020, 958, 162187. [Google Scholar] [CrossRef]
- Ruat, M.; d’Aillon, E.G.; Verger, L. 3D semiconductor radiation detectors for medical imaging: Simulation and design. In Proceedings of the 2008 IEEE Nuclear Science Symposium Conference Record, Dresden, Germany, 19–25 October 2008; pp. 434–439. [Google Scholar] [CrossRef]
- Unger, M.; Farrar, G. (In)Feasibility of Studying Ultra-High-Energy Cosmic Rays with Smartphones. arXiv 2015, arXiv:1505.04777. [Google Scholar]
- Kumar, R. Tracking Cosmic Rays by CRAYFIS (Cosmic Rays Found in Smartphones) Global Detector. In Proceedings of the 34th International Cosmic Ray Conference (ICRC 2015), The Hague, The Netherlands, 30 July–6 August 2015; p. 1234. [Google Scholar] [CrossRef] [Green Version]
- Borisyak, M.; Usvyatsov, M.; Mulhearn, M.; Shimmin, C.; Ustyuzhanin, A. Muon Trigger for Mobile Phones. In Proceedings of the 22nd International Conference on Computing in High Energy and Nuclear Physics (CHEP2016), San Francisco, CA, USA, 14–16 October 2016; Volume 898, p. 032048. [Google Scholar] [CrossRef]
- Albin, E.; Whiteson, D. Feasibility of Correlated Extensive Air Shower Detection with a Distributed Cosmic Ray Network. arXiv 2021, arXiv:2102.03466. [Google Scholar]
- Whiteson, D.; Mulhearn, M.; Shimmin, C.; Cranmer, K.; Brodie, K.; Burns, D. Searching for ultra-high energy cosmic rays with smartphones. Astropart. Phys. 2016, 79, 1–9. [Google Scholar] [CrossRef] [Green Version]
- Vandenbroucke, J.; Bravo, S.; Karn, P.; Meehan, M.; Peacock, J.; Plewa, M.; Ruggles, T.; Schultz, D.; Simons, A.L. Detecting particles with cell phones: The Distributed Electronic Cosmic-ray Observatory. arXiv 2015, arXiv:1510.07665. [Google Scholar]
- Meehan, M.; Bravo, S.; Campos, F.; Peacock, J.; Ruggles, T.; Schneider, C.; Simons, A.L.; Vandenbroucke, J.; Winter, M. The particle detector in your pocket: The Distributed Electronic Cosmic-ray Observatory. arXiv 2017, arXiv:1708.01281. [Google Scholar]
- Homola, P.; Beznosko, D.; Bhatta, G.; Bibrzycki, Ł.; Borczyńska, M.; Bratek, Ł.; Budnev, N.; Burakowski, D.; Alvarez-Castillo, D.E.; Cheminant, K.A.; et al. Cosmic-Ray Extremely Distributed Observatory. Symmetry 2020, 12, 1835. [Google Scholar] [CrossRef]
- Bibrzycki, Ł.; Burakowski, D.; Homola, P.; Piekarczyk, M.; Niedźwiecki, M.; Rzecki, K.; Stuglik, S.; Tursunov, A.; Hnatyk, B.; Castillo, D.E.; et al. Towards A Global Cosmic Ray Sensor Network: CREDO Detector as the First Open-Source Mobile Application Enabling Detection of Penetrating Radiation. Symmetry 2020, 12, 1802. [Google Scholar] [CrossRef]
- Sahibuddin, S.; Sjarif, N.N.A.; Hussein, I.S. Shape feature extraction methods based pattern recognition: A survey. Open Int. J. Inform. 2018, 6, 26–41. [Google Scholar]
- Yadav, P.; Yadav, B. A survey on application of image processing techniques in object shape recognition. Int. J. Electron. Eng. 2018, 10, 157–164. [Google Scholar]
- Narottambhai, M.; Tandel, P. A Survey on Feature Extraction Techniques for Shape based Object Recognition. Int. J. Comput. Appl. 2016, 137, 16–20. [Google Scholar] [CrossRef]
- Gao, M.; Jiang, J.; Zou, G.; John, V.; Liu, Z. RGB-D-Based Object Recognition Using Multimodal Convolutional Neural Networks: A Survey. IEEE Access 2019, 7, 43110–43136. [Google Scholar] [CrossRef]
- Kaya, M.; Bilge, H. Deep Metric Learning: A Survey. Symmetry 2019, 11, 1066. [Google Scholar] [CrossRef] [Green Version]
- Jiao, L.; Zhang, F.; Liu, F.; Yang, S.; Li, L.; Feng, Z.; Qu, R. A Survey of Deep Learning-Based Object Detection. IEEE Access 2019, 7, 128837–128868. [Google Scholar] [CrossRef]
- Loncomilla, P. Object Recognition using Local Invariant Features for Robotic Applications: A Survey. Pattern Recognit. 2016, 60. [Google Scholar] [CrossRef]
- Krishna, S.; Kalluri, H. Deep learning and transfer learning approaches for image classification. Int. J. Recent Technol. Eng. 2019, 7, 427–432. [Google Scholar]
- Falco, P.; Lu, S.; Natale, C.; Pirozzi, S.; Lee, D. A Transfer Learning Approach to Cross-Modal Object Recognition: From Visual Observation to Robotic Haptic Exploration. IEEE Trans. Robot. 2019, 35, 987–998. [Google Scholar] [CrossRef] [Green Version]
- Wang, S.; Zhang, L.; Zuo, W.; Zhang, B. Class-Specific Reconstruction Transfer Learning for Visual Recognition Across Domains. IEEE Trans. Image Process. 2020, 29, 2424–2438. [Google Scholar] [CrossRef]
- Perera, P.; Patel, V.M. Deep Transfer Learning for Multiple Class Novelty Detection. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 11536–11544. [Google Scholar] [CrossRef] [Green Version]
- Daw, A.; Thomas, R.; Carey, C.; Read, J.; Appling, A.; Karpatne, A. Physics-Guided Architecture (PGA) of Neural Networks for Quantifying Uncertainty in Lake Temperature Modeling. In Proceedings of the 2020 SIAM International Conference on Data Mining, Society for Industrial and Applied Mathematics, Cincinnati, OH, USA, 7–9 May 2020; pp. 532–540. [Google Scholar] [CrossRef] [Green Version]
- Hachaj, T.; Miazga, J. Image Hashtag Recommendations Using a Voting Deep Neural Network and Associative Rules Mining Approach. Entropy 2020, 22, 1351. [Google Scholar] [CrossRef]
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; The MIT Press: Cambridge, MA, USA, 2016; ISBN 0262035618. [Google Scholar] [CrossRef] [Green Version]
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. In Proceedings of the 3rd International Conference on Learning Representations—ICLR 2015, San Diego, CA, USA, 7–9 May 2015. [Google Scholar]
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1800–1807. [Google Scholar]
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269. [Google Scholar]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2015, arXiv:abs/1409.1556. [Google Scholar]
- Zoph, B.; Vasudevan, V.; Shlens, J.; Le, Q.V. Learning Transferable Architectures for Scalable Image Recognition. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 8697–8710. [Google Scholar]
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L. MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 4510–4520. [Google Scholar]
- Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. (IJCV) 2015, 115, 211–252. [Google Scholar] [CrossRef] [Green Version]
- Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
- Soille, P. Morphological Image Analysis: Principles and Applications; Springer: Berlin/Heidelberg, Germany, 2003. [Google Scholar] [CrossRef]
- Lavika, G.; Lavanya, B.; Panchal, P. Hybridization of Biogeography-Based Optimization and Gravitational Search Algorithm for Efficient Face Recognition; IGI Global: Hershey, PA, USA, 2019. [Google Scholar]
- Becker, R.A.; Chambers, J.M.; Wilks, A.R. The New S Language: A Programming Environment for Data Analysis and Graphics; Wadsworth and Brooks/Cole Advanced Books & Software: Pacific Grove, CA, USA, 1988. [Google Scholar]
- Bressem, K.K.; Adams, L.C.; Erxleben, C.; Hamm, B.; Niehues, S.M.; Vahldiek, J.L. Comparing different deep learning architectures for classification of chest radiographs. Sci. Rep. 2020, 10, 13590. [Google Scholar] [CrossRef] [PubMed]
Class \ Votes | 0 | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|---|
Spots | 1790 | 103 | 82 | 34 | 80 | 261 |
Tracks | 1832 | 136 | 91 | 73 | 109 | 109 |
Worms | 1834 | 198 | 85 | 98 | 104 | 31 |
Artefacts | 1115 | 82 | 3 | 0 | 0 | 1150 |
Input Convolutional Architecture | Recognition Rate [%] | Mean Squared Error |
---|---|---|
VGG16 | 85.79 ± 2.24 | 0.03 ± 0.00 |
NASNetLarge | 81.66 ± 2.53 | 0.04 ± 0.01 |
MobileNetV2 | 78.43 ± 2.06 | 0.05 ± 0.01 |
Xception | 81.49 ± 2.94 | 0.04 ± 0.01 |
DenseNet201 | 84.68 ± 1.77 | 0.03 ± 0.00 |
True \ Predicted [%] | Spots | Tracks | Worms | Artefacts |
---|---|---|---|---|
Spots | 90.84 ± 4.16 | 1.38 ± 1.88 | 5.04 ± 2.82 | 2.74 ± 1.78 |
Tracks | 7.46 ± 3.80 | 73.31 ± 11.50 | 15.10 ± 9.01 | 4.14 ± 2.53 |
Worms | 6.14 ± 6.01 | 22.64 ± 10.70 | 62.59 ± 9.92 | 8.64 ± 4.19 |
Artefacts | 2.54 ± 1.23 | 0.80 ± 1.05 | 2.70 ± 1.60 | 93.97 ± 1.93 |
True \ Predicted [%] | Spots | Tracks | Worms | Artefacts |
---|---|---|---|---|
Spots | 89.52 ± 5.40 | 3.31 ± 3.13 | 3.76 ± 3.29 | 3.41 ± 2.05 |
Tracks | 9.27 ± 4.13 | 62.26 ± 5.06 | 17.96 ± 4.93 | 10.52 ± 5.46 |
Worms | 7.83 ± 4.16 | 26.41 ± 9.08 | 51.66 ± 6.54 | 14.11 ± 5.30 |
Artefacts | 2.44 ± 1.79 | 1.58 ± 0.39 | 2.71 ± 1.50 | 93.27 ± 2.71 |
True \ Predicted [%] | Spots | Tracks | Worms | Artefacts |
---|---|---|---|---|
Spots | 78.66 ± 3.46 | 13.37 ± 4.03 | 4.34 ± 2.70 | 3.63 ± 2.34 |
Tracks | 12.28 ± 6.03 | 56.14 ± 5.78 | 24.40 ± 4.50 | 7.18 ± 5.96 |
Worms | 7.15 ± 5.44 | 25.63 ± 9.40 | 50.84 ± 7.89 | 16.39 ± 3.87 |
Artefacts | 2.09 ± 1.29 | 1.22 ± 0.61 | 3.15 ± 2.04 | 93.54 ± 1.74 |
True \ Predicted [%] | Spots | Tracks | Worms | Artefacts |
---|---|---|---|---|
Spots | 84.29 ± 4.04 | 3.69 ± 2.83 | 6.49 ± 2.86 | 5.53 ± 3.67 |
Tracks | 8.22 ± 5.66 | 60.21 ± 7.30 | 23.58 ± 7.44 | 7.99 ± 4.66 |
Worms | 7.58 ± 4.77 | 21.18 ± 5.23 | 60.18 ± 8.70 | 11.06 ± 4.65 |
Artefacts | 2.01 ± 1.21 | 1.40 ± 1.09 | 3.14 ± 1.71 | 93.45 ± 2.18 |
True \ Predicted [%] | Spots | Tracks | Worms | Artefacts |
---|---|---|---|---|
Spots | 87.44 ± 4.97 | 4.79 ± 2.74 | 4.25 ± 2.61 | 3.52 ± 2.54 |
Tracks | 5.11 ± 3.06 | 71.03 ± 5.43 | 20.33 ± 5.41 | 3.53 ± 3.49 |
Worms | 6.03 ± 5.00 | 23.46 ± 7.71 | 61.90 ± 10.16 | 8.61 ± 3.15 |
Artefacts | 2.63 ± 0.86 | 0.79 ± 0.87 | 1.93 ± 1.17 | 94.65 ± 1.86 |
True \ Predicted [%] | Spots (S) | Tracks (T) | Worms (W) | Artefacts (A) |
---|---|---|---|---|
Spots | 94.00 ± 3.87 | 2.14 ± 1.90 | 1.40 ± 2.13 | 2.46 ± 2.47 |
Tracks | 1.99 ± 2.14 | 82.64 ± 5.11 | 14.07 ± 4.40 | 1.30 ± 2.38 |
Worms | 2.65 ± 5.33 | 17.09 ± 10.25 | 75.87 ± 13.51 | 4.39 ± 4.15 |
Artefacts | 1.01 ± 1.12 | 1.01 ± 0.69 | 1.86 ± 0.86 | 96.12 ± 1.27 |
True \ Predicted [%] | Spots (S) | Tracks (T) | Worms (W) | Artefacts (A) |
---|---|---|---|---|
Spots | 96.23 ± 2.29 | 0.92 ± 1.49 | 0.36 ± 1.13 | 2.49 ± 2.38 |
Tracks | 0.83 ± 2.64 | 88.96 ± 6.79 | 8.72 ± 5.25 | 1.49 ± 3.46 |
Worms | 1.00 ± 3.16 | 14.24 ± 11.74 | 79.54 ± 10.97 | 5.22 ± 5.75 |
Artefacts | 0.65 ± 0.64 | 0.56 ± 0.48 | 1.80 ± 0.81 | 97.00 ± 0.95 |
True \ Predicted [%] | Spots (S) | Tracks (T) | Worms (W) | Artefacts (A) |
---|---|---|---|---|
Spots | 98.71 ± 2.99 | 0.45 ± 1.44 | 0.00 ± 0.00 | 0.84 ± 1.78 |
Tracks | 1.43 ± 4.52 | 88.89 ± 12.78 | 8.85 ± 8.78 | 0.83 ± 2.64 |
Worms | 0.00 ± 0.00 | 7.43 ± 15.82 | 89.65 ± 15.52 | 2.92 ± 6.23 |
Artefacts | 0.38 ± 0.50 | 0.27 ± 0.44 | 1.64 ± 0.81 | 97.70 ± 1.24 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).