Article

Multi-Wavelength Computational Ghost Imaging Based on Feature Dimensionality Reduction

Department of Physics, Changchun University of Science and Technology, Changchun 130022, China
* Authors to whom correspondence should be addressed.
Photonics 2024, 11(8), 739; https://doi.org/10.3390/photonics11080739
Submission received: 29 June 2024 / Revised: 2 August 2024 / Accepted: 5 August 2024 / Published: 7 August 2024
(This article belongs to the Special Issue Advances in Scattering Imaging and Single-Pixel/Ghost Imaging)

Abstract
Multi-wavelength ghost imaging usually involves extensive data processing and faces challenges such as poor reconstructed image quality. In this paper, we propose a multi-wavelength computational ghost imaging method based on feature dimensionality reduction. This method not only reconstructs high-quality color images with fewer measurements but also achieves low-complexity computation and storage. First, we utilize singular value decomposition to optimize the multi-scale measurement matrices of red, green, and blue components as illumination speckles. Subsequently, each component image of the target object is reconstructed using the second-order correlation function. Next, we apply principal component analysis to perform feature dimensionality reduction on these reconstructed images. Finally, we fuse the reduced components to recover a high-quality color image. Simulation and experimental results show that our method not only improves the quality of the reconstructed images but also effectively reduces the computational and storage burden. When extended to multiple wavelengths, our method demonstrates greater advantages, making it more feasible to handle large-scale data.

1. Introduction

Ghost imaging (GI) is an imaging technique that reconstructs images of unknown objects through the intensity correlation between the object beam and the reference beam. The total intensity of the object beam, which contains the object’s information, is collected by a bucket detector, while the reference beam is directly detected by a detector with spatial resolution. GI was first proposed theoretically by Klyshko [1] and demonstrated experimentally by Pittman et al. [2]. This technique offers higher sensitivity in detection and greater efficiency in information extraction compared to traditional optical imaging. Additionally, GI has garnered increasing interest for applications such as remote sensing [3,4,5], super-resolution [6], and optical encryption [7].
In 2008, Shapiro et al. [8] theoretically proposed a computational ghost imaging (CGI) scheme and achieved single-arm ghost imaging. Subsequently, Bromberg et al. [9] experimentally validated its feasibility. Compared to traditional ghost imaging, CGI has the advantages of a simpler optical path and enhanced usability. It allows for the artificial design of speckle patterns with various characteristics to improve imaging quality, thereby further advancing the practical application of ghost imaging technology.
In recent years, differential computational ghost imaging (DCGI) [10], singular value decomposition ghost imaging (SVDGI) [11], deep learning ghost imaging (DLGI), and several other methods [12,13,14,15,16,17] have further improved the computational efficiency and imaging quality of CGI. However, current research primarily focuses on grayscale imaging, with relatively few studies on multi-wavelength computational ghost imaging (MWCGI). In practical applications, most images are in color, making the study of MWCGI particularly necessary. Compared to monochromatic CGI, MWCGI not only recovers the spatial information of the target object but also captures its color- and wavelength-related information. However, its complex imaging system results in a time-consuming process and increased computational and storage demands. Especially when the three RGB bands are extended to many more bands, the system’s consumption of memory and computational resources becomes even more significant. Additionally, multi-wavelength systems often face the challenge of poor reconstructed image quality under undersampling conditions.
To overcome the challenges faced by MWCGI, various research teams have proposed a range of innovative methods. Welsh et al. [17,18] analyzed multi-wavelength compressed computational ghost imaging and demonstrated that the system can produce full-color, multi-band, high-quality images of real objects. Duan et al. demonstrated that utilizing rotating ground glass and spatial light modulators can effectively produce color images from monochrome images through incoherent superposition. Additionally, Zhang et al. [19] proposed a wavelength-multiplexing ghost imaging technique, successfully reducing the required number of measurements. Huang et al. [20] proposed a computational ghost imaging scheme based on spectral encoding technology, achieving multispectral imaging. Additionally, other researchers have proposed various methods, including color ghost imaging schemes based on optimized random speckles and truncated singular value decomposition [21], as well as color ghost imaging schemes based on deep learning [22,23]. While these studies have enhanced the imaging efficiency and quality of MWCGI, they have also increased computational and storage demands with the addition of more wavelength bands. Addressing these limitations is crucial for advancing the technology’s application and development.
In this paper, we propose a multi-wavelength computational ghost imaging method based on feature dimensionality reduction, which can achieve high-quality image reconstruction at low sampling rates and significantly reduce computational complexity and storage requirements. We optimize multi-scale speckles as illumination patterns and use the second-order correlation function to reconstruct the component images of the target object. Furthermore, we apply principal component analysis (PCA) for feature dimensionality reduction on these reconstructed component images, which are then fused to form a color image. This method effectively improves imaging quality at low sampling rates while successfully retaining key information from each wavelength band, significantly reducing computational and storage burdens. The rest of this article is arranged as follows. In Section 2, we introduce the feature dimensionality reduction-based MWCGI scheme. In Section 3, we carry out numerical simulations and discuss the results. In Section 4, we present experimental results and discuss their significance. Finally, conclusions are drawn in Section 5.

2. Theory

The principle diagram of MWCGI is shown in Figure 1. In this setup, computer-generated random illumination speckle patterns are used as the light source. These speckle patterns are projected onto the surface of the object using a projector. In the signal light path, three bucket detectors equipped with red, green, and blue filters separately capture the intensity signals of the red, green, and blue channels. Subsequently, second-order correlation operations are used to obtain the reconstructed image of the target object. By fusing the intensity information at corresponding positions in the images from each wavelength band, a reconstructed image containing the color information of the object can be obtained.
In the MWCGI system, at the pth measurement, the illumination speckle pattern with dimensions $m \times n$ is denoted as $R_{p,j}(x, y)$, and the bucket signal received by the bucket detector is denoted as $B_{p,j}$, where $x = 1, 2, 3, \ldots, m$, $y = 1, 2, 3, \ldots, n$, $p = 1, 2, 3, \ldots, M$, $j = r, g, b$, and $M$ is the number of measurements. After $M$ measurements, the second-order correlation reconstruction is performed for each wavelength band. The second-order correlation function can be expressed as:
$$G_j^{(2)}(x, y) = \frac{1}{M} \sum_{p=1}^{M} R_{p,j}(x, y)\, B_{p,j} \qquad (1)$$
where $G_j^{(2)}(x, y)$ represents the three RGB components obtained by detecting the target object, corresponding to the imaging results in the red, green, and blue wavelength bands. Here, each speckle pattern pre-generated by the computer is reshaped into a row vector of length $m \times n$ to form a row of the matrix $\Phi_j$, resulting in a measurement matrix of size $M \times N$ ($N = m \times n$). The measurement matrix is represented as:
$$\Phi_j = \begin{bmatrix} R_{1,j}(1,1) & R_{1,j}(1,2) & \cdots & R_{1,j}(m,n) \\ R_{2,j}(1,1) & R_{2,j}(1,2) & \cdots & R_{2,j}(m,n) \\ \vdots & \vdots & \ddots & \vdots \\ R_{M,j}(1,1) & R_{M,j}(1,2) & \cdots & R_{M,j}(m,n) \end{bmatrix} \qquad (2)$$
The bucket signals collected by the bucket detectors can be written in matrix form as:
$$B_j = \Phi_j O_j \qquad (3)$$
where $B_j = [B_{1,j}, B_{2,j}, \ldots, B_{M,j}]^T$ and $O_j$ denotes the $j$th component of the object reshaped into a column vector of length $N$.
Referring to the second-order correlation operation, Equation (1) can be rewritten as follows:
$$G_j^{(2)}(x, y) = \frac{1}{M} \Phi_j^T \Phi_j O_j \qquad (4)$$
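As a concrete illustration of Equations (1)–(4), the short NumPy sketch below simulates the acquisition and second-order correlation reconstruction for a single wavelength band. The function and variable names are our own, and random speckles stand in for the projected patterns; this is a minimal sketch, not the authors’ code.

```python
import numpy as np

def correlation_reconstruction(speckles, bucket):
    """Second-order correlation reconstruction for one wavelength band.

    speckles : ndarray of shape (M, m, n), the illumination patterns R_{p,j}
    bucket   : ndarray of shape (M,), the bucket signals B_{p,j}
    Returns the reconstructed image G_j^(2) of shape (m, n).
    """
    M = speckles.shape[0]
    Phi = speckles.reshape(M, -1)          # measurement matrix, M x N (Eq. (2))
    G = (Phi.T @ bucket) / M               # (1/M) Phi^T B = (1/M) Phi^T Phi O (Eq. (4))
    return G.reshape(speckles.shape[1], speckles.shape[2])

# Simulated forward model for one band: B_j = Phi_j O_j (Eq. (3))
rng = np.random.default_rng(0)
m, n, M = 64, 64, 1500
obj = rng.random((m, n))                   # stand-in for one RGB component of the object
speckles = rng.random((M, m, n))           # random illumination speckle patterns
bucket = speckles.reshape(M, -1) @ obj.ravel()
recon = correlation_reconstruction(speckles, bucket)
```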
It is evident that when $\Phi_j^T \Phi_j = M I$, the target object can be accurately reconstructed. However, this condition is difficult to achieve with a low number of measurements. To address this issue, we introduce a measurement matrix optimization method to achieve high-quality multi-wavelength computational ghost imaging. Specifically, we perform singular value decomposition (SVD) on the constructed random measurement matrix $\Phi_j$ to obtain the optimized measurement matrix. The decomposition of the measurement matrices for the three RGB wavelength bands can be expressed as follows:
$$\Phi_j = U_j \Lambda_j V_j^T \qquad (5)$$
where $U_j$ and $V_j$ are two orthogonal matrices and $\Lambda_j$ is a diagonal matrix of singular values. We choose $V_j$ as the optimized illumination speckles. In our method, the second-order correlation function for the three RGB wavelength bands can be rewritten as follows:
$$G_j^{(2)}(x, y) = \frac{1}{M} V_j^T V_j O_j \qquad (6)$$
In our design, the generated measurement matrix $V_j$ is orthogonal, whereas the random measurement matrix $\Phi_j$ used in traditional ghost imaging generally does not satisfy orthogonality. Therefore, we can significantly improve the imaging quality of multi-wavelength computational ghost imaging while maintaining the same number of measurements. Subsequently, by fusing the intensity information at corresponding positions in the three RGB band images, we can successfully reconstruct an image containing the color information of the target object. However, as the number of wavelength bands increases, this method also correspondingly increases the computational complexity and storage requirements of the system. To address this issue, we introduce dimensionality reduction techniques into the MWCGI system.
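The SVD-based optimization can be sketched as follows, assuming that the leading right singular vectors (the rows of $V_j^T$ returned by the economy-size SVD) are used directly as the optimized patterns. In a real projector the negative values of these patterns would still have to be handled, e.g., by adding an offset or using differential projection, which is omitted here.

```python
import numpy as np

def svd_optimized_patterns(Phi):
    """Return an M x N pattern matrix with orthonormal rows derived from Phi.

    Phi = U diag(S) Vt; the rows of Vt are the leading right singular
    vectors and satisfy Vt @ Vt.T = I, unlike the original random Phi.
    """
    _, _, Vt = np.linalg.svd(Phi, full_matrices=False)   # Vt has shape (M, N)
    return Vt

rng = np.random.default_rng(1)
M, m, n = 1500, 64, 64
Phi = rng.random((M, m * n))               # random measurement matrix for one band
V_opt = svd_optimized_patterns(Phi)
patterns = V_opt.reshape(M, m, n)          # projected instead of the random speckles
```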
First, each reconstructed RGB component image is standardized using the following formula:
$$X_j = \frac{X_j - \mu_j}{\sigma_j} \qquad (7)$$
where $X_j$ represents the reconstructed RGB component image matrix ($X_j = G_j^{(2)}$), and $\mu_j$ and $\sigma_j$ are the mean and standard deviation of the data, respectively. Next, the covariance matrix of the standardized data is calculated using the following formula:
$$C_j = \frac{1}{N-1} X_j^T X_j \qquad (8)$$
By performing eigenvalue decomposition on the covariance matrix $C_j$, a set of eigenvalues and corresponding eigenvectors can be obtained. These eigenvectors define a new coordinate system for the data. The mathematical expression for the eigenvalue decomposition is as follows:
$$C_j \omega_j = \lambda_j \omega_j \qquad (9)$$
The eigenvalue $\lambda_j$ quantifies the variance contribution in the direction of each eigenvector, thereby determining its importance. In practical applications, the eigenvectors corresponding to the top $p$ largest eigenvalues are often selected as the principal components to achieve dimensionality reduction. The number of principal components $p$ is usually determined by a threshold on the cumulative contribution rate, which is chosen based on actual needs to ensure that most of the information is retained. It is important to note that as $p$ increases, the reconstruction quality typically improves, because more variance information is captured and the reconstruction becomes closer to the original data. Using the selected eigenvectors, the original data can be transformed into the new feature space as follows:
$$T_j = X_j \omega_j \qquad (10)$$
where $T_j$ is the transformed data matrix and $\omega_j$ is the matrix of selected eigenvectors. Subsequently, through the inverse transformation, we can approximately reconstruct the low-dimensional principal component representation $T_j$ back to the original high-dimensional form $\hat{X}_j$, retaining as much of the original information as possible. The formula for this reconstruction process is:
$$\hat{X}_j = T_j \omega_j^T \qquad (11)$$
After dimensionality reduction, we obtain the reconstructed images for the three bands, $\hat{X}_r$, $\hat{X}_g$, and $\hat{X}_b$. Finally, by fusing the intensity information at the same positions in the RGB band images $\hat{X}_j$, the final color reconstructed image is generated.
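A sketch of the per-band PCA reduction, inverse transform, and RGB fusion is given below. The standardization is applied column-wise and is undone after the inverse transform so that the result can be compared with the original band image; both choices are our implementation assumptions rather than details specified in the text.

```python
import numpy as np

def pca_reduce_reconstruct(X, p):
    """PCA-compress one reconstructed band image X (m x n), keeping p components.

    Returns the reconstructed image X_hat and the cumulative contribution
    rate of the first p principal components.
    """
    mu, sigma = X.mean(axis=0), X.std(axis=0) + 1e-12   # avoid division by zero
    Xs = (X - mu) / sigma                               # standardization
    C = Xs.T @ Xs / (Xs.shape[0] - 1)                   # n x n covariance matrix
    lam, W = np.linalg.eigh(C)                          # eigenvalues in ascending order
    order = np.argsort(lam)[::-1]                       # sort descending
    lam, W = lam[order], W[:, order]
    rate = np.cumsum(lam) / np.sum(lam)                 # cumulative contribution rate
    Wp = W[:, :p]                                       # top-p eigenvectors (omega_j)
    T = Xs @ Wp                                         # m x p scores (T_j)
    X_hat = (T @ Wp.T) * sigma + mu                     # inverse transform, de-standardized
    return X_hat, rate[p - 1]

def fuse_rgb(Xr, Xg, Xb):
    """Stack the three reduced band images into one normalized color image."""
    rgb = np.stack([Xr, Xg, Xb], axis=-1).astype(np.float64)
    rgb -= rgb.min()
    return rgb / (rgb.max() + 1e-12)
```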
Next, we conduct a theoretical analysis of computational complexity and storage requirements. In the MWCGI system, the RGB images for the three bands are all of size m × n . By applying PCA for dimensionality reduction, the subsequent computational complexity and storage requirements can be significantly reduced.
Firstly, we analyze the impact of the dimensionality reduction method on computational complexity. For each band, the complexity of calculating the covariance matrix is $O(n^2 m)$, and the complexity of the eigenvalue decomposition is $O(n^3)$. Since there are three bands, the total computational complexity is $O(3 \times (n^2 m + n^3))$.
By applying PCA for dimensionality reduction, we transform the data matrix $X_j$ for each band into a lower-dimensional space $T_j$. In this case, the complexities of the covariance calculation and eigenvalue decomposition are reduced to $O(p^2 m)$ and $O(p^3)$, respectively, giving a total computational complexity of $O(3 \times (p^2 m + p^3))$. Thus, these costs are significantly reduced because $p \ll n$.
Next, we analyze the impact of the dimensionality reduction method on storage requirements. Storing each $m \times n$ image data matrix $X_j$ requires $m \times n$ storage units. For three bands, the total storage requirement is $3 \times (m \times n)$.
By applying PCA for dimensionality reduction, the data matrix for each band is reduced to $m \times p$, requiring $m \times p$ storage units. Additionally, the principal component eigenvector matrix $\omega_j$ needs to be stored, which requires $n \times p$ storage units. For three bands, the total storage requirement is $3 \times (m \times p + n \times p)$. Considering that $p \ll n$, even with the additional $3 \times (n \times p)$ units for the principal component eigenvector matrices, the storage needs after dimensionality reduction are still significantly reduced.
To compare the aforementioned complexities and storage requirements more intuitively, we summarize them in Table 1. This makes the advantages of the dimensionality reduction method in terms of computational complexity and storage requirements clear. In PCA, a small number of principal components $p$ typically suffices to retain most of the information, as the majority of the data’s variance is concentrated in the first few principal components. For smaller data matrices, $p$ may not be dramatically smaller than $n$, although it generally remains much smaller; for larger data matrices, the gap between $p$ and $n$ widens, making the dimensionality reduction effect of PCA more pronounced. Additionally, as the number of bands increases, the advantages of dimensionality reduction in terms of computational complexity and storage needs become even more significant.
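To make the comparison in Table 1 concrete, the short calculation below plugs in the image size and principal-component count used later in this paper (64 × 64 pixels, p = 9, three bands). The counts are only order-of-magnitude figures, since the big-O expressions hide constant factors.

```python
# Worked example of the Table 1 comparison (m = n = 64, p = 9, three bands).
m, n, p, bands = 64, 64, 9, 3

traditional_storage = bands * (m * n)              # 3 x (m x n)       = 12288 units
reduced_storage = bands * (m * p + n * p)          # 3 x (m*p + n*p)   =  3456 units

traditional_ops = bands * (n**2 * m + n**3)        # covariance + eigendecomposition
reduced_ops = bands * (p**2 * m + p**3)

print(traditional_storage, reduced_storage)        # 12288 3456
print(traditional_ops, reduced_ops)                # 1572864 17739
```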
Therefore, our method ensures the quality of the color image reconstruction even under undersampling conditions, while optimizing subsequent computational complexity and storage requirements, achieving efficient data processing.

3. Numerical Simulations and Discussion

To validate the effectiveness of the proposed theory, we conduct a series of numerical simulations to obtain multi-wavelength computational ghost imaging results. We select a 64 × 64 pixel color “house” image as the target object for testing. First, the original color image is decomposed into three separate wavelength bands, red, green, and blue (RGB), resulting in a grayscale image for each wavelength band. Then, the second-order correlation algorithm is used to reconstruct the images for each of these wavelengths independently. Finally, the reconstructed single-wavelength band images are fused to form a color image. The schematic diagram of the numerical simulation process for MWCGI is shown in Figure 2.
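A minimal sketch of this simulation pipeline is given below. It reuses the hypothetical correlation_reconstruction() helper from the sketch in Section 2, and the dictionary layout for the per-band speckle patterns is our own convention.

```python
import numpy as np

def mwcgi_simulation(color_img, speckles_rgb):
    """Split a color image into R, G, B bands, image each band, then fuse.

    color_img    : ndarray (m, n, 3) with values in [0, 1]
    speckles_rgb : dict with keys 'r', 'g', 'b'; each value is an (M, m, n) array
    """
    recon = {}
    for idx, band in enumerate('rgb'):
        obj = color_img[..., idx]                        # grayscale component of this band
        R = speckles_rgb[band]
        bucket = R.reshape(R.shape[0], -1) @ obj.ravel() # simulated bucket signals
        recon[band] = correlation_reconstruction(R, bucket)
    rgb = np.stack([recon['r'], recon['g'], recon['b']], axis=-1)
    rgb -= rgb.min()
    return rgb / (rgb.max() + 1e-12)                     # normalized color reconstruction
```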
We use three types of speckle patterns: random, SVD, and multi-scale speckle patterns [24], and compare the quality of reconstructed color images using the second-order correlation algorithm. Figure 3 shows the results, with Figure 3a–c corresponding to the numerical simulation results of random, SVD, and multi-scale speckle patterns, respectively. From Figure 3, it can be observed that as the number of measurements increases, the reconstructed image information becomes clearer. Although Figure 3a successfully reconstructs the target object, the image quality is relatively poor with noticeable noise issues. In contrast, Figure 3b shows a significant improvement in image quality. Figure 3c shows that the optimized multi-scale speckle patterns significantly enhance the image reconstruction quality, especially at low sampling rates. With 1500 measurements, high-quality image reconstruction is achieved, highlighting the potential of the multi-scale speckle method in improving imaging efficiency.
To quantitatively evaluate the quality of reconstructed images, we use Peak Signal-to-Noise Ratio (PSNR) as the evaluation standard. The definition of PSNR is shown as follows:
$$\mathrm{PSNR} = 10 \times \log_{10}\!\left(\frac{\mathrm{MaxVal}^2}{\mathrm{MSE}}\right) \qquad (12)$$
where $\mathrm{MaxVal}$ represents the maximum gray level of the image, which is 255 for an 8-bit image, and $\mathrm{MSE}$ denotes the mean squared error between the reconstructed image and the original image.
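A direct implementation of this definition (with MaxVal defaulting to 255 for 8-bit images) might look as follows:

```python
import numpy as np

def psnr(reference, reconstruction, max_val=255.0):
    """Peak Signal-to-Noise Ratio between a reference image and a reconstruction."""
    mse = np.mean((np.asarray(reference, dtype=np.float64)
                   - np.asarray(reconstruction, dtype=np.float64)) ** 2)
    if mse == 0:
        return np.inf                      # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```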
Figure 4 presents the PSNR curves for the reconstructed color images using random speckle patterns, SVD speckle patterns, and multi-scale speckle patterns under different measurement times. These curves clearly demonstrate that the quality of the reconstructed images gradually improves with an increase in measurement times. Notably, the PSNR values achieved with the multi-scale speckle patterns are significantly higher than those obtained with random speckle patterns and SVD speckle patterns at low sampling rates. This result is consistent with previous numerical simulation reconstruction results, further confirming the effectiveness of multi-scale speckle patterns in enhancing reconstruction image quality.
Although the multi-wavelength computational ghost imaging method based on optimized multi-scale speckles effectively improves image quality, it also introduces significant challenges in computational complexity and storage space as the number of wavelength bands increases. To address this issue, we introduce dimensionality reduction methods into the multi-wavelength computational ghost imaging system. The workflow of multi-wavelength computational ghost imaging based on feature dimensionality reduction is shown in Figure 5. In this method, we first perform dimensionality reduction on the preliminary reconstructed images for each wavelength band until the cumulative contribution rate of each wavelength band reaches 100%, resulting in optimized single-wavelength band images. The cumulative contribution rate is the ratio of the cumulative eigenvalues of the selected eigenvectors to the total sum of eigenvalues, and it measures the amount of original data information retained by the selected eigenvectors. Subsequently, we perform fusion calculations on these dimensionally reduced images. This process effectively reduces the data dimensionality of multi-wavelength color images while ensuring that the most critical visual information is retained, facilitating subsequent analysis or visualization. This method holds significant practical value in image processing, storage, and analysis.
To validate our method, we conduct PCA numerical simulations on the reconstruction results of multi-scale speckles with 1500 measurements, as shown in Figure 6. In the figure, the p values represent the number of principal components, and the percentages indicate the cumulative contribution rate of the selected principal components. As can be clearly seen from Figure 6, the image quality gradually improves with the increase in the number of principal components. When the first nine principal components are selected, the cumulative contribution rate reaches 100%, indicating that there is a large amount of redundant information in the data. This also validates that our method effectively preserves the main information of the image while removing redundant data.
Subsequently, we perform dimensionality reduction simulations on the data from Figure 3c, with the results shown in Figure 7. Figure 7a displays the multi-wavelength computational ghost imaging reconstruction results using multi-scale speckles, while Figure 7b presents the fused multi-wavelength color ghost imaging reconstruction results after dimensionality reduction. From the figure, we can observe that there is no significant visual difference in image quality before and after dimensionality reduction. This result indicates that by combining multi-scale speckle optimization methods with PCA, we not only reconstruct high-quality color images with a lower number of measurements but also effectively extract key information from the reconstructed images. Additionally, this method significantly reduces the dimensionality of the feature space, thereby decreasing the required storage space and computational complexity. These results further validate the effectiveness and practicality of our approach.
Then, we use the Structural Similarity Index (SSIM) [25] to evaluate the dimensionality-reduced images [as shown in Figure 8], quantifying the similarity between the reconstructed images in Figure 7a,b. The results show that when the number of principal components is 9, the SSIM values for all reconstructed images are 1. This indicates that during the compression and reconstruction process using PCA, the images after dimensionality reduction are very similar to the images before dimensionality reduction, with no significant quality loss. These results further validate the effectiveness and practicality of our method.
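For reference, the SSIM comparison between the images before and after dimensionality reduction (e.g., Figure 7a,b) can be reproduced with scikit-image’s implementation of the metric. The helper below is a sketch assuming a recent scikit-image version (which provides the channel_axis keyword) and treats the last axis as the RGB channel dimension.

```python
import numpy as np
from skimage.metrics import structural_similarity

def ssim_color(img_a, img_b):
    """SSIM between two color reconstructions of the same scene."""
    a = np.asarray(img_a, dtype=np.float64)
    b = np.asarray(img_b, dtype=np.float64)
    data_range = max(a.max() - a.min(), b.max() - b.min(), 1e-12)
    return structural_similarity(a, b, data_range=data_range, channel_axis=-1)
```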

4. Experiments and Discussion

To validate the feasibility of this scheme in practical multi-wavelength computational ghost imaging applications, we conduct a series of experiments. The experiments use a metal sheet with a hollow “CUST” in the center as the target object. The projector model is XGIMI-XE11F, and the photodiode model in the optical detection circuit is PDA100A2. We use the method introduced in the previous section to reconstruct the target object’s image and compare the results of reconstructing the target image under the same conditions. The experimental reconstruction results are shown in Figure 9. Figure 9a–c represent the experimental reconstruction results using random speckles, SVD speckles, and multi-scale speckles, respectively. As shown in the figure, the experimental reconstruction results using random speckles and SVD speckles are poor, with a significant amount of noise. In contrast, the multi-scale speckle method shown in Figure 9c can clearly reconstruct the target object’s image with fewer measurements and achieve higher quality reconstruction with 1500 measurements. Figure 9d shows the experimental reconstruction results using the combined multi-scale speckle and PCA method. From the experimental results, it can be observed that Figure 9c,d exhibit consistent visual effects, indicating that the dimensionally reduced fused image can replace the original image. The effective combination of this multi-scale speckle optimization method and PCA not only reconstructs high-quality images with fewer measurements but also significantly reduces computational and storage complexity. This facilitates computational optimization and effectively improves system efficiency. These experimental results validate the feasibility and practicality of our approach.
To more specifically compare the imaging quality of the experimental results, we calculate their PSNR and SSIM, as shown in Figure 10a,b. Figure 10a shows the variation in the PSNR for random, SVD, and multi-scale speckles at different measurement times [Figure 9a–c]. It can be seen that with an increasing number of measurements, the PSNR of multi-scale speckles is significantly higher than that of random and SVD speckles. Figure 10b displays the SSIM curves for all dimensionally reduced images [Figure 9d] when the number of principal components is p = 9. These curves quantify the similarity between the reconstructed images in Figure 9c,d. It can be observed that the SSIM values of the dimensionally reduced images remain around 1.0, indicating that the structural similarity of the images remains stable in this case. These two curves demonstrate that multi-scale speckles can reconstruct high-quality images even with a low number of measurements, and the imaging quality is significantly better than that of random and SVD speckles. Additionally, the stability of the SSIM curves indicates that the structural similarity of the dimensionally reduced images is well-maintained. This observation is in line with the numerical simulation reconstruction outcomes mentioned earlier, further validating the effectiveness and reliability of our method.

5. Conclusions

In this paper, we introduce a multi-wavelength computational ghost imaging method based on feature dimensionality reduction, aimed at addressing the heavy data processing and poor reconstructed image quality faced by traditional multi-wavelength ghost imaging. We optimize the multi-scale measurement matrices of the red, green, and blue components using singular value decomposition and use them as the illumination speckles. Subsequently, we reconstruct the component images of the target object using the second-order correlation function. Additionally, we apply principal component analysis for feature dimensionality reduction on the reconstructed images and complete the fusion of the images of each wavelength band. This step significantly reduces computational complexity and storage requirements. Simulations and experiments validate that our method not only reconstructs high-quality color images with fewer measurements but also significantly reduces memory usage and computational resource requirements. This makes processing large-scale datasets more efficient while also improving the quality of the reconstructed image. This approach offers significant advantages for practical applications such as remote sensing, medical imaging, and microscopic imaging, and serves as a valuable reference for future research in the field.

Author Contributions

Conceptualization, H.W. and X.W.; methodology, H.W.; software, H.W. and C.G.; validation, H.W. and Y.W.; formal analysis, X.W. and Z.Y.; investigation, Y.W.; resources, H.Z.; data curation, H.W., X.W. and C.G.; writing—original draft preparation, H.W.; writing—review and editing, H.W. and Z.Y.; visualization, H.W. and Y.W.; supervision, Z.Y.; project administration, C.G.; funding acquisition, Z.Y. and C.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the Science & Technology Development Project of Jilin Province (No. YDZJ202101ZYTS030).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Klyshko, D. Combine EPR and two-slit experiments: Interference of advanced waves. Phys. Lett. A 1988, 132, 299–304. [Google Scholar] [CrossRef]
  2. Pittman, T.B.; Shih, Y.; Strekalov, D.; Sergienko, A.V. Optical imaging by means of two-photon quantum entanglement. Phys. Rev. A 1995, 52, R3429. [Google Scholar] [CrossRef] [PubMed]
  3. Zhao, C.; Gong, W.; Chen, M.; Li, E.; Wang, H.; Xu, W.; Han, S. Ghost imaging lidar via sparsity constraints. Appl. Phys. Lett. 2012, 101, 141123. [Google Scholar] [CrossRef]
  4. Gong, W.; Zhao, C.; Yu, H.; Chen, M.; Xu, W.; Han, S. Three-dimensional ghost imaging lidar via sparsity constraint. Sci. Rep. 2016, 6, 26133. [Google Scholar] [CrossRef] [PubMed]
  5. Gong, W.; Han, S. High-resolution far-field ghost imaging via sparsity constraint. Sci. Rep. 2015, 5, 9280. [Google Scholar] [CrossRef] [PubMed]
  6. Gong, W.; Han, S. Experimental investigation of the quality of lensless super-resolution ghost imaging via sparsity constraints. Phys. Lett. A 2012, 376, 1519–1522. [Google Scholar] [CrossRef]
  7. Clemente, P.; Durán, V.; Torres-Company, V.; Tajahuerce, E.; Lancis, J. Optical encryption based on computational ghost imaging. Opt. Lett. 2010, 35, 2391–2393. [Google Scholar] [CrossRef] [PubMed]
  8. Shapiro, J.H. Computational ghost imaging. Phys. Rev. A 2008, 78, 061802. [Google Scholar] [CrossRef]
  9. Bromberg, Y.; Katz, O.; Silberberg, Y. Ghost imaging with a single detector. Phys. Rev. A 2009, 79, 053840. [Google Scholar] [CrossRef]
  10. Sun, B.; Edgar, M.; Bowman, R.; Vittert, L.; Welsh, S.; Bowman, A.; Padgett, M. Differential computational ghost imaging. In Proceedings of the Computational Optical Sensing and Imaging 2013, Arlington, VA, USA, 23–27 June 2013; p. CTu1C–4. [Google Scholar]
  11. Zhang, X.; Meng, X.; Yang, X.; Wang, Y.; Yin, Y.; Li, X.; Peng, X.; He, W.; Dong, G.; Chen, H. Singular value decomposition ghost imaging. Opt. Express 2018, 26, 12948–12958. [Google Scholar] [CrossRef]
  12. Katkovnik, V.; Astola, J. Compressive sensing computational ghost imaging. J. Opt. Soc. Am. A 2012, 29, 1556–1567. [Google Scholar] [CrossRef] [PubMed]
  13. He, Y.; Wang, G.; Dong, G.; Zhu, S.; Chen, H.; Zhang, A.; Xu, Z. Ghost imaging based on deep learning. Sci. Rep. 2018, 8, 6469. [Google Scholar] [CrossRef] [PubMed]
  14. Lyu, M.; Wang, W.; Wang, H.; Wang, H.; Li, G.; Chen, N.; Situ, G. Deep-learning-based ghost imaging. Sci. Rep. 2017, 7, 17865. [Google Scholar] [CrossRef] [PubMed]
  15. Wang, F.; Wang, H.; Wang, H.; Li, G.; Situ, G. Learning from simulation: An end-to-end deep-learning approach for computational ghost imaging. Opt. Express 2019, 27, 25560–25572. [Google Scholar] [CrossRef]
  16. Zhou, C.; Tian, T.; Gao, C.; Gong, W.; Song, L. Multi-resolution progressive computational ghost imaging. J. Opt. 2019, 21, 055702. [Google Scholar] [CrossRef]
  17. Wang, L.; Zhao, S. Fast reconstructed and high-quality ghost imaging with fast Walsh–Hadamard transform. Photonics Res. 2016, 4, 240–244. [Google Scholar] [CrossRef]
  18. Welsh, S.S.; Edgar, M.P.; Jonathan, P.; Sun, B.; Padgett, M.J. Multi-wavelength compressive computational ghost imaging. In Proceedings of the Emerging Digital Micromirror Device Based Systems and Applications V, San Francisco, CA, USA, 5–6 February 2013; Volume 8618, pp. 158–163. [Google Scholar]
  19. Zhang, D.J.; Li, H.G.; Zhao, Q.L.; Wang, S.; Wang, H.B.; Xiong, J.; Wang, K. Wavelength-multiplexing ghost imaging. Phys. Rev. A 2015, 92, 013823. [Google Scholar] [CrossRef]
  20. Huang, J.; Shi, D.; Meng, W.; Zha, L.; Yuan, K.; Hu, S.; Wang, Y. Spectral encoded computational ghost imaging. Opt. Commun. 2020, 474, 126105. [Google Scholar] [CrossRef]
  21. Chen, L.Y.; Zhao, Y.N.; Chen, L.S.; Wang, C.; Ren, C.; Cao, D.Z. Color ghost imaging based on optimized random speckles and truncated singular value decomposition. Opt. Laser Technol. 2024, 169, 110007. [Google Scholar] [CrossRef]
  22. Wang, P.; Wang, C.; Yu, C.; Yue, S.; Gong, W.; Han, S. Color ghost imaging via sparsity constraint and non-local self-similarity. Chin. Opt. Lett. 2021, 19, 021102. [Google Scholar] [CrossRef]
  23. Ni, Y.; Zhou, D.; Yuan, S.; Bai, X.; Xu, Z.; Chen, J.; Li, C.; Zhou, X. Color computational ghost imaging based on a generative adversarial network. Opt. Lett. 2021, 46, 1840–1843. [Google Scholar] [CrossRef] [PubMed]
  24. Wang, H.; Wang, X.Q.; Gao, C.; Liu, X.; Wang, Y.; Zhao, H.; Yao, Z.H. High-quality computational ghost imaging with multi-scale light fields optimization. Opt. Laser Technol. 2024, 170, 110196. [Google Scholar] [CrossRef]
  25. Huang, H.; Zhou, C.; Gong, W.; Song, L. Block matching low-rank for ghost imaging. Opt. Express 2019, 27, 38624–38634. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Schematic diagram of MWCGI.
Figure 2. Schematic diagram of the numerical simulation process for MWCGI.
Figure 3. Numerical simulation results of MWCGI using different speckle patterns under various measurement numbers. Rows (ac) are the numerical simulation results of random speckle patterns, SVD speckle patterns, and multi-scale speckle patterns, respectively.
Figure 4. The numerical curves of PSNR under different measurement times with random speckle patterns, SVD speckle patterns, and multi-scale speckle patterns.
Figure 5. Schematic diagram of the MWCGI process based on feature dimensionality reduction.
Figure 6. PCA numerical simulation results of multi-scale speckles with 1500 measurements.
Figure 7. Numerical simulation results of multi-wavelength ghost imaging based on feature dimensionality reduction. Row (a) is the simulation result of multi-scale speckle patterns. Row (b) is the simulation result after dimensionality reduction.
Figure 8. SSIM curve for images with p = 9 under different measurement times.
Figure 9. Multi-wavelength ghost imaging experimental results. Rows (ac) are the experimental results of random speckle patterns, SVD speckle patterns, and multi-scale speckle patterns, respectively. Row (d) is the experimental result after dimensionality reduction.
Figure 10. The numerical curves of PSNR and SSIM under different measurement times. (a) PSNR curves under different measurement times with random, SVD, and multi-scale speckle patterns. (b) SSIM curve for images with p = 9 under different measurement times.
Table 1. Comparison of computational complexity and storage requirements between traditional and proposed methods.

| Method | Covariance Matrix Complexity | Eigenvalue Decomposition Complexity | Overall Computational Complexity | Matrix Storage Requirements | Principal Component Vector Matrix Storage Requirements | Overall Storage Requirements |
| Traditional Method | $O(3 \times n^2 m)$ | $O(3 \times n^3)$ | $O(3 \times [n^2 m + n^3])$ | $3 \times (m \times n)$ | N/A | $3 \times (m \times n)$ |
| Our Method | $O(3 \times p^2 m)$ | $O(3 \times p^3)$ | $O(3 \times [p^2 m + p^3])$ | $3 \times (m \times p)$ | $3 \times (n \times p)$ | $3 \times [m \times p + n \times p]$ |