Communication

Lensless Imaging via Blind Ptychography Modulation and Wavefront Separation

Cheng Xu, Hui Pang, Axiu Cao, Qiling Deng, Song Hu and Huajun Yang
1 School of Physics, University of Electronic Science and Technology of China, Chengdu 610054, China
2 Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China
* Authors to whom correspondence should be addressed.
Photonics 2023, 10(2), 191; https://doi.org/10.3390/photonics10020191
Submission received: 12 January 2023 / Revised: 5 February 2023 / Accepted: 9 February 2023 / Published: 10 February 2023
(This article belongs to the Special Issue Computational Optical Imaging and Its Applications)

Abstract

A novel lensless imaging approach based on ptychography and wavefront separation is proposed in this paper, which is characterized by rapid convergence and high-quality imaging. In this method, an amplitude modulator was inserted between the light source and the sample for light wave modulation. By laterally translating this unknown modulator to different positions, we acquired a sequence of modulated intensity images for quantitative object recovery. In addition, to effectively separate the object and modulator wavefronts, a couple of diffraction patterns without modulation were recorded. Optical experiments were performed to verify the feasibility of our approach by testing a resolution plate, a phase object, and an agaricus cell.

1. Introduction

Light detectors such as CCD and CMOS sensors record only the intensity distribution of light waves; the phase information is lost. Lensless imaging is a technique to recover the complete wavefront from single or multiple intensity measurements, which is essential in various fields such as quantitative phase microscopy [1,2], biomedical imaging [3,4], and three-dimensional imaging [5,6].
As a representative approach of the lensless imaging technique, ptychography [7,8,9,10,11,12,13,14,15,16] records several tens of diffraction patterns by laterally scanning the object with a localized coherent beam over a grid of positions. However, for smooth or weakly scattering objects, the reconstruction is hampered by inadequate intensity variation [17]. To address this problem, Jiang et al. proposed inserting a thin diffuser between the object and the image sensor and then blindly scanning the unknown diffuser to different x-y positions [18]. Zhang et al. used Scotch tape and a galvo-scanner to achieve a compact, cost-effective, and field-portable lensless imaging platform [19]. For many biomedical applications, mechanical scanning is not desirable [20]. Thus, Lu et al. employed a binary mask and a light-emitting diode (LED) array for sample illumination [21]; multiple intensity patterns could be captured by illuminating the sample with different LED elements. In these methods, a diffuser or a binary mask was adopted to produce significant intensity changes and promote algorithm convergence. However, hundreds of measurements are required, which greatly increases the time for image acquisition as well as the computational load.
To reduce the number of measurements and enable faster imaging, Zhang et al. placed a phase plate between the object and the CCD and shifted the plate transversely [22]. The number of recordings could be reduced to three, although the phase plate needed to be precisely known in advance. Wen et al. combined the ptychographic iterative engine (PIE) algorithm with the idea of one-dimensional scanning to recover an object and a diffuser simultaneously with only ~15 patterns [23].
In this paper, we placed a binary amplitude mask upstream of the object and recorded multiple intensity maps by laterally moving the mask in one dimension. Compared with a diffuser, a pure amplitude modulator requires only an amplitude reconstruction, which effectively reduces the number of unknown variables and hence the number of measurements. Inspired by the wavefront separation idea [24], diffraction patterns of the object at different distances without the mask were also recorded; thus, the modulating wavefront and the object wavefront could be effectively separated to further accelerate the convergence.

2. Methods

The principle of our method is illustrated in Figure 1. The laser source passed through the collimating lens to form a plane wave; the collimated beam then illuminated the random binary amplitude mask and produced a speckle field on the sample. The scattered wave carried the information of the sample and freely propagated to the detector plane. The mask was then moved perpendicular to the optical axis, resulting in a change in the captured modulated intensity. Each time the mask moved, one diffraction pattern was recorded. In addition, we removed the mask and acquired defocused diffraction patterns by translating the detector along the optical axis, aiming to separate the object and modulator wavefronts.
The specific calculation procedure to reconstruct the sample was as follows.
The distribution of the mask M(x,y) was initially set to be a constant matrix. Light traveled through the mask and propagated to the sample, yielding a modulated wave p(x,y):
$$p(x,y)=\mathrm{PSF}_{z_1}\{M(x,y)\}, \tag{1}$$
where z1 is the distance between the mask and the sample and PSF_z{·} denotes the point-spread function (free-space propagation) operator over distance z. The exiting wave T(x,y) from the sample was given by
$$T(x,y)=p(x,y)\,o(x,y), \tag{2}$$
where o(x,y) is the distribution of the sample; its initial guess was defined as a constant matrix. The wavefront T(x,y) was then propagated to the detector plane, and the field there could be calculated as
$$D(x,y)=\mathrm{PSF}_{z_2}\{T(x,y)\}, \tag{3}$$
where z2 is the distance between the sample and the detector. In the detector plane, the magnitude constraint was applied and D(x,y) was updated as
$$\tilde{D}(x,y)=\sqrt{I(x,y)}\,\frac{D(x,y)}{\left|D(x,y)\right|}, \tag{4}$$
where I(x,y) is the recorded intensity. The estimated field $\tilde{D}(x,y)$ was propagated back to the sample plane and T(x,y) was renewed as
$$\tilde{T}(x,y)=\mathrm{PSF}_{-z_2}\{\tilde{D}(x,y)\}. \tag{5}$$
The sample and the modulated light field were then updated with the ePIE-type update functions [8]:
$$\tilde{o}(x,y)=o(x,y)+\frac{p^{*}(x,y)\left[\tilde{T}(x,y)-T(x,y)\right]}{\left|p(x,y)\right|_{\max}^{2}}, \tag{6}$$
$$\tilde{p}(x,y)=p(x,y)+\frac{\tilde{o}^{*}(x,y)\left[\tilde{T}(x,y)-T(x,y)\right]}{\left|\tilde{o}(x,y)\right|_{\max}^{2}}. \tag{7}$$
We applied the amplitude-only constraint; thus, the new mask distribution could be written as
$$M(x,y)=\left|\mathrm{PSF}_{-z_1}\{\tilde{p}(x,y)\}\right|. \tag{8}$$
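To make these update rules concrete, the following is a minimal Python sketch of the detector-plane magnitude constraint (Eq. (4)), the object/illumination updates (Eqs. (6) and (7)), and the amplitude-only mask constraint (Eq. (8)). The function names, the small regularization constant, and the generic propagate() callable are illustrative assumptions rather than the authors' implementation; the conjugate terms follow the ePIE update of Ref. [8].

```python
import numpy as np

def magnitude_constraint(D, I, eps=1e-12):
    """Eq. (4): keep the phase of D but replace its modulus with the measured sqrt(I)."""
    return np.sqrt(I) * D / (np.abs(D) + eps)

def epie_updates(o, p, T_new, T_old):
    """Eqs. (6)-(7): ePIE-style updates of the object o and the modulated field p."""
    diff = T_new - T_old
    o_new = o + np.conj(p) * diff / np.max(np.abs(p)) ** 2
    p_new = p + np.conj(o_new) * diff / np.max(np.abs(o_new)) ** 2
    return o_new, p_new

def amplitude_only_mask(p_new, propagate, z1):
    """Eq. (8): back-propagate the updated field to the mask plane and keep only its amplitude."""
    return np.abs(propagate(p_new, -z1))
```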
The mask was then moved to the next position, and its distribution was given as
$$M(x,y)=M(x-\Delta,y), \tag{9}$$
where Δ is a random lateral shift. The shifted position (x − Δ, y) was determined by a cross-correlation.
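The shift estimation can be sketched as a simple FFT-based cross-correlation. This is an illustrative implementation with integer-pixel accuracy only, not necessarily the exact registration routine used by the authors.

```python
import numpy as np

def estimate_shift(ref, mov):
    """Return (dy, dx) such that mov is approximately ref circularly shifted by (dy, dx)."""
    cross = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(mov))
    peak = np.unravel_index(np.argmax(np.abs(cross)), cross.shape)
    dims = np.array(cross.shape)
    shift = np.array(peak, dtype=int)
    shift[shift > dims // 2] -= dims[shift > dims // 2]   # wrap to signed shifts
    return tuple(int(s) for s in shift)

# Example: a 10-pixel lateral shift is recovered from two synthetic images.
img = np.random.rand(256, 256)
print(estimate_shift(img, np.roll(img, (0, 10), axis=(0, 1))))  # -> (0, 10)
```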
We repeated the above steps until all modulated diffraction intensities had been processed. Several diffraction patterns of the sample recorded without the mask were then used to further update the object. The estimated field at the detector plane could be written as
$$D'(x,y)=\mathrm{PSF}_{d}\{\tilde{o}(x,y)\}, \tag{10}$$
where d is the distance between the sample and the detector. D′(x,y) was then constrained by the recorded intensity:
$$\tilde{D}'(x,y)=\sqrt{I(x,y)}\,\frac{D'(x,y)}{\left|D'(x,y)\right|}. \tag{11}$$
The distribution of the sample was updated by
$$\tilde{o}(x,y)=\mathrm{PSF}_{-d}\{\tilde{D}'(x,y)\}. \tag{12}$$
Formulas (10)–(12) were repeated until all the unmodulated images had been processed. The normalized cross-correlation (NCC) [25] and the root mean square error (RMSE) [26] were used to evaluate the quality of the reconstructed images.
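As a summary of this section, the sketch below strings Eqs. (1)–(12) together into a two-stage iterative loop. It is a simplified illustration, assuming angular spectrum propagation (as stated in Section 5), square arrays, integer mask shifts handled as circular rolls, and constant initial guesses; the intensity stacks, mask shifts, and distances are assumed to be known from the acquisition described above, and the variable names are not taken from the authors' code.

```python
import numpy as np

def propagate(u, z, wl=658e-9, dx=2.74e-6):
    """Angular spectrum propagation of the complex field u over distance z (z < 0: backwards)."""
    fy = np.fft.fftfreq(u.shape[0], dx)
    fx = np.fft.fftfreq(u.shape[1], dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = np.maximum(1.0 - (wl * FX) ** 2 - (wl * FY) ** 2, 0.0)
    H = np.exp(2j * np.pi * z / wl * np.sqrt(arg)) * (arg > 0)   # evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(u) * H)

def reconstruct(I_mod, shifts, I_unmod, dists, z1, z2, n_iter=100):
    """Two-stage loop: modulated recordings (Eqs. (1)-(9)), then unmodulated ones (Eqs. (10)-(12))."""
    shape = I_mod[0].shape
    o = np.ones(shape, dtype=complex)              # initial object guess (constant matrix)
    M = np.ones(shape, dtype=complex)              # initial mask guess at the reference position
    for _ in range(n_iter):
        for I, (dy, dx_pix) in zip(I_mod, shifts):
            Mk = np.roll(M, (dy, dx_pix), axis=(0, 1))              # Eq. (9): shifted mask
            p = propagate(Mk, z1)                                   # Eq. (1)
            T = p * o                                               # Eq. (2)
            D = propagate(T, z2)                                    # Eq. (3)
            Dc = np.sqrt(I) * D / (np.abs(D) + 1e-12)               # Eq. (4)
            Tc = propagate(Dc, -z2)                                 # Eq. (5)
            o = o + np.conj(p) * (Tc - T) / np.max(np.abs(p)) ** 2  # Eq. (6)
            p = p + np.conj(o) * (Tc - T) / np.max(np.abs(o)) ** 2  # Eq. (7)
            Mk = np.abs(propagate(p, -z1))                          # Eq. (8): amplitude-only mask
            M = np.roll(Mk, (-dy, -dx_pix), axis=(0, 1))            # store mask at reference position
        for I, d in zip(I_unmod, dists):                            # wavefront separation stage
            D = propagate(o, d)                                     # Eq. (10)
            Dc = np.sqrt(I) * D / (np.abs(D) + 1e-12)               # Eq. (11)
            o = propagate(Dc, -d)                                   # Eq. (12)
    return o, M
```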

3. Simulation

To investigate the performance of our method, a numerical simulation was carried out. In the simulation, the wavelength of the laser source was 658 nm and the sampling interval of the camera was 2.74 μm. The distance between the mask and the sample was 10 mm and the camera was placed 20 mm away from the sample. The amplitude image “sunflower” and the phase image “fruits” constituted the simulated ground truth, as shown in Figure 2(a1,a2). The pixel number of the images was 256 × 256. The amplitude was normalized and the phase was scaled to the range of [0, π]. Zeros were then padded around the images, bringing the total computational area to 512 × 512 pixels. The random binary amplitude mask was designed to be 512 × 1600 pixels and each small cell of the mask occupied 8 × 8 pixels. The mask was moved 6 times with a random step of between 5 and 15 pixels, and 7 modulated diffraction intensities were recorded. Thereafter, the mask was removed and two unmodulated intensities were captured with corresponding sample-to-camera distances of 10 mm and 5 mm, respectively.
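For reference, the simulated modulator described above can be generated as follows. This is only an illustrative sketch of the stated parameters (a 512 × 1600-pixel binary mask with 8 × 8-pixel cells and six random 1-D steps of 5–15 pixels), not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random binary amplitude mask: 512 x 1600 pixels built from 8 x 8-pixel cells.
cell = 8
coarse = rng.integers(0, 2, size=(512 // cell, 1600 // cell))
mask = np.kron(coarse, np.ones((cell, cell)))

# Six random 1-D steps of 5-15 pixels give seven mask positions (including the start).
steps = rng.integers(5, 16, size=6)
positions = np.concatenate(([0], np.cumsum(steps)))
print(positions)   # lateral mask offsets in pixels, e.g. [ 0  9 23 ...]
```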
To verify the effectiveness of the proposed method, a comparison with the original amplitude-constraint ePIE (ac-ePIE) method [23] was also made. The number of iterations was 100 for both methods. The retrieved results with ac-ePIE and our method are shown in Figure 2c,d, respectively. Visually, the patterns recovered with ac-ePIE were blurry, whereas our method recovered clearer results. Numerically, the RMSE of the reconstructed amplitude was 0.1337 for ac-ePIE and 0.0207 for our method, which indicated that our method recovered the objects with higher accuracy.
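The RMSE and NCC figures quoted here can be reproduced with metrics of the following form; the exact normalisations used in Refs. [25,26] may differ slightly, so this is only a sketch computed on the recovered amplitudes.

```python
import numpy as np

def rmse(recovered, truth):
    """Root mean square error between recovered and ground-truth amplitudes."""
    return np.sqrt(np.mean((np.abs(recovered) - np.abs(truth)) ** 2))

def ncc(recovered, truth):
    """Normalized cross-correlation between recovered and ground-truth amplitudes."""
    a = np.abs(recovered) - np.abs(recovered).mean()
    b = np.abs(truth) - np.abs(truth).mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```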
Figure 3 shows the iterative convergence curve of the two methods. From the view of the RMSE and NCC curves, our method jumped out of the stagnation stage much earlier and converged faster to a smaller RMSE value and a higher NCC value. From the trend of the NCC curve, our method converged after 25 iterations whereas ac-ePIE did not converge before 100 iterations. The results proved that our method held a faster convergence speed and a higher retrieval fidelity.
The influence of the number of recorded diffraction intensities on the reconstruction accuracy was analyzed and compared. Figure 4 shows the error of the retrieved amplitude with the different numbers of captured diffraction patterns. The number of unmodulated patterns was fixed to 2 and the number of modulated images was changed from 1 to 13. It could be seen that when the number of total diffraction patterns was 7, our method was capable of recovering high-precision results. With ac-ePIE, it seemed that at least 12 diffraction patterns were required for a satisfactory reconstruction. Therefore, our method was superior to ac-ePIE in terms of the convergence speed, accuracy, and measurement time.

4. Experiments

To further verify the effectiveness of the proposed method, experiments were carried out; the experimental setup is shown in Figure 5. A collimated laser beam with a wavelength of 658 nm was shaped by an aperture and was incident on the polarizer. The combination of a polarizer and a polarizing beam splitter (PBS) enabled the SLM (HDSLM80R, UPOlabs; contrast better than 1000:1) to work in a pure amplitude modulation mode. The SLM, with a pixel size of 8 μm × 8 μm and a pixel number of 1920 × 1200, was used to load the random mask distribution. The laser beam was modulated by the SLM and then illuminated the sample. The diffraction patterns were captured by a CCD camera with a pixel size of 2.74 μm × 2.74 μm and a pixel number of 4504 × 4504. One random binary mask pattern with a pixel number of 1920 × 2000 was generated; the size of the unit cell of the mask was 64 μm × 64 μm. The distance between the SLM and the sample was 198.6 mm and the distance between the sample and the CCD was 17.6 mm. The distances were measured with the aid of a diffraction grating; further details of this method can be found in [27]. After recording 8 modulated intensities, a blank image was loaded onto the SLM to record one unmodulated diffraction intensity. Another unmodulated diffraction intensity was then recorded by changing the relative distance between the sample and the CCD; the new distance between the sample and the CCD was 23.5 mm. There was a relative lateral deviation between the two unmodulated intensities. To solve this problem, we used a cross-correlation to calculate the position deviation and then aligned the two images before performing the subsequent diffraction calculation [28].
A 2.5 inch USAF resolution target was first used as the amplitude sample. Figure 6a,c show one of the raw images captured for ac-ePIE and our method, respectively. The amplitude distributions recovered after 300 iterations with the two methods are shown in Figure 6b,d, respectively. There was a degree of residual noise in the ac-ePIE result; the line in group 5, element 5 (line width of 9.84 μm) in the enlarged part of Figure 6b can be discerned. In our method, the line in group 6, element 6 (line width of 4.38 μm), marked with the red line in the enlarged part of Figure 6d, was clearly recovered. The experimental resolution with our method was close to the theoretical diffraction limit of the proposed imaging system, given by δ = λ/NA ≈ 2λz/l ≈ 2.5 μm, where l is the size of the sample. The experimental results showed that the reconstruction accuracy of the proposed method was significantly improved compared with ac-ePIE.
A binary phase object was then used to evaluate the quantitative phase imaging capability of the proposed method. The phase sample was fabricated on a silica substrate by reactive ion etching and consisted of many diverging fan shapes. Figure 7a,c show one of the captured raw images for ac-ePIE and our method, respectively. Figure 7b,d are the reconstructed results with ac-ePIE and our method, respectively. In our method, the boundaries of the fan were clearly distinguishable and the phase variation was also obvious, whereas the result from ac-ePIE was unable to quantify the phase information. The curves in Figure 7e,f show the cross-sectional data along the lines marked in Figure 7b,d. With our method, the recovered phase was in good agreement with the ground-truth height of the phase target.
An agaricus cell was used as a sample to further verify the capability of microscopic imaging. Figure 8(b1,b2) show the reconstructed results with ac-ePIE and our method, respectively. The reconstruction result of ac-ePIE had significant noise, whereas the result of the proposed method had a clean background. From the locally enlarged images shown in Figure 8(a1,a2) and Figure 8(c1,c2), we could see that the inner morphology was successfully reconstructed with our method and was in good agreement with the microscope images taken with a 5× objective, shown in Figure 8(a3,c3). The images obtained in both ways were consistent and showed similar microscopic details. In addition, our method achieved a field of view of 6 mm2, whereas only 1.5 mm2 could be observed under the 5× microscope objective.

5. Discussion

In the coherent diffraction imaging technique, it is important to analyze the effect of the unit size of the mask on the reconstruction performance. As the diffraction calculation adopted angular spectrum propagation, the smallest unit size of the mask was set to be the same as the sampling interval (δ = 2.74 μm) of the CCD. The other parameters were the same as in the simulation. Figure 9 shows the RMSE and NCC iteration convergence curves with different unit sizes of the mask. A total of 5 masks with different cell sizes were tested: 1δ, 2δ, 4δ, 8δ, and 16δ (each factor divides 512 exactly). Under the modulation of the five different unit-size masks, both the RMSE and NCC curves showed that the error of the amplitude reconstruction eventually converged to a stable value and that the number of iterations required to achieve stability was very similar. This showed that the unit size of the mask had little influence on the reconstruction accuracy and speed.
In the same way, Figure 10 shows the retrieved curves employing ac-ePIE with masks of different unit sizes. A total of 16 diffraction intensities were used. The iteration results indicated that as the unit size of the mask increased, the iteration converged faster. However, when the cell size of the mask grew to 16δ, its convergence accuracy was slightly inferior to the other cases. The reason can be interpreted from Figure 11. Figure 11a,b show the real and the corresponding recovered phase masks of different cell sizes with ac-ePIE, respectively. There was a lot of scattered noise in the recovery results for the small unit-size phase masks; therefore, more iterations were needed to obtain the optimal solution of the phase values. The larger unit-size mask yielded a more accurate contour recovery, but its phase values deviated considerably; thus, in this case, the convergence accuracy of the target was poor. Figure 11c,d show the real and the retrieved amplitude mask distributions of different unit sizes in our method, respectively. It can be seen that the retrieved masks were evenly distributed and closer to the true values; thus, the reconstruction of the target was independent of the unit size of the amplitude mask. This also indicated that the amplitude mask was more conducive to the stability of the algorithm than the phase mask. Selecting a mask with a larger unit size in the experiment could effectively reduce the error caused by the mismatch between the pixel sizes of the CCD and the SLM.
Another consideration in this imaging model was the choice to move the mask instead of the object. In theory, moving the modulator is equivalent to moving the object. In the experiment, however, moving the modulator whilst keeping the sample stationary kept the light field information of the sample at the same position on the detector plane, whereas moving the sample introduced translation errors. For example, the position deviation between the first and the second diffraction intensity of an object was calculated to be 10 pixels (27.4 μm) by cross-correlation, but the accuracy of the cross-correlation calculation largely depended on the ability of the camera to record the complete light field. The actual offsets deviated by ±2 pixels (±5.48 μm), which meant that the reconstructed result from the first diffraction intensity could not be exactly registered with the result from the second diffraction intensity. This led to aliasing of the finer linewidths and thus a decreased resolution. There was no such problem when only the modulator was moved. Compared with the complexity of the sample distribution, the structure of the modulator was simpler, containing only values of 0 and 1; its light field was easier to recover, which could effectively reduce the position error caused by the translation.
In the proposed method, one of the factors affecting the imaging resolution was the sampling interval of the camera. In general, the resolution of the imaging system was about twice the sampling interval of the camera. When the diffraction limit was larger than twice the sampling interval of the camera, the resolution of the system was expected to equal the diffraction limit; when the diffraction limit was smaller than twice the sampling interval of the camera, we believed that the resolution of the system could not exceed the diffraction limit. In the experiment, the diffraction limit of the system was 2.5 μm, the sampling interval of the camera was 2.74 μm, and the imaging resolution was 4.38 μm, which was consistent with this analysis. One way to push beyond this limit is to use a super-resolution imaging algorithm [29]. We introduced the super-resolution algorithm into our imaging system and found that the resolution improved to 2.76 μm. The results are shown in Figure 12. Figure 12a shows the resolution plate recovered without super-resolution using our method; the line in group 6, element 6 (line width of 4.38 μm), marked with a white line in the enlarged part, could be discerned. Combined with the super-resolution algorithm, the line in group 7, element 4 (line width of 2.76 μm), marked with a red line in the enlarged part of Figure 12b, was clearly recovered.

6. Conclusions

In conclusion, a new lensless imaging method using a random binary amplitude mask was proposed. With the aid of ptychography and the wavefront separation idea, our method achieved a higher reconstruction accuracy and a faster convergence speed than the original method. Numerical simulations and experimental results demonstrated the feasibility of this technique for recovering amplitude objects, phase objects, and biological samples. The proposed method is simple and does not require a complex algorithm, and it has potential applications in the field of lensless imaging.

Author Contributions

Conceptualization, C.X. and H.P.; Methodology, C.X. and H.P.; Software, C.X. and A.C.; Validation, C.X.; Writing—original draft preparation, C.X.; Writing—review and editing, H.P. and S.H.; Supervision, Q.D. and H.Y.; Funding acquisition, H.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (61905251 and 11574042) and the Natural Science Foundation of Sichuan Province (2022NSFSC0561).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Takazawa, S.; Kang, J.; Abe, M.; Uematsu, H.; Ishiguro, N.; Takahashi, Y. Demonstration of single-frame coherent X-ray diffraction imaging using triangular aperture: Towards dynamic nanoimaging of extended objects. Opt. Express 2021, 29, 14394–14402.
2. Kang, J.; Takazawa, S.; Ishiguro, N.; Takahashi, Y. Single-frame coherent diffraction imaging of extended objects using triangular aperture. Opt. Express 2021, 29, 1441–1453.
3. He, X.; Veetil, S.P.; Jiang, Z.; Kong, Y.; Wang, S.; Liu, C. High-speed coherent diffraction imaging by varying curvature of illumination with a focus tunable lens. Opt. Express 2020, 28, 25655–25663.
4. Almoro, P.F.; Pedrini, G.; Gundu, P.N.; Osten, W.; Hanson, S.G. Phase microscopy of technical and biological samples through random phase modulation with a diffuser. Opt. Lett. 2010, 35, 1028–1030.
5. Jiang, H.; Song, C.; Chen, C.-C.; Xu, R.; Raines, K.S.; Fahimian, B.P.; Lu, C.-H.; Lee, T.-K.; Nakashima, A.; Urano, J.; et al. Quantitative 3D imaging of whole, unstained cells by using X-ray diffraction microscopy. Proc. Natl. Acad. Sci. USA 2010, 107, 11234–11239.
6. Kocsis, P.; Shevkunov, I.; Katkovnik, V.; Egiazarian, K. Single exposure lensless subpixel phase imaging: Optical system design, modeling, and experimental study. Opt. Express 2020, 28, 4625–4637.
7. Rodenburg, J.M.; Faulkner, H.M.L. A phase retrieval algorithm for shifting illumination. Appl. Phys. Lett. 2004, 85, 4795–4797.
8. Maiden, A.; Johnson, D.; Li, P. Further improvements to the ptychographical iterative engine. Optica 2017, 4, 736–745.
9. Li, M.; Bian, L.; Zheng, G.; Maiden, A.; Liu, Y.; Li, Y.; Suo, J.; Dai, Q.; Zhang, J. Single-pixel ptychography. Opt. Lett. 2021, 46, 1624–1627.
10. Sun, A.; He, X.; Kong, Y.; Cui, H.; Song, X.; Xue, L.; Wang, S.; Liu, C. Ultra-high speed digital micro-mirror device based ptychographic iterative engine method. Biomed. Opt. Express 2017, 8, 3155–3162.
11. Maiden, A.M.; Rodenburg, J.M. An improved ptychographical phase retrieval algorithm for diffractive imaging. Ultramicroscopy 2009, 109, 1256–1262.
12. Baksh, P.D.; Ostril, M.; Miszczak, M.; Pooley, C.; Brocklesby, W.S. Quantitative and correlative extreme ultraviolet coherent imaging of mouse hippocampal neurons at high resolution. Sci. Adv. 2020, 6, eazz3025.
13. Tanksalvala, M.; Porter, C.L.; Esashi, Y.; Wang, B.; Jenkins, N.W.; Zhang, Z.; Miley, G.P.; Knobloch, J.L.; McBennett, B.; Horiguchi, N.; et al. Nondestructive, high-resolution, chemically specific 3D nanostructure characterization using phase-sensitive EUV imaging reflectometry. Sci. Adv. 2021, 7, eadb9667.
14. Eschen, W.; Loetgering, L.; Schuster, V.; Kals, R.; Kirsche, A.; Berthold, L.; Steinert, M.; Pertsch, T.; Gross, H.; Krause, M.; et al. Material-specific high-resolution table-top extreme ultraviolet microscopy. Light Sci. Appl. 2022, 11, 117.
15. Brooks, N.J.; Wang, B.; Binnie, L.; Tanksalvala, M.; Esashi, Y.; Knobloch, J.L.; Nguyen, Q.L.D. Temporal and spectral multiplexing for EUV multibeam ptychography with a high harmonic light source. Opt. Express 2022, 30, 30331–30346.
16. Wang, B.; Brooks, N.J.; Johnsen, P.C.; Jenkins, N.W.; Esashi, Y.; Binnie, Y.; Tanksalvala, M.; Kapteyn, H.C.; Murnane, M.M. High-fidelity ptychographic imaging of highly periodic structures enabled by vortex high harmonic beams. arXiv 2023, arXiv:2301.05563.
17. Abregana, T.J.T.; Almoro, P.F. Phase retrieval by amplitude modulation using digital micromirror device. Opt. Lasers Eng. 2022, 150, 106851.
18. Jiang, S.; Zhu, J.; Song, P.; Guo, C.; Bian, Z.; Wang, R.; Huang, Y.; Wang, S.; Zhang, H.; Zheng, G. Wide-field, high-resolution lensless on-chip microscopy via near-field blind ptychographic modulation. Lab Chip 2020, 20, 1058–1065.
19. Zhang, H.; Bian, Z.; Jiang, S.; Liu, J.; Song, P.; Zheng, G. Field-portable quantitative lensless microscopy based on translated speckle illumination and sub-sampled ptychographic phase retrieval. Opt. Lett. 2019, 44, 1976–1979.
20. Zhang, Z.; Zhou, Y.; Jiang, S.; Guo, K.; Hoshino, K.; Zhong, J.; Suo, J.; Dai, Q.; Zheng, G. Invited article: Mask-modulated lensless imaging with multi-angle illuminations. APL Photonics 2018, 3, 060803.
21. Lu, C.; Zhou, Y.; Guo, Y.; Jiang, S.; Zhang, Z.; Zheng, G.; Zhong, J. Mask-modulated lensless imaging via translated structured illumination. Opt. Express 2021, 29, 12491–12501.
22. Zhang, F.; Pedrini, G.; Osten, W. Phase retrieval of arbitrary complex-valued fields through aperture plane modulation. Phys. Rev. A 2007, 75, 043805.
23. Wen, X.; Geng, Y.; Zhou, X.; Tan, J.; Liu, S.; Tan, C.; Liu, Z. Ptychography imaging by 1-D scanning with a diffuser. Opt. Express 2020, 28, 22658–22668.
24. Kocsis, P.; Shevkunov, I.; Katkovnik, V.; Rekola, H.; Egiazarian, K. Single-shot pixel super-resolution phase imaging by wavefront separation approach. Opt. Express 2021, 29, 43662–43678.
25. Shen, C.; Tan, J.; Wei, C.; Liu, Z. Coherent diffraction imaging by moving a lens. Opt. Express 2016, 24, 16520–16529.
26. Xu, C.; Pang, H.; Cao, A.X.; Deng, Q.L.; Yang, H. Phase retrieval by random binary amplitude modulation and ptychography principle. Opt. Express 2022, 30, 14505–14517.
27. Xu, C.; Pang, H.; Cao, A.X.; Deng, Q.L. Enhanced multiple-plane phase retrieval using a transmission grating. Opt. Lasers Eng. 2022, 149, 106810.
28. Guo, C.; Li, Q.; Wei, C.; Tan, J.; Liu, S.; Liu, Z. Axial multi-image phase retrieval under tilt illumination. Sci. Rep. 2017, 7, 7562.
29. Maiden, A.M.; Humphry, M.J.; Zhang, F.C.; Rodenburg, J.M. Superresolution imaging via ptychography. J. Opt. Soc. Am. A 2011, 28, 604–612.
Figure 1. The schematic diagram of the proposed method.
Figure 2. Numerical analysis of our method and ac-ePIE. (a1) The amplitude of the ground truth. (a2) The phase of the ground truth. (b) The collected modulated and unmodulated intensities. (c1) The recovered amplitude with ac-ePIE. (c2) The recovered phase with ac-ePIE. (d1) The recovered amplitude with our method. (d2) The recovered phase with our method.
Figure 3. The RMSE and NCC amplitude convergence curves of our method and ac-ePIE.
Figure 4. The RMSE and NCC amplitude convergence curves with different numbers of recorded intensities.
Figure 5. The experimental setup.
Figure 6. The reconstruction results of the USAF 1951 resolution plate. (a) One of the raw images with ac-ePIE. (b) The recovered result and its enlarged parts with ac-ePIE. (c) One of the raw images with our method. (d) The recovered result and its enlarged parts with our method.
Figure 7. The reconstruction results of a binary phase object. (a) One of the captured raw images with ac-ePIE. (b) The recovered phase with ac-ePIE. (c) One of the captured raw images with our method. (d) The recovered phase with our method. (e) The height profiles along the black lines in (b); the ground truth phase difference of the phase object was 1.94 rad. (f) The height profiles along the red lines in (d); the ground truth height of the phase object was 442 nm.
Figure 8. The reconstruction results of the agaricus cell. (a1,c1) are the enlarged parts of the retrieved result with ac-ePIE. (a2,c2) are the enlarged parts of the retrieved result with our method. (b1) The retrieved results of ac-ePIE. (b2) The retrieved results of our method. (a3,c3) are the microscope images using 5× objectives.
Figure 9. The RMSE and NCC curves with different unit-size masks using our method.
Figure 10. The RMSE and NCC curves with different unit-size masks using ac-ePIE.
Figure 11. (a) Different unit-size real phase masks with ac-ePIE. (b) Different unit-size retrieved phase masks with ac-ePIE. (c) Different unit-size real amplitude masks with our method. (d) Different unit-size retrieved amplitude masks with our method.
Figure 12. (a) The recovered resolution plate without super-resolution using the proposed method. (b) The recovered resolution plate with super-resolution using the proposed method.