Article

Implementation of a Large-Area Diffractive Lens Using Multiple Sub-Aperture Diffractive Lenses and Computational Reconstruction

by Shivasubramanian Gopinath 1,†, Praveen Periysamy Angamuthu 1,†, Tauno Kahro 1,†, Andrei Bleahu 1, Francis Gracy Arockiaraj 1,2, Daniel Smith 3, Soon Hock Ng 3, Saulius Juodkazis 3,4, Kaupo Kukli 1, Aile Tamm 1 and Vijayakumar Anand 1,3,*
1 Institute of Physics, University of Tartu, W. Ostwaldi Str. 1, 50411 Tartu, Estonia
2 PG and Research Department of Physics, The American College, Madurai 625002, India
3 Optical Sciences Centre, Swinburne University of Technology, Melbourne 3122, Australia
4 Tokyo Tech World Research Hub Initiative (WRHI), School of Materials and Chemical Technology, Tokyo Institute of Technology, 2-12-1, Ookayama, Meguro-ku, Tokyo 152-8550, Japan
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Photonics 2023, 10(1), 3; https://doi.org/10.3390/photonics10010003
Submission received: 12 November 2022 / Revised: 15 December 2022 / Accepted: 16 December 2022 / Published: 21 December 2022

Abstract: Direct imaging systems, which create an image of an object on the sensor in a single step, are prone to many constraints, as a perfect image must be recorded within that single step. In designing a high-resolution direct imaging system with a diffractive lens, the outermost zone width reaches either the lithography limit or the diffraction limit itself, imposing challenges in fabrication. However, if the imaging mode is switched to an indirect one consisting of multiple steps, then different possibilities open up. One widely used indirect imaging method employs telescopes in the Golay configuration. In this study, a Golay-like configuration has been adapted to realize a large-area diffractive lens with three sub-aperture diffractive lenses. The sub-aperture diffractive lenses are not required to collect light and focus it to a single point, as in a direct imaging system, but to focus independently on different points within the sensor area. This approach of a Large-Area Diffractive lens with Integrated Sub-Apertures (LADISA) relaxes the fabrication constraints and allows the sub-aperture diffractive elements to have a larger outermost zone width and a smaller area. The diffractive sub-apertures were manufactured using photolithography. The fabricated diffractive element was implemented in indirect imaging mode using non-linear reconstruction and the Lucy–Richardson–Rosen algorithm with synthesized point spread functions. The computational optical experiments revealed improved optical and computational imaging resolutions compared to previous studies.

1. Introduction

Imaging systems and components occupy a significant part of our day-to-day life, from our built-in imager, the eyes, to all vision-enhancement imaging systems and components, such as microscopes, telescopes, web cameras, smartphone cameras, etc. However, most available imaging systems, such as the ones above, fall predominantly into the direct imaging category. A direct imaging system uses a conventional imaging mode consisting of a single step: the image of an object is directly formed by a lens on the sensor [1]. An alternative method is the indirect imaging mode, which, as the name suggests, involves multiple steps to complete the imaging process. Some examples of indirect imaging methods are digital holography [2] and coded aperture imaging [3]; both can be used with coherent as well as incoherent light sources. Coherent light sources are often limited to lab environments, unlike incoherent ones, which have broad applicability. In incoherent digital holography (IDH), the light from an object is split into two beams, differently modulated and interfered to create a hologram, which is processed in the computer to reconstruct the 3D object information [4,5]. Hence, IDH requires two-beam interference, resulting in complicated optical configurations and bulky, heavy optical systems. Some notable optical configurations of IDH are rotational–shearing interferometry [6,7], multiple-viewpoint projection methods [8,9], conoscopic holography [10,11], optical scanning holography [12,13], Fresnel incoherent correlation holography [14,15] and coded aperture correlation holography (COACH) [16]. As implied by the above discussion, indirect imaging requires a complicated system, and the imaging procedure involves multiple steps in comparison to conventional imaging. However, the significantly higher density of information made available in 3D within a few camera recordings, in comparison to the 2D information captured using direct imaging methods, justifies the experimental requirements of IDH.
Like holography, coded aperture imaging (CAI) methods also exhibit advantages in comparison to direct imaging methods. Pinhole imaging systems can be considered the oldest coded direct imaging systems. However, imaging using pinholes has a significantly low light throughput and is, therefore, not suitable for many applications. The history and development of CAI methods in indirect mode are interesting [17,18,19,20]. The main motivation for the beginning of research in CAI was the lack of technology to manufacture lenses for non-visible regions of the electromagnetic spectrum, such as X-rays and gamma rays [17,18]. Dicke and Ables employed a random pinhole array to scatter light from the object and reconstructed it numerically by processing it with a pre-recorded or synthesized point spread function (PSF). Later, CAI was extended to 3D imaging in 2D space and spectrum (x,y,λ), unlike holography, which can record object information in 3D space (x,y,z). Recently, CAI met holography when the COACH technique was developed. In COACH, the recording method involves two-beam interference as in holography, but the reconstruction is similar to CAI, involving a cross-correlation with the PSF. Later, COACH evolved into interferenceless COACH (I-COACH) when two-beam interference was found to be redundant, as the 4D information of the object (x,y,z,λ) is contained in the light scattered from a coded phase mask if the spatial (z) and spectral (λ) PSFs are known [21,22,23,24]. From a single camera shot, the entire 4D information of the object can be reconstructed.
The Golay-type synthetic aperture configuration for telescopes is a powerful approach to achieve super-resolution [25,26]. In Golay-type synthetic aperture telescopes, a single aperture is replaced by multiple sparse sub-apertures, which reduces the cost and complications associated with manufacturing large-area lenses and mirrors [27]. In [27], a Wiener-type deconvolution method was implemented to improve the raw Golay image [28]. Different types of deconvolution methods have been applied to reconstruct the object information recorded in Golay-type imaging systems [29,30]. In this study, indirect imaging concepts and an optical configuration inspired by Golay-type imaging systems have been integrated to solve a fundamental problem associated with the design and manufacturing of large-area diffractive lenses [31].
The design and manufacturing of large-area diffractive lenses for any imaging system, be it telescopes, microscopes, projection or display systems or holography systems, are challenging for the following reasons. The radius of the zones of a diffractive lens is given as $r_n \approx \sqrt{n f \lambda}$, where n is an integer denoting the order of the half-period zone, f is the focal length and λ is the wavelength. The width of the nth zone is, therefore, given as $\Delta_n \approx \sqrt{n f \lambda} - \sqrt{(n-1) f \lambda}$, which can be simplified as $\Delta_n \approx \sqrt{f \lambda}\left(\sqrt{n} - \sqrt{n-1}\right)$. As seen from this expression, as n increases, $\Delta_n$ decreases, and for very large values of n, $\Delta_n$ reaches significantly low values. Consequently, the fabrication of the outermost areas of a diffractive lens is challenging, as these areas have features that are either below the lithography limit or even below the diffraction limit, with polarization sensitivity [32,33].
A widely used solution to this problem, once the lithography or diffraction limit is reached, is to maintain the same period beyond the cut-off radius, i.e., $\Delta_n = \sqrt{f \lambda}\left(\sqrt{m} - \sqrt{m-1}\right)$ for $n \geq m$, where n = m is the cut-off zone at which $\Delta_n$ reaches the lithography limit $\Delta_l$ [34]. In general, the above approximation does not affect the behavior of the diffractive lens, as the variation of $r_n$ with respect to n is non-linear near the central part and nearly linear in the outermost part. However, in high numerical aperture (NA) lenses, it is possible that the lithography limit is reached within the non-linear region of the $r_n$ vs. n variation. In such high-NA cases, the above approximation results in spherical aberrations [35]. The consequences worsen if the diffractive lens is needed in the finite conjugate mode instead of the infinite conjugate mode [35]. The concept figure of the problem is shown in Figure 1.
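The zone-width relations above and the constant-width approximation beyond the lithography limit can be illustrated with a short MATLAB sketch. This is a hedged, minimal example (not the authors' code); the parameter values follow the scenario described in the next paragraph, and the variable names are ours.

```matlab
% Minimal sketch: accurate zone radii/widths of a diffractive lens and the
% constant-width approximation beyond an assumed lithography limit.
f      = 20e-3;      % focal length (m)
lambda = 0.65e-6;    % wavelength (m)
N      = 100;        % number of half-period zones
dlim   = 20e-6;      % assumed lithography limit (m)

n     = 1:N;
w_acc = sqrt(f*lambda)*(sqrt(n) - sqrt(n-1));   % accurate zone widths, Delta_n
r_acc = sqrt(n*f*lambda);                       % accurate zone radii, r_n

m     = find(w_acc < dlim, 1);                  % cut-off zone index, n = m
w_app = w_acc;
w_app(m:end) = w_acc(m);                        % constant width beyond the cut-off
r_app = cumsum(w_app);                          % radii of the approximate lens

plot(n, w_acc*1e6, n, w_app*1e6);
xlabel('zone number n'); ylabel('zone width (\mum)');
legend('accurate', 'approximate');
```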
A diffractive lens was designed for the infinite conjugate mode with f = 20 mm, λ = 0.65 μm and 100 half-period zones. This number of zones was selected assuming the scenario of printing a diffractive lens with an inkjet printer, which has ~1200 dots per inch and a pixel size of about 20 μm. Three scenarios are considered for a high-NA diffractive lens: an ideal diffractive lens, with all zones fabricated according to theory; an approximate diffractive lens, fabricated according to theory until the lithography limit (~20 μm for an inkjet printer) with a constant zone width maintained thereafter; and, finally, a diffractive lens fabricated only up to the lithography limit. The plots of the radius and width of the zones for diffractive lenses with and without the linear approximation are shown in Figure 1a. Images of the binary versions of the accurate diffractive lens, the approximate diffractive lens and the low-NA diffractive lens are shown in Figure 1b–d, respectively. The PSFs of the three cases in Figure 1b–d are shown in Figure 1e–g, respectively. A test object, 'cameraman', was imaged, and the imaging results for the cases in Figure 1b–d are shown in Figure 1h–j, respectively. The imaging process was simulated using a binary phase version (0, π) of the diffractive lens to suppress the unmodulated light, which may cause difficulty in comparing the performances of the three diffractive lenses. As seen from the imaging results, the existing approximation approach certainly improved the imaging resolution, but not to the level of the accurate diffractive lens. However, if only light collection, and not imaging, is the goal, then there is no difference between an approximate and an accurate diffractive lens. The second challenge in manufacturing large-area diffractive lenses is the memory size of the CAD files. In many cases [35], the source files for the CAD files are image files generated directly using computational simulation in software such as MATLAB and converted into CAD files using conversion software such as 'LinkCAD', to avoid manually creating thousands of zones zone-by-zone.
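The simulation just described can be sketched in MATLAB as follows. This is a hedged reconstruction under assumed sampling (the grid size, pixel pitch and cut-off radius are our choices, not stated in the text); it builds a binary-phase (0, π) diffractive lens and computes its focal-plane PSF with a single-FFT Fresnel propagation.

```matlab
% Hedged sketch (assumed grid; not the authors' script): binary-phase (0, pi)
% diffractive lens and its simulated focal-plane PSF.
Npix   = 1250;  dx = 2e-6;                       % assumed grid and pixel size
f      = 20e-3; lambda = 0.65e-6;
x      = (-Npix/2:Npix/2-1)*dx;
[X, Y] = meshgrid(x, x);
R      = sqrt(X.^2 + Y.^2);

% Accurate binary lens: phase flips at every half-period zone boundary.
phi    = mod(pi*R.^2/(lambda*f), 2*pi);
lens   = exp(1i*pi*(phi < pi));                  % binary phase values 0 and pi

% Low-NA variant: modulation only inside an assumed lithography cut-off radius
% (its PSF can be computed the same way as below).
rcut   = 0.5e-3;                                 % assumed cut-off radius (m)
lensLo = lens.*(R < rcut) + (R >= rcut);         % unmodulated outside the cut-off

% Focal-plane intensity (PSF) via single-FFT Fresnel propagation over z = f.
Q      = exp(1i*pi*R.^2/(lambda*f));             % Fresnel quadratic phase factor
PSF    = abs(fftshift(fft2(ifftshift(lens.*Q)))).^2;
imagesc(PSF.^0.25); axis image; colormap gray;
```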
In this study, an imaging system inspired by the Golay-type configuration and indirect imaging concepts have been integrated to redefine the imaging problem with a diffractive lens. When a diffractive lens is designed, the radii of the zones are calculated such that the light from every radial zone interferes constructively at a single point. This requirement shrinks the widths of the zones in the outermost areas of the diffractive lens, as higher diffraction angles are required from zones far away from the optical center. By applying the Golay-type configuration, the above condition can be relaxed. Since the image is captured by an image sensor, the condition for imaging is redefined as collecting and focusing light within the image sensor's active area instead of at a single point. This new condition allows the design of sub-aperture diffractive lenses with a low NA and their integration into a larger diffractive lens. This new diffractive lens can collect all the spatial frequencies within the full aperture and accumulate them within the sensor area. This new diffractive lens is called a Large-Area Diffractive lens with Integrated Sub-Apertures (LADISA). However, the recorded image appears blurred due to the overlap of several low-resolution images of the object. Using computational reconstruction methods, the recorded intensity distribution can be reconstructed into a super-resolved image, where the super-resolution is relative to a single sub-aperture and to the approximation approaches discussed above. The above approach also solves the problem of large CAD design files, as the file sizes of the individual sub-aperture diffractive lenses are significantly smaller than that of a single large-area diffractive lens. Unlike I-COACH and CAI, in this study it is not necessary to record a PSF, as it can be easily synthesized from the object intensity distribution, which makes this approach non-invasive.
The remainder of the manuscript consists of five sections. In the next section, on methodology, the design of LADISA and the imaging process are described. The simulation studies are presented in the third section. The fabrication procedure is presented in the fourth section. The experimental results are presented in the fifth section. The final section presents the summary, conclusions and future perspectives of the study.

2. Methodology

The optical configuration of the imaging system is shown in Figure 2. Spatially incoherent light from a distant object is incident on the LADISA with three sub-apertures, and several low-resolution images of the object are formed on the sensor. The diffractive element LADISA is composed of three diffractive lenses approximated by quadratic phase functions, and its phase function is given as $\Psi_{LADISA} = \sum_{a=1}^{3} \exp\left[-j \pi R_a^2/(\lambda f)\right] \times P_a \times \exp\left[j\left(\theta_{xa} + \theta_{ya}\right)\right]$, where $R_a$ is the radial coordinate $R_a = \sqrt{(x - x_a)^2 + (y - y_a)^2}$ and $P_a = \begin{cases} 1, & R_a < r_s \\ 0, & \text{elsewhere} \end{cases}$, where $r_s$ is the radius of the sub-aperture, $\theta_{xa}$ and $\theta_{ya}$ are the angles of the linear phases along the x and y directions, and '×' represents the element-wise product. The distance between the centers of any two sub-apertures is greater than twice the value of $r_s$. Since the object distance $z_s$ is assumed to be very large, the PSF can be approximated as
$$I_{PSF} = \left| \left\{ \sum_{a=1}^{3} \exp\left[-j \pi R_a^2/(\lambda f)\right] \times P_a \times \exp\left[j\left(\theta_{xa} + \theta_{ya}\right)\right] \right\} \otimes \exp\left[j \pi R^2/(\lambda z_h)\right] \right|^2 \quad (1)$$
where '⊗' is a 2D convolution operator and $R = \sqrt{x^2 + y^2}$. It can be seen that when $z_h = f$, the above expression reduces to the squared magnitude of the Fourier transform of the sum of the aperture functions and the linear phases, given as
$$I_{PSF} = \left| \mathcal{F}\left[ \sum_{a=1}^{3} \left\{ \exp\left[j\left(\theta_{xa} + \theta_{ya}\right)\right] \times P_a \right\} \right] \right|^2 \quad (2)$$
where $\mathcal{F}$ is the Fourier transform operator. Since the design ensures that the three image spots do not overlap, the linearity property of the Fourier transform allows Equation (2) to also be written as a sum of intensity distributions from the individual apertures. The above Fourier transform operation on the three apertures and linear phases generates Airy patterns, each with a size of 1.22λf/(2rs), at three different locations in the sensor plane. The size of the Airy pattern obtained by the full aperture is ~1.22λf/(4rs) for the case shown in Figure 2. The individual spot sizes are larger than the spot size obtained from the full aperture, but they carry the information of higher spatial frequencies, which can be retrieved using a suitable computational reconstruction method.
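A minimal MATLAB sketch of Equations (1) and (2) is given below. The sub-aperture centres, radius and linear-phase slopes are assumptions chosen only for illustration, and the lens sign convention follows the reconstruction of Equation (1) above; this is not the authors' design script.

```matlab
% Hedged sketch: LADISA aperture function with three sub-aperture lenses and
% its PSF at z_h = f (Equations (1)-(2)). All geometry values are assumptions.
Npix   = 500;  dx = 10e-6;                      % grid matching Section 3
f      = 0.1;  lambda = 0.65e-6;
x      = (-Npix/2:Npix/2-1)*dx;
[X, Y] = meshgrid(x, x);
rs     = 0.5e-3;                                % assumed sub-aperture radius (m)
cx     = [-1.1e-3   1.1e-3   0     ];           % assumed sub-aperture centres (m)
cy     = [-0.65e-3 -0.65e-3  1.3e-3];
thx    = [ 1 -1  0]*2*pi*2e3;                   % assumed linear-phase slopes (rad/m)
thy    = [ 1  1 -1]*2*pi*2e3;

Psi = zeros(Npix);                              % complex LADISA aperture function
for a = 1:3
    Ra  = sqrt((X - cx(a)).^2 + (Y - cy(a)).^2);
    Pa  = Ra < rs;                              % sub-aperture support P_a
    Psi = Psi + exp(-1i*pi*Ra.^2/(lambda*f)).*Pa.*exp(1i*(thx(a)*X + thy(a)*Y));
end

% At z_h = f each lens phase cancels the Fresnel kernel, so the PSF is three
% displaced Airy-like spots, one per sub-aperture (Equation (2)).
Q    = exp(1i*pi*(X.^2 + Y.^2)/(lambda*f));     % Fresnel kernel for z_h = f
Ipsf = abs(fftshift(fft2(ifftshift(Psi.*Q)))).^2;
imagesc(Ipsf.^0.25); axis image; colormap gray;
```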
Considering a 2D object O at a large distance from the LADISA, similar to the telescopic Golay configuration, the recorded object intensity distribution is given as $I_O = O \otimes I_{PSF}$. Now the challenge is to extract the image of the object O from $I_O$ and $I_{PSF}$. This can be achieved using different types of correlation, such as the matched filter, the phase-only filter [36], the Wiener or inverse filter [37] and the non-linear reconstruction (NLR) method [38]. The NLR approach is a generalized correlation method, of which the matched, phase-only and Wiener filters are special cases. The reconstructed image using NLR is given as
$$I_R = \mathcal{F}^{-1}\left\{ \left|\tilde{I}_{PSF}\right|^{\alpha} \exp\left[-j \arg\left(\tilde{I}_{PSF}\right)\right] \left|\tilde{I}_O\right|^{\beta} \exp\left[j \arg\left(\tilde{I}_O\right)\right] \right\} \quad (3)$$
where $\tilde{I}_{PSF}$ and $\tilde{I}_O$ are the Fourier transforms of $I_{PSF}$ and $I_O$, respectively, and α and β are varied between −1 and 1 until a minimum background noise is obtained. When α = 1 and β = 1, Equation (3) is a matched filter; when α = 0 and β = 1, it is a phase-only filter; and when α = −1 and β = 1, it is a Wiener filter. It has been well established by various studies that NLR performs significantly better than the other filters [28]. While all the above methods use the correlation approach, an alternative way to reconstruct the object information is the Lucy–Richardson algorithm (LRA), which estimates the maximum-likelihood solution iteratively [39,40]. Recently, a computational reconstruction method called the Lucy–Richardson–Rosen algorithm (LRRA) was developed by integrating LRA and NLR [41]. The schematic of LRRA is shown in Figure 3. The LRA consists of a forward convolution between the current estimate and the PSF, and a backward correlation between the PSF and the ratio of $I_O$ to the forward-convolved estimate. The result of this backward correlation is multiplied with the previous estimate, and the process is continued until an optimal solution is obtained. In LRRA, the backward correlation (a matched filter) is replaced by NLR, which not only improves the estimate but also enables rapid convergence. Different studies carried out recently found that LRRA performs better than LRA and NLR if the PSF is symmetric [42,43]. However, NLR is capable of reconstructing object information convolved with both symmetric and asymmetric PSFs [44].
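The two reconstruction methods can be sketched in MATLAB as follows. This is a hedged outline, assuming Io (the recorded intensity) and Ipsf (the recorded or synthesized PSF) are real, non-negative matrices of the same size; it is not the authors' implementation, and details such as shift bookkeeping and normalization are simplified.

```matlab
% Hedged sketch of NLR (Equation (3)) and of one LRRA variant as described in
% the text: Lucy-Richardson with the backward matched filter replaced by NLR.
alpha = 0;  beta = 0.6;                            % tuned for minimum background

% --- Non-linear reconstruction (NLR) ---
Fp = fftshift(fft2(Ipsf));
Fo = fftshift(fft2(Io));
Ir = abs(ifft2(ifftshift( abs(Fp).^alpha .* exp(-1i*angle(Fp)) ...
                       .* abs(Fo).^beta  .* exp( 1i*angle(Fo)) )));

% --- Lucy-Richardson-Rosen algorithm (LRRA) ---
nlr = @(A, B, a, b) abs(ifft2(ifftshift( ...
        abs(fftshift(fft2(A))).^a .* exp(-1i*angle(fftshift(fft2(A)))) .* ...
        abs(fftshift(fft2(B))).^b .* exp( 1i*angle(fftshift(fft2(B)))) )));
est = Io;                                          % initial estimate
for it = 1:15                                      % iteration count as in Section 3
    fwd   = abs(ifft2(fft2(est).*fft2(Ipsf)));     % forward convolution est (*) PSF
    ratio = Io ./ max(fwd, eps);                   % element-wise ratio
    est   = est .* nlr(Ipsf, ratio, alpha, 1);     % NLR replaces the matched filter
end
Ir_lrra = est;
```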

3. Simulation Results

A simulation study was carried out in MATLAB with a matrix size of 500 pixels along the x and y directions, a pixel size Δ = 10 μm, a wavelength λ = 0.65 μm, an object distance zs = ∞, and the focal length of the lens f and the distance between the lens and the sensor zh set to the same value, zh = f = 10 cm. Optical configurations with symmetric and asymmetric PSFs were designed. To obtain a symmetric and an asymmetric PSF, LADISA with four and with three equally spaced sub-apertures were designed, respectively. The PSF and the MTF, given as $\left|\mathcal{F}\left(I_{PSF}\right)\right|$, of three cases, namely the ideal diffractive lens, a single sub-aperture diffractive lens and LADISA with four sub-apertures (the symmetric case), are compared in Figure 4. Unlike direct imaging systems, where the PSF is the image of a point formed by the imaging device, here the PSF shown for the reconstruction methods is the reconstructed image of a point, which is the autocorrelation function. The phase image of an ideal lens, its PSF and its MTF are shown in Figure 4a–c, respectively. The phase image of a sub-aperture diffractive lens, its PSF and its MTF are shown in Figure 4d–f, respectively. The phase image of LADISA, its PSF, the autocorrelation using NLR, the corresponding MTF, the autocorrelation using LRRA and its corresponding MTF are shown in Figure 4g–l, respectively. The MTF obtained with LRRA is broader than that obtained with NLR, indicating that higher spatial frequencies are present in the case of LRRA.
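As a brief, hedged illustration (a sketch, not the authors' script), the MTF curves compared in Figure 4 can be obtained from a simulated PSF, or from the autocorrelation in the NLR/LRRA cases, as the magnitude of its Fourier transform:

```matlab
% Minimal sketch: MTF as the normalized magnitude of the Fourier transform of
% the PSF (or of the autocorrelation, for the NLR/LRRA cases of Figure 4).
mtf = abs(fftshift(fft2(Ipsf)));
mtf = mtf / max(mtf(:));                 % normalize the zero-frequency value to 1
imagesc(mtf); axis image; colormap gray;
```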
A test object, 'Emblem of Tartu University', was used for the further simulation studies, as shown in Figure 5a. The imaging results using the ideal diffractive lens, the sub-aperture diffractive lens and LADISA with four apertures are shown in Figure 5b–d, respectively. The reconstruction results of the object using NLR and LRRA are shown in Figure 5e,f, respectively. The magnified versions of the direct images formed by the ideal diffractive lens and the sub-aperture diffractive lens, and of the reconstruction results of NLR (α = 0, β = 0.6) and LRRA (α = 0, β = 1, 15 iterations), are shown in Figure 5g–j, respectively. The above comparison shows the improved resolution with NLR and LRRA, and LRRA exhibited a better performance in comparison to both the sub-aperture diffractive lens and NLR. However, when the PSF is not symmetric, as in the case of this study where there are only three sub-apertures instead of four, the performance of LRRA is significantly different. The phase image of LADISA with three sub-apertures, its PSF and the imaging result of the test object are shown in Figure 6a–c, respectively. The reconstruction results of NLR (α = 0, β = 0.6) and LRRA (α = 0, β = 1, 15 iterations) are shown in Figure 6d,e, respectively. It can be seen that, in this case, NLR performs better than LRRA. To solve this problem, the PSF and the object intensity distributions were flipped and added to the original images to make them symmetric. The images of the PSF and the reconstruction result from LRRA after this process are shown in Figure 6f,g, respectively. As seen, the reconstruction results are significantly improved, but the field of view is diminished.
To understand the resolution enhancement with LADISA, the simulation was repeated for a test object consisting of only two points separated by 7 pixels, which is beyond the resolution limit of the individual sub-apertures. The imaging result of the object using an ideal diffractive lens with the full aperture is shown in Figure 7a. The imaging result of the object using LADISA with four sub-apertures is shown in Figure 7b. As seen, the images obtained from the individual sub-apertures do not resolve the two points. However, taken collectively in the computational imaging framework, the reconstruction results using NLR and LRRA, shown in Figure 7c,d, respectively, resolve the two points. The line data of Figure 7b–d are normalized and plotted in Figure 7e, which shows the improvement in resolution with NLR and LRRA; LRRA performs better than NLR. The above simulation study demonstrates the resolution enhancement of the Golay configuration in comparison to that of a single sub-aperture.

4. Fabrication Results

The LADISA was designed in MATLAB as a grayscale element with 256 levels varying from 0 to 2π, corresponding to λ ~650 nm, with a maximum theoretical efficiency of 100%. The design consisted of 5000 × 5000 pixels with a pixel size of 2 µm and was saved as a bitmap file. The maximum and minimum widths of the zones were 62 µm and 12 µm, respectively. The thickness of the photoresist needed to achieve the maximum efficiency is given as λ/(nr − 1), where nr is the refractive index of the photoresist. The total diameter of the LADISA was 1 cm, while that of an individual sub-aperture was about 4 mm. The expected resolution gain of LADISA with respect to a single sub-aperture whose center is aligned with the optical axis was >2. The fabrication of LADISA was carried out using photolithography in an ISO 5 cleanroom. Positive photoresist (AR-P 3510T, Allresist, Germany) was spin coated (4000 rpm, 60 s) onto cleaned ITO (indium tin oxide) coated glass substrates and soft baked on a hot plate at 100 °C for 60 s. The adhesion promoter AR 300-80 new (Allresist, Germany) was used to improve the adhesion between the photoresist and the ITO glass. A maskless aligner (Heidelberg Instruments µMLA 100, Germany) with dose control of the 390 nm light source was used to expose the photoresist, and AR 300-44 (Allresist, Germany) was used to develop the UV-irradiated structures. Finally, the LADISA was rinsed with ultrapure water to remove possible residues. The design and the optical microscopy (Nikon Eclipse LV150) image of a section of one of the sub-aperture diffractive lenses are shown in Figure 8a,b, respectively. The sample was coated with a thin layer (~14 nm) of Au by direct-current magnetron sputtering and observed under a scanning electron microscope (SEM, FEI Helios NanoLab 600, FEI, Hillsboro, OR, USA). The SEM image is shown in Figure 8c. The image of the design is shown in Figure 8d and the optical microscopy image of a section of one of the sub-apertures of the fabricated element is shown in Figure 8e.
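For orientation, the photoresist thickness required for full 2π modulation can be estimated from the formula above; the refractive index used here is an assumed, typical resist value, not one stated in the text.

```matlab
% Hedged example: resist thickness t = lambda/(nr - 1) for full 2*pi modulation.
% nr = 1.6 is an assumed, typical photoresist value (not given in the text).
lambda = 650e-9;          % design wavelength (m)
nr     = 1.6;             % assumed refractive index of the resist
t      = lambda/(nr - 1)  % ~1.1e-6 m, i.e. about 1.1 um
```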

5. Experimental Results

The experimental setup used in this study is shown in Figure 9. The setup consists of a spatially incoherent high-power LED source (Thorlabs, 170 mW, λ = 650 nm and Δλ = 20 nm). An iris was placed in front of the light source to control the illuminated area. A negative USAF test target (Thorlabs) was used for the study. The numeral 2 (Group 3, Element 5), with a size of 0.12 × 0.11 mm² and a line thickness of ~40 µm, was critically illuminated by the light source using a refractive lens L1 with a focal length of 50 mm. The light from the object was collimated by another refractive lens L2 with a focal length of 80 mm to create the effect of a large object distance. The light was then collected on an image sensor (Zelux CS165MU/M 1.6 MP monochrome CMOS camera, 1440 × 1080 pixels with a pixel size of <3.5 µm) after modulation by the LADISA with a focal length of 100 mm. The experiment was carried out for LADISA with three and with four sub-apertures and compared with a single sub-aperture. The recorded object intensities (IO), the synthesized PSFs and the corresponding reconstructed images are given in Figure 10.
For the reconstruction, the PSFs were synthesized directly from the recorded images. For this purpose, different point-like features, such as sharp edges, were selected, and the best among them were used for the final reconstruction. The reconstruction parameters for LADISA with three and four apertures were NLR (α = 0 and β = 0.7) and LRRA (α = 0 and β = 0.9 with four iterations). With the addition of more sub-apertures, it is possible to obtain sharper images, and in both cases LRRA performs better than NLR. However, in the case of the three-aperture LADISA, the PSF had to be inverted to perform the LRRA reconstruction; with this modification, the results of LRRA were again better than those of NLR. Additional apertures improve the resolution, and with the four-aperture LADISA, LRRA produces sharper images than in the single- and three-aperture cases.
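A hedged sketch of the PSF synthesis step described above is given below; the window location and size are placeholders for whatever point-like feature is chosen from the recorded image, and the background handling is a simplification rather than the authors' exact procedure.

```matlab
% Hedged sketch (not the authors' procedure): synthesize a PSF by extracting a
% small window around a chosen point-like feature of the recorded image Io.
% The feature location (rc, cc) and half-width hw are assumed placeholders.
[rows, cols] = size(Io);
rc = 520;  cc = 610;  hw = 40;                         % assumed feature position/size
win = double(Io(rc-hw:rc+hw, cc-hw:cc+hw));
win = max(win - median(win(:)), 0);                    % crude background suppression
Ipsf_syn = zeros(rows, cols);
Ipsf_syn(rc-hw:rc+hw, cc-hw:cc+hw) = win;              % synthesized PSF for NLR/LRRA
```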

6. Summary and Conclusions

The realization of high resolution in direct imaging systems is often limited by fabrication capabilities. In the case of diffractive element-based imaging, the outermost zone width of the zone plate is limited by the lithography limit. Secondly, during the writing of large areas, the beam conditions have to be maintained constant over a long period, which is often challenging. Moreover, the memory size of the design file of a single large diffractive lens is significantly high. In this work, we have shown that, by adapting a Golay-like configuration for the fabrication of diffractive elements and by using indirect imaging principles, it is possible to realize a large-area diffractive lens, LADISA. For this purpose, three- and four-aperture LADISA elements were designed and their properties were simulated. Since these apertures are designed to create multiple focused image points, the chances of retrieving higher spatial frequencies are improved manifold. This was shown by the simulation results, and, to further confirm the simulations, three- and four-aperture LADISA were fabricated using photolithography and subjected to experimental analysis. It was found that the four-aperture LADISA performs better than the three-sub-aperture element in the case of LRRA, while NLR showed a similar performance throughout. In the case of symmetric PSFs, LRRA always performs better than NLR. Since spatial aberration correction has already been shown in the case of a refractive lens, in which LRRA performed better than NLR, we believe that the spatial and spectral aberrations associated with diffractive lenses can be corrected with LRRA [42]. Unlike in previous studies with LRRA [41,43], where the PSF was pre-recorded, in this case the PSFs were synthesized computationally. When the PSF is recorded and used for reconstruction, the aberrations in the PSF, which are also imprinted on every object point while recording the object intensity distribution, cancel the reconstruction noise originating from the aberrations. The low reconstruction noise obtained with the synthetic PSFs in Figure 10 indicates that there is no significant fabrication error. However, this method has not yet been applied to cases with a high resolution-enhancement factor (~10); in such cases, the required fabrication resolution is expected to be high, which we will investigate in the near future. This study is limited to a proof-of-concept of indirect imaging using LADISA with NLR and LRRA. While the resolution enhancement is demonstrated in simulation, the enhancement factor was not quantified, as it depends upon multiple variables, such as the number, diameter and locations of the sub-apertures [45]. The method has the potential to expand the resolution limit by a significant factor, which will be investigated in future studies. In the current study, the sub-apertures were manufactured with a diameter of 4 mm to form a full aperture with a diameter of ~1 cm, which can be scaled up using our maskless photolithography [46,47] or diamond turning [48], towards sub-aperture sizes as large as 800 mm [49]. We believe that these preliminary results can be implemented to manufacture portable, low-weight devices with higher resolution, such as telescopes, lensless cameras and microscopes. The reported method can be directly extended to single-plane imaging using coherent light if the low-resolution images do not overlap [50].

Author Contributions

Conceptualization, V.A.; methodology, V.A., A.T., S.J. and K.K.; software, S.G., P.P.A., T.K. and V.A.; validation, S.G., A.B., P.P.A., T.K., A.T., K.K., S.H.N., D.S., F.G.A. and V.A.; formal analysis, V.A., A.T. and K.K.; investigation, S.G., A.B., P.P.A. and T.K.; resources, A.T., K.K. and V.A.; fabrication of LADISA, T.K., A.T. and K.K.; SEM characterization of LADISA, T.K., A.T. and K.K.; data curation, S.G., P.P.A. and T.K.; writing—original draft preparation, V.A., T.K., P.P.A. and S.G.; writing—review and editing, all authors; visualization, V.A., T.K., P.P.A. and S.G.; supervision, V.A., A.T., S.J. and K.K.; project administration, V.A., A.T. and S.J.; funding acquisition, V.A., A.T. and S.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union’s Horizon 2020 research and innovation programme, grant agreement No. 857627 (CIPHR), and the ARC Linkage LP190100505 project. The present study was partially funded by the European Regional Development Fund project “Emerging orders in quantum and nanomaterials” (TK134) and the Estonian Research Agency (PRG4).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank Aravind Simon and Tiia Lillemaa for their administrative support, Helle-Mai Piirsoo for SEM measurements and Peeter Ritslaid for magnetron sputtering. This work acknowledges the ERDF project Centre of Technologies and Investigations of Nanomaterials (NAMUR+, project number 2014-2020.4.01.16-0123) and the NAMUR+ core facility funded project by the Estonian Research Council (TT 13).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Bhandari, A.; Kadambi, A.; Raskar, R. Computational Imaging; MIT Press: Cambridge, MA, USA, 2022.
2. Javidi, B.; Carnicer, A.; Anand, A.; Barbastathis, G.; Chen, W.; Ferraro, P.; Goodman, J.W.; Horisaki, R.; Khare, K.; Kujawinska, M.; et al. Roadmap on digital holography. Opt. Express 2021, 29, 35078–35118.
3. Thomas Cathey, W.; Dowski, E.R. New paradigm for imaging systems. Appl. Opt. 2002, 41, 6080–6092.
4. Tahara, T.; Zhang, Y.; Rosen, J.; Anand, V.; Cao, L.; Wu, J.; Koujin, T.; Matsuda, A.; Ishii, A.; Kozawa, Y.; et al. Roadmap of incoherent digital holography. Appl. Phys. B 2022, 128, 1–31.
5. Rosen, J.; Vijayakumar, A.; Kumar, M.; Rai, M.R.; Kelner, R.; Kashter, Y.; Bulbul, A.; Mukherjee, S. Recent advances in self-interference incoherent digital holography. Adv. Opt. Photonics 2019, 11, 1–66.
6. Murty, M.V.R.K.; Hagerott, E.C. Rotational shearing interferometry. Appl. Opt. 1966, 5, 615–619.
7. Armitage, J.D.; Lohmann, A. Rotary shearing interferometry. Opt. Acta 1965, 12, 185–192.
8. Shaked, N.T.; Katz, B.; Rosen, J. Review of three-dimensional holographic imaging by multiple-viewpoint-projection based methods. Appl. Opt. 2009, 48, H120–H136.
9. Rivenson, Y.; Stern, A.; Rosen, J. Compressive multiple view projection incoherent holography. Opt. Express 2011, 19, 6109–6118.
10. Sirat, G.Y. Conoscopic holography. I. Basic principles and physical basis. J. Opt. Soc. Am. A 1992, 9, 70–83.
11. Mugnier, L.M.; Sirat, G.Y. On-axis conoscopic holography without a conjugate image. Opt. Lett. 1992, 17, 294–296.
12. Schilling, B.W.; Poon, T.C.; Indebetouw, G.; Storrie, B.; Shinoda, K.; Suzuki, Y.; Wu, M.H. Three-dimensional holographic fluorescence microscopy. Opt. Lett. 1997, 22, 1506–1508.
13. Poon, T.C.; Indebetouw, G. Three-dimensional point spread functions of an optical heterodyne scanning image processor. Appl. Opt. 2003, 42, 1485–1492.
14. Rosen, J.; Brooker, G. Digital spatially incoherent Fresnel holography. Opt. Lett. 2007, 32, 912–914.
15. Rosen, J.; Brooker, G. Non-scanning motionless fluorescence three-dimensional holographic microscopy. Nat. Photon. 2008, 2, 190–195.
16. Vijayakumar, A.; Kashter, Y.; Kelner, R.; Rosen, J. Coded aperture correlation holography—a new type of incoherent digital holograms. Opt. Express 2016, 24, 12430–12441.
17. Ables, J.G. Fourier transform photography: A new method for X-ray astronomy. Proc. Astron. Soc. Aust. 1968, 1, 172–173.
18. Dicke, R.H. Scatter-hole cameras for X-rays and gamma rays. Astrophys. J. 1968, 153, L101.
19. Fenimore, E.E.; Cannon, T.M. Coded aperture imaging with uniformly redundant arrays. Appl. Opt. 1978, 17, 337–347.
20. Tsai, T.H.; Brady, D.J. Coded aperture snapshot spectral polarization imaging. Appl. Opt. 2013, 52, 2153–2161.
21. Sahoo, S.K.; Tang, D.; Dang, C. Single-shot multispectral imaging with a monochromatic camera. Optica 2017, 4, 1209–1213.
22. Vijayakumar, A.; Rosen, J. Interferenceless coded aperture correlation holography–a new technique for recording incoherent digital holograms without two-wave interference. Opt. Express 2017, 25, 13883–13896.
23. Antipa, N.; Kuo, G.; Heckel, R.; Mildenhall, B.; Bostan, E.; Ng, R.; Waller, L. DiffuserCam: Lensless single-exposure 3D imaging. Optica 2018, 5, 1–9.
24. Anand, V.; Ng, S.H.; Maksimovic, J.; Linklater, D.; Katkus, T.; Ivanova, E.P.; Juodkazis, S. Single shot multispectral multidimensional imaging using chaotic waves. Sci. Rep. 2020, 10, 13902.
25. Golay, M. Point arrays having compact non-redundant autocorrelations. J. Opt. Soc. Am. 1971, 61, 272–273.
26. Meinel, A.B.; Meinel, M.P. Large sparse-aperture space optical systems. Opt. Eng. 2002, 41, 1983–1994.
27. Miller, N.J.; Dierking, M.P.; Duncan, B.D. Optical sparse aperture imaging. Appl. Opt. 2007, 46, 5933–5943.
28. Vijayakumar, A.; Jayavel, D.; Muthaiah, M.; Bhattacharya, S.; Rosen, J. Implementation of a speckle-correlation-based optical lever with extended dynamic range. Appl. Opt. 2019, 58, 5982–5988.
29. Paykin, I.; Yacobi, L.; Adler, J.; Ribak, E.N. Phasing a segmented telescope. Phys. Rev. E 2015, 91, 023302.
30. Ding, J.; Noshad, M.; Tarokh, V. Complementary lattice arrays for coded aperture imaging. J. Opt. Soc. Am. A 2016, 33, 863–881.
31. Vijayakumar, A.; Bhattacharya, S. Design and Fabrication of Diffractive Optical Elements with MATLAB; SPIE: Bellingham, WA, USA, 2017.
32. Meem, M.; Banerji, S.; Pies, C.; Oberbiermann, T.; Majumder, A.; Sensale-Rodriguez, B.; Menon, R. Large-area, high-numerical-aperture multi-level diffractive lens via inverse design. Optica 2020, 7, 252–253.
33. Wang, L.; Xu, B.B.; Cao, X.W.; Li, Q.K.; Tian, W.J.; Chen, Q.D.; Juodkazis, S.; Sun, H.B. Competition between subwavelength and deep-subwavelength structures ablated by ultrashort laser pulses. Optica 2017, 4, 637–642.
34. Kress, B.C.; Meyrueis, P. Applied Digital Optics: From Micro-Optics to Nanophotonics; John Wiley & Sons: Hoboken, NJ, USA, 2014.
35. Vijayakumar, A.; Bhattacharya, S. Characterization and correction of spherical aberration due to glass substrate in the design and fabrication of Fresnel zone lenses. Appl. Opt. 2013, 52, 5932–5940.
36. Horner, J.L.; Gianino, P.D. Phase-only matched filtering. Appl. Opt. 1984, 23, 812–816.
37. Khireddine, A.; Benmahammed, K.; Puech, W. Digital image restoration by Wiener filter in 2D case. Adv. Eng. Softw. 2007, 38, 513–516.
38. Rai, M.R.; Anand, V.; Rosen, J. Non-linear adaptive three-dimensional imaging with interferenceless coded aperture correlation holography (I-COACH). Opt. Express 2018, 26, 18143–18154.
39. Richardson, W.H. Bayesian-Based Iterative Method of Image Restoration. J. Opt. Soc. Am. 1972, 62, 55–59.
40. Lucy, L.B. An iterative technique for the rectification of observed distributions. Astron. J. 1974, 79, 745.
41. Anand, V.; Han, M.; Maksimovic, J.; Ng, S.H.; Katkus, T.; Klein, A.; Bambery, K.; Tobin, M.J.; Vongsvivut, J.; Juodkazis, S.; et al. Single-shot mid-infrared incoherent holography using Lucy-Richardson-Rosen algorithm. Opto-Electron. Sci. 2022, 1, 210006.
42. Praveen, P.A.; Arockiaraj, F.G.; Gopinath, S.; Smith, D.; Kahro, T.; Valdma, S.-M.; Bleahu, A.; Ng, S.H.; Reddy, A.N.K.; Katkus, T.; et al. Deep Deconvolution of Object Information Modulated by a Refractive Lens Using Lucy-Richardson-Rosen Algorithm. Photonics 2022, 9, 625.
43. Anand, V.; Khonina, S.; Kumar, R.; Dubey, N.; Reddy, A.N.K.; Rosen, J.; Juodkazis, S. Three-dimensional incoherent imaging using spiral rotating point spread functions created by double-helix beams. Nanoscale Res. Lett. 2022, 17, 37.
44. Smith, D.; Gopinath, S.; Arockiaraj, F.G.; Reddy, A.N.K.; Balasubramani, V.; Kumar, R.; Dubey, N.; Ng, S.H.; Katkus, T.; Selva, S.J.; et al. Nonlinear Reconstruction of Images from Patterns Generated by Deterministic or Random Optical Masks—Concepts and Review of Research. J. Imaging 2022, 8, 174.
45. Bulbul, A.; Vijayakumar, A.; Rosen, J. Superresolution far-field imaging by coded phase reflectors distributed only along the boundary of synthetic apertures. Optica 2018, 5, 1607–1616.
46. Veiko, V.P.; Korolkov, V.P.; Poleshchuk, A.G.; Sinev, D.A.; Shakhno, E.A. Laser technologies in micro-optics. Part 1. Fabrication of diffractive optical elements and photomasks with amplitude transmission. Optoelectron. Instrum. Data Process. 2017, 53, 474–483.
47. Koczorowski, W.; Kuświk, P.; Przychodnia, M.; Wiesner, K.; El-Ahmar, S.; Szybowicz, M.; Nowicki, M.; Strupiński, W.; Czajka, R. CMOS-compatible fabrication method of graphene-based micro devices. Mater. Sci. Semicond. Process. 2017, 67, 92–97.
48. Available online: https://www.lightpath.com/capabilities/diamond-turning/ (accessed on 1 November 2022).
49. Atcheson, P.; Stewart, C.; Domber, J.; Whiteaker, K.; Cole, J.; Spuhler, P.; Seltzer, A.; Britten, J.A.; Dixit, S.N.; Farmer, B.; et al. MOIRE—Initial Demonstration of a Transmissive Diffractive Membrane Optic for Large Lightweight Optical Telescopes. In Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave; International Society for Optics and Photonics: Bellingham, WA, USA, 2012; Volume 8442.
50. Hai, N.; Rosen, J. Interferenceless and motionless method for recording digital holograms of coherently illuminated 3D objects by coded aperture correlation holography system. Opt. Express 2019, 27, 24324–24339.
Figure 1. (a) Plot of the radius and thickness of zones of an accurate and approximate diffractive lens. Phase image of (b) an accurate diffractive lens, (c) an approximate diffractive lens and (d) a diffractive lens within the lithography limit. PSF of (e) an accurate diffractive lens, (f) an approximate diffractive lens and (g) a diffractive lens within the lithography limit. Imaging results of a test object obtained using (h) an accurate diffractive lens, (i) an approximate diffractive lens and (j) a diffractive lens within the lithography limit.
Figure 2. Optical configuration of the Golay-type diffractive imaging system consisting of LADISA with three sub-apertures.
Figure 3. Schematic of the LRRA. The dotted box shows the NLR part of the LRRA.
Figure 4. (a) Phase image of ideal diffractive lens and its (b) PSF and (c) MTF. (d) Phase image of a sub-aperture diffractive lens and its (e) PSF and (f) MTF. (g) Phase image of LADISA with four apertures and its (h) PSF; (i) autocorrelation obtained using NLR and its (j) MTF; (k) autocorrelation obtained using LRRA and its (l) MTF.
Figure 5. (a) Test object and imaging results from (b) the ideal diffractive lens, (c) the sub-aperture diffractive lens and (d) LADISA. Reconstruction results of LADISA using (e) NLR and (f) LRRA. The magnified versions of the images in (a–f) are shown in (g–l), respectively.
Figure 6. (a) Phase image of LADISA with three sub-apertures, (b) the PSF and (c) object intensity distribution of the test object. Reconstruction results of LADISA using (d) NLR and (e) LRRA. Image of (f) PSF and (g) reconstruction result from LRRA after post-processing by converting the asymmetric distributions into symmetric ones.
Figure 7. Simulated intensity distribution for an object consisting of only two points using (a) the ideal diffractive lens and (b) LADISA with four sub-apertures. Reconstruction results of (b) using (c) NLR and (d) LRRA. (e) Plots of the line data extracted from (b–d).
Figure 8. (a) Image of the design of LADISA with three apertures, (b) optical microscopy image and (c) SEM image of a section of one of the sub-apertures of LADISA. (d) Image of the design of LADISA with four apertures and (e) optical microscopy image of a section of one of the sub-apertures of LADISA.
Figure 9. Photograph of the experimental setup: (1) LED, (2) iris, (3) LED power source, (4) lens L1 (f = 50 mm), (5) test object, (6) lens L2 (f = 80 mm), (7) LADISA, (8) image sensor.
Figure 10. Images of the recorded object intensity IO of the test object, the synthesized PSF and the reconstruction results using NLR and LRRA. (a,b) correspond to the three- and four-aperture LADISA, respectively.