Article

Inner FoV Stitching of Spaceborne TDI CCD Images Based on Sensor Geometry and Projection Plane in Object Space

1 Satellite Surveying and Mapping Application Center, NASG, Beijing 101300, China
2 Key Laboratory of Satellite Mapping Technology and Application, NASG, Beijing 101300, China
3 State Key Laboratory of Information Engineering, Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430072, China
4 Jiangsu Province Surveying and Mapping Engineering Institute, Nanjing 210013, China
* Author to whom correspondence should be addressed.
Remote Sens. 2014, 6(7), 6386-6406; https://doi.org/10.3390/rs6076386
Submission received: 28 February 2014 / Revised: 24 June 2014 / Accepted: 25 June 2014 / Published: 8 July 2014
(This article belongs to the Special Issue Satellite Mapping Technology and Application)

Abstract

High-quality inner FoV (Field of View) stitching is currently a prerequisite for photogrammetric processing and application of the image data acquired by spaceborne TDI CCD cameras. After reviewing the technical development of this issue, we present an inner FoV stitching method based on sensor geometry and a projection plane in object space, in which the geometric sensor model of the spaceborne TDI CCD images is used to establish the image point correspondence between the stitched image and the TDI CCD images, using an object-space projection plane as the intermediary. In this study, first, the rigorous geometric sensor model of the TDI CCD images is constructed. Second, the principle and implementation of the stitching method are described. Third, panchromatic high-resolution (HR) images of the ZY-1 02C satellite and triple linear-array images of the ZY-3 satellite are utilized to validate the correctness and feasibility of the method. Fourth, the stitching precision and geometric quality of the generated stitched images are evaluated. All the stitched images reached sub-pixel stitching precision, and their geometric models can be constructed with no loss in geometric precision. The experimental results demonstrate that the method produces small image distortion when on-orbit geometric calibration of the satellite sensors is available. Overall, the new method provides a novel solution for inner FoV stitching of spaceborne TDI CCD images, in which all the sub-images are projected to object space based on the sensor geometry, performing an indirect image geometric rectification along and across the target trajectory. The method has been successfully applied in the daily processing systems of the ZY-1 02C and ZY-3 satellites.


1. Introduction

Time Delay and Integration (TDI) CCD is a new type of photosensitive device featuring high sensitivity and a large signal-to-noise ratio, and it has been widely adopted in the design of small-aperture, lightweight, high-resolution spaceborne optical cameras [1]. Currently, the imaging systems of many high-resolution satellites (e.g., IKONOS, QuickBird, Worldview, CBERS-02B, ZY-1 02C, ZY-3, Pleiades) use TDI CCD chips as the sensor elements [1–13]. Although each single TDI CCD sensor has a small physical array structure, it can be treated as a linear-array CCD from the perspective of sensor geometry [3,6,8,12], judging from its linear-array output for each imaged scanline.
A comparatively wide image swath is obtained by enlarging the field of view of a spaceborne camera working in the linear-array pushbroom imaging mode. Since the number of CCD detectors on each single sensor line is technically restricted by the manufacturers, three or more sensor lines are connected with each other to constitute a large field of view (FoV) of a camera. However, it would be unwise to directly place multiple sensor lines end-to-end on the focal plane to form a complete sensor line, due to the restrictions of installation precision, swath width in the cross-track direction, and the physical structure of the outer cover of each sensor line, especially given the small array structure of each TDI CCD sensor [1–13].
Currently, two designs are often adopted by spaceborne TDI CCD cameras to obtain a wide FoV. The first is to place the multiple sensor lines on the focal plane in a non-collinear way, as illustrated in Figure 1a–d. In this design, three or more sensor lines are assembled into upper and lower rows, staggered by several detectors in the linear-array direction, to guarantee a certain overlap between the two sub-images captured by adjacent sensor lines [1–3,9–13]; for example, the TDI CCD cameras onboard QuickBird, IKONOS, Worldview-2, CBERS-02B, ZY-1 02C, etc. are of this type. The second is to make the multiple sensor lines approximately collinear, staggered by several detectors with each other in the field of the camera, using the beam-splitting principle of prisms, as in the TDI CCD cameras onboard the Pleiades and ZY-3 satellites [4–8] (Figure 1e–f).
Available results indicate that, for TDI CCD cameras, several factors, such as the position of the sensor lines on the focal plane, lens distortion of the optical system, drift angle deviation, satellite platform jitter, integration time variation of the scanlines, and large geographic relief, can complicate the image overlap and misplacement of adjacent sensor lines. In other words, the TDI CCD images, i.e., the sub-images captured individually by the multiple sensor lines as the sensor platform moves forward along the orbit, cannot be mosaicked into a seamless image scene by simple panning in the row and column directions, especially when the image misplacement of adjacent sensor lines is relatively large [9,12,13]. In addition, the accuracy of transforming the TDI CCD images from the rigorous geometric sensor model to the rational function model (RFM) is affected by these factors [14–16]. Inner FoV stitching [12,15], which aims at transforming the sub-images into a seamless image scene covering the total FoV of the camera and possessing sound geometric properties, therefore has to be performed efficiently by the image providers who supply users with high-quality image products for photogrammetric processing and application. France, the United States, and several other countries have studied this problem intensively for their own high-resolution commercial satellites equipped with TDI CCD cameras; however, the inner FoV stitching tasks are primarily accomplished by the image and service providers, and little information about the stitching models and algorithms is available for reference [2,6]. In China, attention has been given in recent years to the inner FoV stitching of spaceborne TDI CCD sensors, such as the high-resolution TDI CCD cameras onboard CBERS-02B, ZY-1 02C, and ZY-3 [8,9,12,13,15–19]. It is therefore important to study the related theories and methods so that a high stitching precision can be achieved in the ground application systems of these satellites.
In a previous work, we analyzed the geometric characteristics of the CBERS-02B HR camera based on the inner correlation of the three acquired sub-images, and summarized two major technical routes, i.e., the image-space-oriented route and the object-space-oriented route [12], for solving the problem of inner FoV stitching of non-collinear TDI CCD images. In fact, both routes are applicable to spaceborne TDI CCD images, whether “non-collinear” or “collinear”, since the focal plane layout of multiple sensor lines in the collinear style can be regarded as a special case of the non-collinear style with respect to the imaging characteristics.
The image-space-oriented route aims to establish a stitching model from the intrinsic characteristics of the TDI CCD images, i.e., the image overlap and misplacement of adjacent sensor lines. For example, Li et al. [17] proposed a stitching method for CBERS-02B HR image data by extracting tie points from adjacent sub-images; Lu [18] studied inner FoV stitching of spaceborne TDI CCD cameras by image matching and correction. Currently, a shift transformation model, piecewise affine transformation model, or piecewise polynomial transformation model is often established and used for image correction, based on tie points of adjacent sub-images extracted with image matching operators [12,17–19]. In general, the image-space-oriented route to inner FoV stitching comprises three steps: matching the tie points, establishing the image-space transformation models, and registering the sub-images. Because it is entirely independent of the geometric sensor model and lacks a strict theoretical foundation, this route is inadequate for high-precision image geometric processing and application.
In contrast, the object-space-oriented route aims to establish the correspondence between the pixels of the stitched image and the pixels of the TDI CCD images from the perspective of sensor geometry, with the object space as an intermediary [8,9,12,15,16]; in other words, it associates the stitched image with the ground surface, for example by assuming that the stitched image is observed by a virtual CCD line on the focal plane [12,15,16], and meanwhile associates the ground surface with the TDI CCD images through the corresponding geometric sensor model. The object-space-oriented route is therefore stricter in theory and more desirable for high-precision image processing and application. Recently, Zhang et al. and Pan et al. advanced the virtual CCD line based stitching method and applied it successfully to the image data of the PRISM scanner onboard the ALOS satellite and the TDI CCD cameras onboard the ZY-3 satellite [15,16]. Another approach to associating the stitched image with the ground surface is to project the TDI CCD images directly onto a reference plane in object space [9,12]. However, for a long period, no reliable and mature solution was available to guarantee the consistency in geometric positioning accuracy of adjacent sensor lines, which is a prerequisite for the desired stitching precision, owing to the limited conditions for geometric calibration of high-resolution spaceborne optical sensors. As a result, the correctness of this approach could be proven only with simulated non-collinear TDI CCD images under the ideal condition that no error exists in the auxiliary data. At present, it is still not widely used for the image preprocessing of spaceborne TDI CCD cameras in China.
In this context, we present a novel inner FoV stitching method based on sensor geometry and a projection plane in object space, in which the geometric sensor model of the spaceborne TDI CCD images is used to associate the stitched image with the TDI CCD images, using the object-space projection plane as the intermediary. In the following, the geometric sensor model of the original images is established first; the principle, error sources, and workflow of the method are then described in detail; next, triple linear-array images of the ZY-3 satellite and panchromatic HR images of the ZY-1 02C satellite are utilized to validate the correctness and feasibility of the method; finally, the stitching precision and geometric quality of the generated stitched images are assessed, including a comparison between the method of this paper and the virtual CCD line based method, another mainstream object-space-oriented method. In general, this method provides a new solution for inner FoV stitching of spaceborne TDI CCD images. The key of the method is to project the original image to object space based on the sensor geometry, performing an indirect image geometric rectification along and across the direction of the target trajectory. It can be applied to spaceborne TDI CCD images whether or not the focal plane layout of the multiple sensor lines is collinear.

2. Geometric Sensor Model of Spaceborne TDI CCD Images

The ZY-3 satellite carries three high-resolution panchromatic TDI CCD cameras pointing in the forward, nadir, and backward directions. The forward and backward cameras are each formed by four CCDs, and the nadir camera by three CCDs (Figure 1f). By adopting an optical mosaic method with a half-transparent, half-reflecting mirror, the multiple CCDs are directly mosaicked into a line with a precision of up to 0.3 pixel in the forward, backward, and nadir cameras, according to the technical data provided by the satellite designer [8,20].
The ZY-1 02C satellite carries two identical high-resolution panchromatic TDI CCD cameras (HR1 and HR2) with the same focal plane assembly as the CBERS-02B HR camera. As illustrated in Figure 1d, o-xy represents the focal plane coordinate system; the three CCDs are assembled into two parallel rows on the focal plane, with an overall bias placement in the field; CCD1 and CCD3 are aligned with each other, while CCD2 is placed in the other row; the interval between the two parallel rows is about 26 mm, and there is an overlap of about 0.3 mm between adjacent sensor lines.
The geometric characteristics of panchromatic high-resolution TDI CCD cameras onboard ZY-3 and ZY-1 02C are listed in Table 1.
To meet the requirements of high-precision geometric processing and applications of high-resolution satellites, it is a key technical issue to establish a rigorous geometric sensor model for the raw image products [8,10–12,20–24]. In terms of imaging mechanism, the sensor geometry of each TDI CCD sensor is similar to that of a single-line (linear-array) CCD. In other words, it complies with strict central perspective projection in the linear-array direction, while in the orbital pushbroom direction it can be approximated as parallel projection [20–24].
For each scanline of a sub-image, the position and orientation parameters at the exposure moment can be interpolated from the provided auxiliary data of time, ephemeris, and attitude [8,12,20–23]. In addition, since GPS measures the position of the GPS phase center, and the attitude sensors measure either the orientation between the satellite body-fixed coordinate system and the orbit coordinate system, or the orientation from the star sensor to the ground in the J2000 inertial frame, the data measured by GPS and the star sensor need to be transformed to the position and orientation of the camera in order to acquire the camera position and the attitude of the main optical axis [8,11,25,26]. If we ignore the offset vector of the GPS phase center in the satellite body-fixed coordinate system, the lever arm between the camera frame and the satellite platform, as well as the influence of atmospheric refraction, the rigorous geometric sensor model of a spaceborne TDI CCD image can be expressed as Equation (1) or (2):
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} = \begin{bmatrix} X_S \\ Y_S \\ Z_S \end{bmatrix}_{WGS84} + m\, R_{J2000}^{WGS84}\, R_{star}^{J2000} \left( R_{star}^{body} \right)^{T} R_{camera}^{body} \begin{pmatrix} \tan\psi_y \\ \tan\psi_x \\ -1 \end{pmatrix} \tag{1}$$

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} = \begin{bmatrix} X_S \\ Y_S \\ Z_S \end{bmatrix}_{WGS84} + m\, R_{J2000}^{WGS84}\, R_{orbit}^{J2000}\, R_{body}^{orbit}\, R_{camera}^{body} \begin{pmatrix} \tan\psi_y \\ \tan\psi_x \\ -1 \end{pmatrix} \tag{2}$$
where m is the scale factor at the exposure moment; $(X, Y, Z)_{WGS84}^{T}$ are the three-dimensional (3-D) Cartesian coordinates of the ground point in the WGS84 frame; $(X_S, Y_S, Z_S)_{WGS84}^{T}$ are the 3-D Cartesian coordinates of the satellite position in the WGS84 frame; $R_{camera}^{body}$ is the bore-sight angle matrix, i.e., the rotation matrix between the camera coordinates and the satellite body-fixed coordinates; $R_{star}^{body}$ is the rotation matrix between the star sensor and the satellite body-fixed coordinates; $R_{star}^{J2000}$ is the rotation matrix from the star sensor to the ground in the J2000 inertial frame; $R_{J2000}^{WGS84}$ is the transformation matrix between the J2000 inertial coordinates and the WGS84 coordinates; $R_{body}^{orbit}$ is the rotation matrix between the satellite body-fixed coordinate system and the orbit coordinate system; $R_{orbit}^{J2000}$ is the rotation matrix between the orbit coordinate system and the J2000 inertial coordinate system; and $(\psi_x, \psi_y)$ are the pointing angles of a certain CCD detector, which represent the geometric relation of each detector ray in the camera coordinate system [8,12,25–27].
In practice, given the auxiliary data provided, the rigorous geometric sensor model in the form of Equation (1) is most often used for the ZY-3 satellite [8,16,25,26], while Equation (2) suits the ZY-1 02C satellite [10–12,26]. As shown in Figure 2a, oc-XcYcZc represents the camera coordinate system; o-xy represents the focal plane coordinate system; o is the nominal principal point; $\vec{u}$ represents the ray vector of a certain detector; and $\psi_x$, $\psi_y$ denote the pointing angles of the detector across and along the sensor pushbroom direction, respectively, so that $\vec{u} = (\tan\psi_y, \tan\psi_x, -1)^T$. According to the installation position of each detector, the two pointing angles are computed in the camera coordinate system under central projection geometry as
$$\tan\psi_y = \frac{x - \Delta x}{f}, \qquad \tan\psi_x = \frac{y - \Delta y}{f} \tag{3}$$
where x, y are the nominal focal plane coordinates of the detector; f is the calibrated focal length; and (Δx, Δy) denote the geometric distortion of the detector.
As previous studies indicate, it is acceptable to replace the on-orbit calibration of the interior orientation parameters (IOP) by directly modeling the pointing angles [11,12,25,26]. For each calibrated sensor line, the pointing angles of a detector can be expressed as high-order polynomial functions, as represented below:
$$\begin{cases} \psi_x(s_j) = a_{0j} + a_{1j} s_j + a_{2j} s_j^2 + a_{3j} s_j^3 \\ \psi_y(s_j) = b_{0j} + b_{1j} s_j + b_{2j} s_j^2 + b_{3j} s_j^3 \end{cases} \tag{4}$$
where j denotes the index of the sensor line; sj is the index of the detectors on the corresponding sensor line; and a0j, a1j, a2j, a3j, b0j, b1j, b2j, b3j are the third-order polynomial coefficients.
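To make the chain from detector index to ground coordinates concrete, the following minimal sketch evaluates the pointing-angle polynomials of Equation (4) and intersects the detector ray of Equation (1) with a sphere of radius R_E + h. The spherical-Earth intersection (in place of the WGS84 ellipsoid), the pre-composed rotation matrix R_total, and all variable names are simplifying assumptions of this sketch, not details given in the paper.

```python
import numpy as np

R_E = 6378137.0  # WGS84 semi-major axis (m); a spherical Earth is assumed here

def pointing_angles(s, a, b):
    """Equation (4): third-order polynomials of the detector index s
    for one calibrated sensor line (a, b are its coefficient arrays)."""
    psi_x = a[0] + a[1]*s + a[2]*s**2 + a[3]*s**3
    psi_y = b[0] + b[1]*s + b[2]*s**2 + b[3]*s**3
    return psi_x, psi_y

def ground_point(sat_pos, R_total, psi_x, psi_y, h=0.0):
    """Equation (1): ground = sat + m * R_total @ u, with the scale factor m
    solved so that the ground point lies on a sphere of radius R_E + h."""
    u = R_total @ np.array([np.tan(psi_y), np.tan(psi_x), -1.0])
    # |sat + m*u|^2 = (R_E + h)^2 is a quadratic in m
    a = u @ u
    b = 2.0 * (sat_pos @ u)
    c = sat_pos @ sat_pos - (R_E + h)**2
    m = (-b - np.sqrt(b*b - 4.0*a*c)) / (2.0*a)  # nearer of the two intersections
    return sat_pos + m * u
```

In practice, R_total would be the product of the rotation matrices in Equation (1), interpolated for the exposure moment of the scanline in question.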
Figure 2b shows the imaging characteristics of a spaceborne TDI CCD camera. Since the focal plane layout of multiple sensor lines in the collinear style can be treated as a special case of the non-collinear style, and since high-precision inner FoV stitching is more complicated when the multiple sensor lines are in the typical non-collinear style [12,13], we use the case of three non-collinear TDI CCDs as a typical instance for clearer illustration. The sub-images share the same parameters for geometric sensor modeling, including time, orbit, attitude, bias angle, offset angle of the camera, etc., except for the polynomial functions that express the pointing angles of the CCD detectors [8,12,25,26], since they are collected synchronously in dynamic pushbroom imaging mode by the sensor lines on the focal plane while the satellite moves along the orbital direction.
An original image is formed by directly aligning and combining the sub-images according to the imaging time, i.e., the index of the image scanlines, and the overlap and misplacement of adjacent sub-images are associated with the imaging parameters [12,13]. For instance, from the focal plane design parameters of the ZY-1 02C HR cameras, it can be estimated that there will be an overlap of about 30 pixels between adjacent sub-images in the direction of the scanlines; in addition, CCD1 and CCD3 image the same ground target with a delay of about 2600 scanlines relative to CCD2 (see the check below). In fact, the overlap and misplacement of adjacent sub-images vary and can be somewhat complex [12,13]. However, the ground cover of the original image, i.e., the area of projection shown in Figure 2b, is always seamless and continuous.
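As a quick plausibility check, both estimates follow directly from the focal plane parameters quoted above (0.3 mm overlap, 26 mm row interval, 10 μm detectors), assuming the nominal pushbroom condition of one scanline per detector-sized ground step:

```python
pixel_size = 10e-6                     # detector size of the HR camera (m)
overlap = 0.3e-3 / pixel_size          # 0.3 mm overlap -> 30.0 pixels
delay = 26e-3 / pixel_size             # 26 mm row interval -> 2600.0 scanlines
print(overlap, delay)                  # 30.0 2600.0
```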

3. Inner FoV Stitching of TDI CCD Images Based on Sensor Geometry and Projection Plane in Object Space

3.1. Principle

We propose here an inner FoV stitching method for spaceborne TDI CCD images based on sensor geometry and a projection plane in object space. Suppose that the ground surface is approximated by an object-space projection plane at the mean elevation, and that the ground cover of a stitched image is denoted as the area of interest (AOI). If we segment the AOI into regular grid units along and across the direction of the target trajectory, as shown in Figure 3a, and associate the grid units with the pixels of the stitched image one to one, the stitched image can be generated in a way similar to an indirect image geometric rectification; in other words, a back-projection calculation based on the geometric sensor model determines the pixel position of the original image corresponding to a certain grid unit. Briefly, the basic principle of the method is to project the original image to object space based on the sensor geometry. As a result, the geometric model of the stitched image can also be derived from the imaging geometry of the TDI CCD images.
Theoretically speaking, this method and the virtual CCD line based method (Figure 3b) have similar characteristics, since both are object-space-oriented and take advantage of the geometric sensor model of the TDI CCD images to associate the ground surface with the TDI CCD images. The main difference lies in the manner of associating the stitched image with the ground surface: our method utilizes just the projection plane in object space instead of a virtual CCD line on the focal plane, which is more straightforward as a processing strategy. As illustrated in Figure 3b, supposing the stitched image is observed by a virtual CCD line placed on the focal plane and perpendicular to the sensor pushbroom direction, each scanline of the stitched image would correspond to the imaging area of the ground surface within a designated integration time period; here, S denotes the projection center at the exposure moment.

3.2. Analysis of Error Sources

Accordingly, two factors shall be considered regarding the error sources of such a method: one is the inconsistent geometric accuracy of adjacent sub-images; the other is the height difference between the actual ground surface and the projection plane.
Here, we again take the case of typical non-collinear TDI CCDs as an instance, for more intuitive illustration.

3.2.1. Inconsistent Geometric Accuracy of Adjacent Sub-Images

The first factor is associated with the sensor geometric constraints between adjacent sub-images, which guarantee a strict and high-precision inner FoV stitching process. As illustrated in Figure 4a, suppose P is a ground point located in the overlapping area of CCD1 and CCD2. If we perform a back-projection calculation based on the geometric sensor model (the inverse form of Equation (1) or (2)) to determine its positions p1 and p2 on the sub-images of CCD1 and CCD2, respectively, then p1 and p2 will be conjugate image points on the condition that the sensor geometry accuracy of the CCD1 and CCD2 images is consistent.
In practice, various observation errors in the auxiliary data incur inconsistent geometric precision between adjacent sub-images. By experience, these errors can be compensated to a large extent through periodic on-orbit geometric calibration of the cameras [10–12,25–31].

3.2.2. Height Difference between the Actual Ground Surface and the Projection Plane

The second factor, i.e., the influence of geographic relief relative to the mean elevation, also introduces stitching error along the orbital pushbroom direction [12,13,15,16]. In Figure 4b, supposing φy1 and φy2 represent the nominal pointing angles of CCD1 and CCD2 along the sensor pushbroom direction, respectively, and P is a ground point located in the overlapping area of CCD1 and CCD2, Equation (5) can be established as:
$$\Delta q = \left| \Delta h \times \left( \tan\varphi_{y2} - \tan\varphi_{y1} \right) \right| \tag{5}$$
where Δh denotes the height difference between P and the mean elevation, Δq represents the projection distance in object space corresponding to the stitching misplacement in image space that is caused by Δh.
For the HR camera of ZY-1 02C, according to the focal plane assembly (Figure 1d) and the sensor details (Table 1), it can be derived that Δq ≈ Δh × (tan 1.115° − tan 0.7°); when Δh reaches 300 m, Δq will reach 2.36 m, in other words, one pixel of stitching error arises. Therefore, for plain and hilly sites where the maximum height difference against the mean elevation is less than 300 m, the stitching error will in theory be at the sub-pixel level and negligible in most cases. To minimize the processing cost of inner FoV stitching, it is reasonable to approximate the ground surface by the mean elevation plane in object space.
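The arithmetic can be checked in a few lines; the angle values are the nominal ones quoted above, and the computed offset comes out slightly below the rounded figure in the text:

```python
import math

def stitching_offset(dh, phi1_deg=0.7, phi2_deg=1.115):
    """Equation (5) with the nominal ZY-1 02C HR pointing angles."""
    return abs(dh * (math.tan(math.radians(phi2_deg))
                     - math.tan(math.radians(phi1_deg))))

print(stitching_offset(300.0))  # ~2.17 m, i.e., roughly one pixel at the 2.36 m GSD
```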
Apparently, under the same terrain conditions, the stitching error is proportional to the difference between the tangents of the adjacent pointing angles; Δq will be very small in any terrain when this difference approaches zero. Therefore, in the design of spaceborne TDI CCD cameras, it is preferable to decrease the interval between adjacent sensor lines along the sensor pushbroom direction, to minimize the overall impact of geographic relief on the stitching precision.

3.3. Workflow

As illustrated in Figure 5, the inner FoV stitching method based on sensor geometry and projection plane in object space is performed via the following steps.
To simplify the description, the coordinate transformations between object space and image space are denoted by Equations (6) and (7), where f1 and f2 denote the direct and inverse forms of the rigorous geometric sensor model of a spaceborne TDI CCD image, respectively; (so, lo)T are the pixel coordinates of the original image; and hei is the altitude above the ellipsoid in the WGS84 frame.
$$(s_o, l_o, hei)^T \xrightarrow{f_1} (X, Y, Z)_{WGS84}^T \tag{6}$$

$$(X, Y, Z)_{WGS84}^T \xrightarrow{f_2} (s_o, l_o)^T \tag{7}$$
In addition, the transformations between different coordinate systems in object space are denoted by Equations (8)–(11); f3 and f4 represent the transformation functions from geodetic coordinates to 3-D Cartesian coordinates in the WGS84 frame and vice versa, respectively; f5 and f6 represent the transformation functions from geodetic coordinates to Universal Transverse Mercator (UTM) coordinates in the WGS84 frame and vice versa, respectively.
$$(Lat, Lon, hei)_{WGS84}^T \xrightarrow{f_3} (X, Y, Z)_{WGS84}^T \tag{8}$$

$$(X, Y, Z)_{WGS84}^T \xrightarrow{f_4} (Lat, Lon, hei)_{WGS84}^T \tag{9}$$

$$(Lat, Lon)_{WGS84}^T \xrightarrow{f_5} (X_{utm}, Y_{utm})_{WGS84}^T \tag{10}$$

$$(X_{utm}, Y_{utm})_{WGS84}^T \xrightarrow{f_6} (Lat, Lon)_{WGS84}^T \tag{11}$$
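As an illustration of f3–f6, the sketch below builds them with pyproj; this library choice and the UTM zone are our assumptions, since the paper does not name an implementation. EPSG:4979 is WGS84 geodetic (lat/lon/height), EPSG:4978 is WGS84 geocentric Cartesian, and EPSG:32649 is UTM zone 49N, which covers Henan Province.

```python
from pyproj import Transformer

# f3/f4: geodetic <-> 3-D Cartesian coordinates in the WGS84 frame
f3 = Transformer.from_crs("EPSG:4979", "EPSG:4978", always_xy=True)
f4 = Transformer.from_crs("EPSG:4978", "EPSG:4979", always_xy=True)
# f5/f6: geodetic <-> UTM coordinates (zone 49N chosen as an example)
f5 = Transformer.from_crs("EPSG:4326", "EPSG:32649", always_xy=True)
f6 = Transformer.from_crs("EPSG:32649", "EPSG:4326", always_xy=True)

X, Y, Z = f3.transform(113.0, 34.5, 340.0)  # lon, lat, hei near Dengfeng
E, N = f5.transform(113.0, 34.5)            # UTM easting/northing
```

With these transformers, f1 followed by f4 and f5 projects an original-image pixel onto the UTM projection plane, as required in Section 3.3.1.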

3.3.1. Deciding the Coverage of a Stitched Image on Projection Plane

(1) Define the object-space projection plane in the UTM coordinate system at the mean elevation H (an altitude relative to the WGS84 ellipsoid). As illustrated in Figure 6, O-XutmYutm represents the UTM coordinate system, a left-handed system whose Xutm axis coincides with the projection line of the central meridian and whose Yutm axis coincides with the projection line of the equator, pointing due east;
(2) Choose the pixels at the two ends of the first column of the original image and determine the positions of their ground points on the projection plane by Equations (6), (9) and (10); the line connecting the two projected points then defines the direction of the target trajectory on the projection plane;
(3) Suppose O-X′utmY′utm is a left-handed coordinate system sharing the same origin as O-XutmYutm, with its Y′utm axis pointing in the direction of the target trajectory. If (Xutm, Yutm)T are the UTM coordinates of a certain point on the projection plane, the corresponding coordinates in O-X′utmY′utm are obtained by Equation (12), where R is the 2-D rotation matrix derived from the direction of the target trajectory:

$$\begin{pmatrix} X'_{utm} \\ Y'_{utm} \end{pmatrix} = R \begin{pmatrix} X_{utm} \\ Y_{utm} \end{pmatrix} \tag{12}$$
(4) Find the coverage of the original image, i.e., the area of projection (AOP), by Equations (6), (9) and (10); the ground cover of the stitched image, i.e., the AOI, is then the minimum rectangular region enveloping the AOP, with its boundaries parallel and perpendicular to the direction of the target trajectory.

3.3.2. Establishing the Relation between Pixels of a Stitched Image and Grid Units of AOI

(1) Segment the AOI into regular grid units with a uniform size close to the ground sample distance (GSD) of the original image, and associate the grid units with the pixels of the stitched image one by one; that is, the numbers of image rows and columns equal the numbers of grid units along and across the direction of the target trajectory, respectively;
(2) Suppose (s1, l1)T are the pixel coordinates of the stitched image, k is the size of each grid unit, i.e., the scaling parameter, O′ is the grid unit at the upper-left corner of the AOI, and (dX, dY)T are the translation parameters, equal to the coordinates of O′ in O-X′utmY′utm; then

$$\begin{pmatrix} X'_{utm} \\ Y'_{utm} \end{pmatrix} = \begin{pmatrix} s_1 \times k \\ l_1 \times k \end{pmatrix} + \begin{pmatrix} dX \\ dY \end{pmatrix} \tag{13}$$
Thus, the coordinate transformation between the pixels of the stitched image and the grid units of the AOI is established by Equation (14). Obviously, the scaling, translation, and rotation here are together equivalent to a 2-D affine transformation.
$$\begin{pmatrix} X_{utm} \\ Y_{utm} \end{pmatrix} = R^{-1} \left( \begin{pmatrix} s_1 \times k \\ l_1 \times k \end{pmatrix} + \begin{pmatrix} dX \\ dY \end{pmatrix} \right) \tag{14}$$
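A compact sketch of Equations (12)–(14) follows; the rotation angle of the target-trajectory direction, the grid size k, and the offsets (dX, dY) are assumed to have been derived as in Section 3.3.1, and all names here are illustrative.

```python
import numpy as np

def make_mapping(theta, k, dX, dY):
    """theta: rotation angle of the target-trajectory frame; k: grid size (m)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])   # 2-D rotation of Eq. (12)
    t = np.array([dX, dY])

    def pixel_to_utm(s1, l1):
        """Equation (14): stitched-image pixel -> UTM ground coordinates."""
        return R.T @ (np.array([s1 * k, l1 * k]) + t)  # R^-1 = R^T for rotations

    def utm_to_pixel(Xutm, Yutm):
        """Inverse of Eq. (14), i.e., Eqs. (12) and (13) applied forward."""
        return (R @ np.array([Xutm, Yutm]) - t) / k

    return pixel_to_utm, utm_to_pixel
```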

3.3.3. Performing Image Resampling to Generate a Stitched Image

For a certain grid unit with ground coordinates (Xutm, Yutm)T, the corresponding pixel position on the original image can be calculated from Equations (7), (8) and (11); thus, the relation between the pixels of the stitched image and the pixels of the TDI CCD images is established by taking the grid units of the AOI as a bridge. Thereafter, the original image is resampled and the stitched image is generated in the manner of an indirect image geometric rectification.
Note that when the geographic relief is large enough to affect the sub-pixel stitching precision, as analyzed in Section 3.2.2, additional geometric rectification should be applied through an image-space transformation model to the affected local parts of the stitched image [12,19].
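A minimal sketch of this resampling step is shown below; pixel_to_utm is the mapping from the previous listing, and back_project is a placeholder wrapping Equations (7), (8) and (11) (UTM grid unit to original-image pixel), so both names are assumptions rather than the paper's interfaces.

```python
import numpy as np

def stitch(original, rows, cols, pixel_to_utm, back_project):
    """Generate the stitched image by indirect geometric rectification."""
    stitched = np.zeros((rows, cols), dtype=np.float64)
    for l1 in range(rows):
        for s1 in range(cols):
            Xutm, Yutm = pixel_to_utm(s1, l1)    # grid unit of this pixel
            so, lo = back_project(Xutm, Yutm)    # fractional original position
            i, j = int(lo), int(so)
            if 0 <= i < original.shape[0] - 1 and 0 <= j < original.shape[1] - 1:
                dy, dx = lo - i, so - j          # bilinear resampling weights
                stitched[l1, s1] = ((1-dy)*(1-dx)*original[i, j]
                                    + (1-dy)*dx*original[i, j+1]
                                    + dy*(1-dx)*original[i+1, j]
                                    + dy*dx*original[i+1, j+1])
    return stitched
```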

3.3.4. Establishing the Geometric Model of a Stitched Image

(1) Let (s1, l1)T be the pixel coordinates of the stitched image; the ground coordinates (Xutm, Yutm)T of the corresponding grid unit are determined by Equation (14);
(2) Perform a back-projection calculation with Equations (7), (8) and (11) to find its position on the original image, with pixel coordinates (so, lo)T;
(3) For (so, lo)T, when the altitude is assigned as hei, the ground point coordinates (lat, lon, hei)T are determined by Equations (6) and (9);
(4) Set up virtual control points in object space using the strict geometric model, and calculate the RFM coefficients by least-squares adjustment [14,27,32]; hence, the RFM in the form of (s1, l1)T = F′(lat, lon, hei)T is established, where F′ represents the RFM-based backward coordinate transformation from the geodetic coordinates in object space to the pixel coordinates of the stitched image.
Thus, the strict geometric model of the stitched image in the form of (lat, lon)T = F(s1, l1, hei)T is established, where F represents the transformation from the pixel coordinates of the stitched image to the geodetic coordinates of the corresponding ground point in object space.

4. Experiment Data and Result Analysis

4.1. Data Description

To validate the correctness, feasibility, and advantages of the proposed stitching method, we carried out experiments on triple linear-array images of the ZY-3 satellite and HR images of the ZY-1 02C satellite. Two scenes captured by the ZY-3 triple linear-array panchromatic camera were chosen, covering the Dengfeng and Anyang districts of Henan Province, each with different geographical conditions, and two corresponding scenes captured by the ZY-1 02C HR cameras were also selected. In addition, a digital aerial ortho-map (DOM) and a digital elevation model (DEM) were used as reference data to automatically extract the reference points used to evaluate the geometric quality of the stitched images. Table 2 lists the details of the experimental data.

4.2. Experimental Results and Analysis

4.2.1. Evaluation of the Stitching Precision

Figure 7a gives an overview of the original images; for image data 1 and 2, only the nadir image is shown as the representative. Since the panchromatic high-resolution cameras onboard the ZY-3 satellite arrange the multiple sensor lines in an approximately collinear style, the image misplacement of adjacent sensor lines is so small that the boundaries of adjacent sub-images are barely visible. In comparison, for image data 3 and 4, captured by the HR cameras of the ZY-1 02C satellite, the misplacement of adjacent sub-images is obviously larger. For these original images, there is an overlap of about 20–30 pixels between adjacent sub-images in the direction of the scanlines.
Figure 7b gives an overview of the stitched images generated using the method based on sensor geometry and a projection plane in object space.
To evaluate the stitching precision, a visual check was carried out on the generated stitched images. A direct approach is to observe the continuity of ground features (roads, buildings, etc.) in the local images of the stitching regions, as shown in Figure 7c. If any discontinuity is observed, the local images can be displayed in a magnified view and the discontinuity measured quantitatively. The results show that sub-pixel stitching precision is reached with a seamless visual effect, which proves the correctness and feasibility of the method.

4.2.2. Geometric Quality Assessment of the Stitched Images

To further validate the advantages of our method, the geometric quality of the stitched images is assessed in this section from two aspects: the fitting precision of the RFM with respect to the strict geometric model, and the internal geometric accuracy of each stitched image.
As mentioned in Section 3.3.4, the RFM parameters of a stitched image are calculated in a “terrain independent” manner [14,27,32]: first, the image space is divided into regular grid units of 256 × 256 pixels. Then, supposing there are five elevation planes in object space, with equal height differences between adjacent planes related to the minimum and maximum elevations of the site, the image grid points are projected onto the five elevation planes to establish virtual control points, based on the strict geometric model. Next, the third-order RFM coefficients are solved by least-squares adjustment. Finally, virtual check points, staggered by half a grid interval in image space, are built up to analyze the discrepancies of the RFM with respect to the strict geometric model, i.e., the fitting precision of the RFM.
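The sketch below condenses this “terrain independent” procedure. For brevity, it fits a first-order rational function per image coordinate rather than the third-order RFM of the paper, omits the variable normalization used in practice, and forward_model stands in for the strict geometric model of the stitched image (pixel plus height to lat/lon), so all of these are assumptions of the sketch.

```python
import numpy as np

def virtual_control_points(forward_model, rows, cols, h_min, h_max,
                           grid=256, n_planes=5):
    """Project image grid points onto n_planes elevation planes."""
    pts = []
    for hei in np.linspace(h_min, h_max, n_planes):
        for l1 in range(0, rows, grid):
            for s1 in range(0, cols, grid):
                lat, lon = forward_model(s1, l1, hei)
                pts.append((s1, l1, lat, lon, hei))
    return np.array(pts)

def fit_rfm_first_order(img, lat, lon, hei):
    """Solve img = (a0 + a.x) / (1 + b.x) as a direct linear least squares."""
    A = np.column_stack([np.ones_like(lat), lat, lon, hei,
                         -img*lat, -img*lon, -img*hei])
    coef, *_ = np.linalg.lstsq(A, img, rcond=None)
    return coef  # a0, a1, a2, a3, b1, b2, b3
```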
The precision was satisfactory for each image dataset. Taking the Dengfeng image data as an example, the fitting precision of the RFM reached the 0.01 pixel level or better (Table 3). Therefore, for the generated stitched images, the RFM can be adopted as a good replacement for the strict geometric model. To a large extent, this indicates that the strict geometric model of the stitched image has sound properties, that the RFM parameters to be solved are chosen properly, and that the simulated virtual control points have a desirable spatial structure, which is related to the density of the image grid points, the levels of the elevation planes, etc.
In addition, the geometric accuracy of each stitched image is assessed by the following steps:
First, a number of evenly distributed reference points are extracted from the stitched image by automatic image matching against the high-resolution reference data.
Second, all the reference points are used as check points to evaluate the geometric accuracy of the stitched image: RFM-based back-projection is performed to determine the pixel coordinates of the reference points, the differences from their measured pixel coordinates are computed, and the image location errors are analyzed statistically.
Third, error compensation of the RFM is performed based on control points, adopting the image-space affine transformation model [33] to compensate the systematic error of the RFM; the image location errors of the stitched image are then analyzed statistically again. Here, about half of the reference points are randomly chosen as control points and the other half serve as check points, both sets being almost evenly distributed.
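A sketch of this compensation step follows; it fits a six-parameter affine model to the RFM residuals at the control points and removes the fitted bias at the check points. The affine model is the one of [33]; the array layout and function names are our assumptions.

```python
import numpy as np

def fit_affine_bias(ctrl_xy, ctrl_res):
    """ctrl_xy: (n, 2) pixel coords; ctrl_res: (n, 2) RFM-minus-measured residuals."""
    A = np.column_stack([np.ones(len(ctrl_xy)), ctrl_xy[:, 0], ctrl_xy[:, 1]])
    cs, *_ = np.linalg.lstsq(A, ctrl_res[:, 0], rcond=None)  # row bias model
    cl, *_ = np.linalg.lstsq(A, ctrl_res[:, 1], rcond=None)  # column bias model
    return cs, cl

def compensated_stddev(chk_xy, chk_res, cs, cl):
    """Standard deviation of check-point errors after removing the affine bias."""
    A = np.column_stack([np.ones(len(chk_xy)), chk_xy[:, 0], chk_xy[:, 1]])
    res = chk_res - np.column_stack([A @ cs, A @ cl])
    return res.std(axis=0)  # per-axis, as reported in Table 4
```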
The standard deviations of the image location errors before and after RFM compensation are listed in Table 4. For ZY-3, the results are at most 2 pixels without RFM compensation, while for ZY-1 02C they are at most 4 pixels. In addition, after RFM compensation, the standard deviation shows no significant change in most cases. Since the standard deviation largely reflects the internal geometric accuracy of the image, the results demonstrate that the stitched images preserve good geometric quality with relatively small internal distortion. In our experiments, the calibrated results of the cameras were available and applied to the rigorous geometric sensor model, which not only helps to achieve substantial consistency in geometric accuracy among adjacent sub-images, but also guarantees good image geometric quality, since the systematic geometric distortion is largely eliminated during the stitching process.
Table 4 also compares the geometric quality of the stitched images generated by the virtual CCD line based method with that of our method, using the same image datasets listed in Table 2. The two object-space-oriented methods achieve the same level of stitching quality, and both are valuable and useful in practice; our proposed method is, to some extent, more straightforward in associating the stitched image with the ground surface.

5. Conclusions

High-quality inner FoV (Field of View) stitching is currently a prerequisite for photogrammetric processing and application of the image data acquired by spaceborne TDI CCD cameras. In this paper, we presented a novel inner FoV stitching method based on sensor geometry and a projection plane in object space, in which all the TDI CCD images are projected to object space based on the sensor geometry, in the manner of an indirect image geometric rectification along and across the direction of the target trajectory. Experiments with image data of the triple linear-array camera onboard the ZY-3 satellite and the HR cameras onboard the ZY-1 02C satellite were performed to prove the correctness, feasibility, and advantages of the method. The stitching precision and geometric quality of the generated stitched images were objectively assessed, and a comparison was made between the method of this paper and the virtual CCD line based method, another mainstream object-space-oriented method. The experimental results show that: (1) sub-pixel stitching precision can be reached with a seamless visual effect; (2) with regard to the geometric model of a stitched image, the RFM (rational function model) can be adopted as a good replacement for the strict geometric model because a high fitting precision is achieved; (3) the method has great potential to eliminate systematic geometric distortion during the stitching process, so that the stitched images preserve good geometric quality with relatively small internal distortion; (4) compared with the virtual CCD line based method, our method not only achieves the same level of stitching quality, but is also more straightforward in its processing strategy, since it directly utilizes a projection plane in object space instead of a virtual CCD line on the focal plane. Overall, the method provides a new solution for high-precision inner FoV stitching of spaceborne TDI CCD images. It has already been applied in the daily processing systems of the ZY-1 02C and ZY-3 satellites and is promising in practice.

Acknowledgments

The authors thank the editors and the reviewers for their constructive and helpful comments for substantial improvement of this paper. This research is financially supported by the National Basic Research Program of China (The 973 Program) (No. 2014CB744201, 2012CB719901, and 2012CB719902); the National High Technology Research and Development Program of China (No. 2011AA120203); the Natural Science Foundation of China (Nos. 41271394, 41371430, and 40901209); the State Science and Technology Support Program (Nos. 2011BAB01B02, 2011BAB01B05, and 2012BAH28B04-04); and a Foundation for the Author of National Excellent Doctoral Dissertation of PR China (FANEDD, No. 201249).

Author Contributions

The first author conceived the study and designed the experiments; the second author developed technical flow of the method and wrote the program; the third author helped optimize the program and perform the experiments; the fourth author helped perform the experiments; the fifth author helped perform the analysis with constructive discussions; the sixth author contributed to manuscript preparation.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, B.X. Characteristics and main specifications of IKONOS and QuickBird2 satellite camera—Some points for developing such like satellite camera. Spacecr. Recover. Remote Sens 2002, 23, 14–16. [Google Scholar]
  2. Updike, T.; Comp, C. Radiometric Use of WorldView-2 Imagery, Technical Note (2010), DigitalGlobe, 1601 Dry Creek Drive Suite 260 Longmont, CO, USA, 80503. Available online: http://www.digitalglobe.com/resources/technical-information/ (accessed on 28 February 2014).
  3. Jacobsen, K. Calibration of Optical Satellite Sensors. Available online: http://www.isprs.org/proceedings/2006/euroCOW06/euroCOW06_files/papers/CalSatJac_Jacobsen.pdf (accessed on 28 February 2014).
  4. Materne, A.; Bardoux, A.; Geoffray, H.; Tournier, T.; Kubik, P.; Morris, D. Backthinned TDI CCD image sensor design and performance for the Pleiades high resolution earth observation satellites. Proceedings of the 6th International Conference on Space Optics, Noordwijk, The Netherlands, 27–30 June 2006.
  5. Feng, Z.K.; Shi, D.; Chen, W.X.; Luo, A.R. The progress of French remote sensing satellite—From SPOT toward Pleiades. Remote Sens. Inf 2007, 4, 89–92. [Google Scholar]
  6. Lussy, F.D.; Philippe, K.; Daniel, G.; Véronique, P. Pleiades-HR Image System Products and Quality Pleiades-HR Image System Products and Geometric Accuracy. Available Online: http://www.isprs.org/proceedings/2005/hannover05/paper/075-delussy.pdf (accessed on 28 February 2014).
  7. Li, D.R. China’s first civilian three-line-array stereo mapping satellite ZY-3. Acta Geod. Cartogr. Sin 2012, 41, 317–322. [Google Scholar]
  8. Tang, X.M.; Zhang, G.; Zhu, X.Y.; Pan, H.B.; Jiang, Y.H.; Zhou, P.; Wang, X. Triple linear-array imaging geometry model of ZIYuan-3 surveying satellite and its validation. Int. J. Image Data Fusion 2013, 41, 33–51. [Google Scholar]
  9. Hu, F.; Wang, M.; Jin, S.Y. An algorithm for mosaicking non-collinear TDI CCD images based on reference plane in object-space. Proceedings of the ISDE6, Beijing, China, 9 September 2009.
  10. Zhang, S.J.; Jin, S.Y. In-orbit calibration method based on empirical model for non-collinear TDI CCD camera. Int. J. Comput. Sci. Issues 2013, 10, 1694–0814. [Google Scholar]
  11. Yang, B.; Wang, M. On-orbit geometric calibration method of ZY1–02C panchromatic camera. J. Remote Sens 2013, 17, 1175–1190. [Google Scholar]
  12. Hu, F. Research on Inner FOV Stitching Theories and Algorithms for Sub-Images of Three Non-collinear TDI CCD Chips.
  13. Long, X.X.; Wang, X.Y.; Zhong, H.M. Analysis of image quality and processing method of a space-borne focal plane view splicing TDI CCD camera. Sci. China Inf. Sci 2011, 41, 19–31. [Google Scholar]
  14. Yang, X.H. Accuracy of rational function approximation in photogrammetry. Int. Arch. Photogramm. Remote Sens 2000, 33, 146–156. [Google Scholar]
  15. Zhang, G.; Liu, B.; Jiang, W.S. Inner FOV stitching algorithm of spaceborne optical sensor based on the virtual CCD line. J. Image Graphics 2012, 17, 696–701. [Google Scholar]
  16. Pan, H.B.; Zhang, G.; Tang, X.M.; Zhou, P.; Jiang, Y.H.; Zhu, X.Y.; Jiang, W.S.; Xu, M.Z.; Li, D.R. The geometrical model of sensor corrected products for ZY-3 satellite. Acta Geod. Cartogr. Sin 2013, 42, 516–522. [Google Scholar]
  17. Li, S.W.; Liu, T.J.; Wang, H.Q. Image mosaic for TDI CCD push-broom camera image based on image matching. Remote Sens. Technol. Appl 2009, 24, 374–378. [Google Scholar]
  18. Lu, J.B. Automatic Mosaic Method of Large Field View and Multi-Channel Remote Sensing Images of TDICCD Cameras.
  19. Meng, W.C.; Zhu, S.L.; Zhu, B.S.; Bian, S.J. The research of TDI-CCDs imagery stitching using information mending algorithm. Proc. SPIE 2013, 89081C. [Google Scholar] [CrossRef]
  20. Poli, D.; Toutin, T. Review of developments in geometric modeling for high resolution satellite pushbroom sensors. Photogramm. Record Spec. Issue 2012, 27, 58–73. [Google Scholar]
  21. Toutin, T. Review article: Geometric processing of remote sensing images: Models, algorithms and methods. Int. J. Remote Sens 2004, 25, 1893–924. [Google Scholar]
  22. Wolf, P.R.; Dewitt, B.A. Principles of Photography and Imaging (Chapter2). In Elements of Photogrammetry with Applications in GIS, 3rd ed.; McGraw-Hill: Toronto, ON, Canada, 2000; p. 608. [Google Scholar]
  23. Weser, T.; Rottensteiner, F.; Willneff, J.; Fraser, C. A generic pushbroom sensor model for high-resolution satellite imagery applied to SPOT5, Quickbird and ALOS data sets. Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Hannover, Germany, 29 May–1 June 2007.
  24. Yamakawa, T.; Fraser, C.S. The Affine projection model for sensor orientation: Experiences with high-resolution satellite imagery. Int. Arch. Photogramm. Remote Sens 2004, 35, 142–147. [Google Scholar]
  25. Li, D.R.; Wang, M. On-orbit geometric calibration and accuracy assessment of ZY-3. Spacecr. Recover. Remote Sens 2012, 33, 1–6. [Google Scholar]
  26. Wang, M.; Yang, B.; Hu, F.; Zang, Z. On-orbit geometric calibration model and its applications for high-resolution optical satellite imagery. Remote Sens. 2014, 6, 4391–4408. [Google Scholar]
  27. Zhang, G. Rectification for High Resolution Remote Sensing Image Under Lack of Ground Control Points.
  28. Zhang, C.S.; Fraser, C.S.; Liu, S.J. Interior orientation error modeling and correction for precise georeferencing of satellite imagery. Proceedings of the 2012 XXII ISPRS Congress International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Melbourne, VIC, Australia, 25 August–1 September 2012.
  29. Mulawa, D. On-orbit geometric calibration of the orbview-3 high resolution imaging satellite. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci 2004, 35, 1–6. [Google Scholar]
  30. Kocaman, S.; Gruen, A. Orientation and self-calibration of ALOS PRISM imagery. Photogramm. Rec 2008, 23, 323–340. [Google Scholar]
  31. Delussy, F.; Greslou, D.; Dechoz, C.; Amberg, V.; Delvit, J.; Lebegue, L.; Blanchet, G.; Fourest, S. PLEIADES HR in-flight geometric calibration: Location and mapping of the focal plane. Proceedings of the 2012 XXII ISPRS Congress International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Melbourne, VIC, Australia, 25 August–1 September 2012.
  32. Tao, C.V.; Hu, Y. A comprehensive study of the rational function model for photogrammetric processing. Photogramm. Eng. Remote Sens 2001, 67, 1347–1357. [Google Scholar]
  33. Fraser, C.S.; Hanley, H.B. Bias-compensated RFMs for sensor orientation of high resolution satellite imagery. Photogramm. Eng. Remote Sens 2005, 71, 909–915. [Google Scholar]
Figure 1. Layout of the multiple sensor lines in the field of TDI CCD cameras. (a) QuickBird panchromatic and multi-spectral [3]; (b) IKONOS panchromatic and multi-spectral [3]; (c) Worldview-2 panchromatic and multi-spectral [2]; (d) CBERS-02B HR [9,10,12]; (e) Pleiades panchromatic and multi-spectral [4]; (f) ZY-3 triple linear-array [8].
Figure 2. Sensor geometry for the TDI CCD images. (a) The geometric relation of each detector ray in the camera coordinates system; (b) The imaging characteristics; S denotes the projection center at the exposure moment.
Figure 3. Two ways of establishing pixel corresponding relations between the stitched image and the original image. (a) Our method; (b) the virtual CCD line based method.
Figure 4. Analysis of the error sources: (a) sensor geometric constraints between adjacent sub-images; S(p1) and S(p2) denote the satellite positions at the exposure moments of p1 and p2, respectively; (b) influence of terrain undulation.
Figure 5. Workflow of the stitching method based on sensor geometry and projection plane in object space.
Figure 6. Relation between the pixels of a stitched image and grid units of AOI.
Figure 7. Visual check of the stitching results. (a) Overview of the original images; (b) Overview of the stitched images; (c) Local images of the stitching regions.
Table 1. Geometric parameters of high-resolution panchromatic TDI CCD cameras onboard ZY-3 and ZY-1 02C satellites.

| Camera | ZY-3 Triple Linear-Array | ZY-1 02C HR1/HR2 |
| --- | --- | --- |
| Pixel Size (μm) | Nadir: 7; Forward: 10; Backward: 10 | 10 |
| Focal Length (mm) | 1700 | 3300 |
| Ground Sample Distance (m) | Nadir: 2.1; Forward: 3.5; Backward: 3.5 | 2.36 |
| No. of CCD Detectors | Nadir: 8192 × 3; Forward: 4096 × 4; Backward: 4096 × 4 | 4096 × 3 |
| Swath Width (km) | Nadir: 51; Forward: 52; Backward: 52 | 27 |
Table 2. Specific information about the experimental data.

| | Image Data 1 | Image Data 2 | Image Data 3 | Image Data 4 |
| --- | --- | --- | --- | --- |
| Sensor Name | ZY-3 triple linear-array | ZY-3 triple linear-array | ZY-1 02C HR1 | ZY-1 02C HR2 |
| Image Size (Pixels) | Nadir: 24,576 × 24,576 (8192 × 3); Forward: 16,384 × 16,384 (4096 × 4); Backward: 16,384 × 16,384 (4096 × 4) | Nadir: 24,576 × 24,576 (8192 × 3); Forward: 16,384 × 16,384 (4096 × 4); Backward: 16,384 × 16,384 (4096 × 4) | 17,575 × 12,288 (4096 × 3) | 17,575 × 12,288 (4096 × 3) |
| Location | Dengfeng, Henan, China | Anyang, Henan, China | Dengfeng, Henan, China | Anyang, Henan, China |
| Range | 51 km × 51 km | 51 km × 51 km | 27 km × 27 km | 27 km × 27 km |
| Acquisition Date | 23 March 2012 | 4 June 2013 | 7 April 2013 | 15 March 2013 |
| Terrain Type | Mountainous and hilly (mean altitude: 340 m; max. altitude: 1450 m) | Plain | Mountainous and hilly (mean altitude: 340 m; max. altitude: 1450 m) | Plain |
| Reference DOM | GSD: 1 m; planimetric accuracy (RMSE): 1 m | GSD: 1 m; planimetric accuracy (RMSE): 0.5 m | GSD: 1 m; planimetric accuracy (RMSE): 1 m | GSD: 1 m; planimetric accuracy (RMSE): 0.5 m |
| Reference DEM | GSD: 5 m; height accuracy (RMSE): 2 m | GSD: 2 m; height accuracy (RMSE): 0.5 m | GSD: 5 m; height accuracy (RMSE): 2 m | GSD: 2 m; height accuracy (RMSE): 0.5 m |
Table 3. The fitting precision of the RFM compared with the strict geometric model (in pixels).

| | Sensor | Row Max. Error | Row StdDev. Error | Column Max. Error | Column StdDev. Error |
| --- | --- | --- | --- | --- | --- |
| Image Data 1 | ZY-3 Forward | 0.000311 | 0.000081 | 0.010009 | 0.001687 |
| | ZY-3 Nadir | 0.000274 | 0.000128 | 0.000306 | 0.000151 |
| | ZY-3 Backward | 0.000323 | 0.000081 | 0.006011 | 0.001076 |
| Image Data 3 | ZY-1 02C HR1 | 0.000255 | 0.000114 | 0.000285 | 0.000134 |
Table 4. Standard deviations of image location errors before and after RFM compensation (in pixels).

Geometric accuracy of stitched images generated by our proposed method:

| Image Data | Sensor Name | No. of Control Points for RFM Error Compensation | No. of Check Points | StdDev. Error, Cross-Track (Row) | StdDev. Error, Along-Track (Column) |
| --- | --- | --- | --- | --- | --- |
| 1 | ZY-3 Forward | 0 | 165 | 0.54 | 1.87 |
| | | 85 | 80 | 0.53 | 0.48 |
| | ZY-3 Nadir | 0 | 157 | 1.37 | 1.19 |
| | | 74 | 83 | 1.39 | 0.68 |
| | ZY-3 Backward | 0 | 170 | 0.49 | 0.46 |
| | | 87 | 83 | 0.47 | 0.45 |
| 2 | ZY-3 Forward | 0 | 110 | 0.63 | 0.86 |
| | | 50 | 55 | 0.61 | 0.86 |
| | ZY-3 Nadir | 0 | 83 | 1.54 | 1.40 |
| | | 44 | 39 | 0.60 | 1.02 |
| | ZY-3 Backward | 0 | 56 | 0.55 | 0.78 |
| | | 32 | 24 | 0.52 | 0.80 |
| 3 | ZY-1 02C HR1 | 0 | 209 | 3.75 | 2.17 |
| | | 102 | 107 | 3.71 | 2.09 |
| 4 | ZY-1 02C HR2 | 0 | 52 | 1.46 | 2.56 |
| | | 27 | 25 | 0.85 | 0.78 |

Geometric accuracy of stitched images generated by the virtual CCD line based method:

| Image Data | Sensor Name | No. of Control Points for RFM Error Compensation | No. of Check Points | StdDev. Error, Cross-Track (Row) | StdDev. Error, Along-Track (Column) |
| --- | --- | --- | --- | --- | --- |
| 1 | ZY-3 Forward | 0 | 146 | 0.56 | 0.50 |
| | | 70 | 76 | 0.55 | 0.50 |
| | ZY-3 Nadir | 0 | 172 | 0.66 | 0.63 |
| | | 90 | 82 | 0.59 | 0.62 |
| | ZY-3 Backward | 0 | 142 | 0.49 | 0.58 |
| | | 60 | 82 | 0.48 | 0.58 |
| 2 | ZY-3 Forward | 0 | 105 | 0.63 | 0.83 |
| | | 50 | 55 | 0.63 | 0.82 |
| | ZY-3 Nadir | 0 | 89 | 0.76 | 1.07 |
| | | 40 | 49 | 0.76 | 1.06 |
| | ZY-3 Backward | 0 | 115 | 0.78 | 1.70 |
| | | 60 | 55 | 0.71 | 1.19 |
| 3 | ZY-1 02C HR1 | 0 | 197 | 3.84 | 2.02 |
| | | 90 | 107 | 3.84 | 1.99 |
| 4 | ZY-1 02C HR2 | 0 | 41 | 3.52 | 2.91 |
| | | 20 | 21 | 1.71 | 1.27 |
