Article

Influence of the Sun Position and Platform Orientation on the Quality of Imagery Obtained from Unmanned Aerial Vehicles

by Aleksandra Sekrecka *, Damian Wierzbicki and Michal Kedzierski
Department of Remote Sensing, Photogrammetry and Imagery Intelligence, Institute of Geospatial Engineering and Geodesy, Faculty of Civil Engineering and Geodesy, Military University of Technology, 01-476 Warsaw, Poland
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(6), 1040; https://doi.org/10.3390/rs12061040
Submission received: 7 February 2020 / Revised: 14 March 2020 / Accepted: 20 March 2020 / Published: 24 March 2020

Abstract:
Images acquired at a low altitude can be a source of accurate information about various environmental phenomena. Often, however, this information is distorted by various factors, so a correction of the images needs to be performed to recreate the actual reflective properties of the imaged area. Due to the low flight altitude, the correction of images from UAVs (unmanned aerial vehicles) is usually limited to the reduction of noise and detector errors. This article shows the influence of the Sun position and platform deviation angles on the quality of images obtained by UAVs. Tilting the camera placed on an unmanned platform leads to incorrect exposure of the imagery, and the direction of this distortion depends on the position of the Sun during imaging. An image can be considered in three-dimensional space, where the x and y coordinates determine the position of a pixel and the third dimension determines its exposure. This assumption is the basis for the proposed method of image exposure compensation. A three-dimensional transformation by rotation is used to determine the adjustment matrix that corrects the image quality. The adjustments depend on the angles of the platform and the difference between the direction of flight and the position of the Sun. An additional factor regulates the value of the adjustment depending on the ratio of the pitch and roll angles. The experiments were carried out for two sets of data obtained with different unmanned systems. The correction method can improve the block exposure by up to 60%. The method gives the best results for simple systems not equipped with lighting compensation.

1. Introduction

In recent years, the use of imagery from unmanned aerial vehicles (UAVs) in many diverse fields of science and technology has become very popular. Blocks of images obtained from a low altitude make it possible to generate very high-accuracy orthophotomaps in both the visible and infrared range, even for inaccessible areas. In addition, images from a low altitude can also be considered a source of accurate information about various environmental phenomena, which is why UAVs are often used to monitor agriculture [1], vegetation [2], water bodies [3], and others. In such works, the main task is to analyze images in order to detect selected phenomena based on a comparison of the spectral properties of the imaged objects. Each object has different reflective properties, which translates into different pixel values in the image. These values result from the amount of light that is reflected from the object and reaches the detector. Often, however, these values are distorted by various factors (such as sensor errors, humidity, mist, changes of sunlight, shadows). Therefore, image correction is necessary to reproduce the actual reflection coefficients of the objects.
Radiometric correction is particularly important when processing images obtained by satellite and aerial sensors, because the detector is located at the upper limit of the atmosphere, which results in a high degree of scattering of reflected light before it reaches the detector. The higher the imaging altitude, the greater the disturbance caused by the atmosphere. The information stored in the image is distorted by detector errors, light scattering in the atmosphere, the angle of sunlight and topographic conditions. Therefore, when processing this type of data, three main stages of radiometric correction can be distinguished: detector error correction, atmospheric correction, solar and topographic correction [4].
With the growing popularity of low-altitude imaging comes a need to develop correction techniques adapted to images recorded by UAVs, which is the subject of many modern studies [5]. Due to the low altitude, UAV image correction is usually limited to the reduction of noise and detector errors. The atmospheric and solar correction stage is neglected due to the short length of the atmospheric radiance path [6]. However, recent studies [7,8] have shown that atmospheric correction of such imagery cannot be completely omitted, especially for low-quality images obtained in adverse conditions, characterized by haze or blur [9,10]. For radiometric correction, it is recommended to use the BRDF (bidirectional reflectance distribution function). In photogrammetry and remote sensing, the BRDF describes, through a physical model, the incidence and reflection of a light beam on the terrain surface. The BRDF also plays an important role in the radiometric correction of satellite imagery. In the case of classic aerial photographs and images obtained from a low altitude, corrections using this function are still in the research phase. Recently, new methods for the radiometric correction of UAV images have been proposed [1,11]. In research conducted, among others, by [12,13], it has been proposed that the central radiation elements reaching the camera sensor are the surface-reflected sunlight (Lsu(λ)) and the surface diffuse radiance [12]. Another aspect of research on radiometric correction is that, very often, the main reason for the heterogeneity of UAV images is the occurrence of the “hot spot” effect: part of the image appears brighter where the viewing direction approaches the direction of the Sun [1,14]. This effect can be corrected using the BRDF, assuming that the anisotropy of the ground reflectance and the atmospheric effects can be accounted for at the same time. Very often, for images obtained from a low altitude, the reflection of sunlight from the photographed objects is non-Lambertian, which intensifies the “hot spot” effect. BRDF models can be divided into two types: physical and empirical. For low-altitude images, the empirical model, whose parameters can be calculated directly, is more practical.
After reducing the impact of the atmosphere, it is still necessary to take into account the effect of the Sun and topography on image quality. The solar and topographic correction reduces the uneven lighting of a scene, which affects the pixel values from which the reflective properties of imaged targets are then determined. It can be performed with a simple cosine model (cosine correction), which only takes into account the solar zenith angle and the distance of the Earth from the Sun. In some of its modifications, the incidence angle is also considered [15,16,17,18,19,20,21], and a C-correction coefficient is introduced to reduce excessive correction [15]. Another modification of the cosine approach is the Sun-Canopy-Sensor (SCS) correction [22], which normalizes the illuminated canopy area. This model takes the topography into account by introducing a terrain slope factor. The SCS model is recommended for forested areas; however, it may overcorrect areas facing away from the light source [18]. This effect is reduced by the SCS+C method [18], which modifies the SCS approach by adding the parameter C to the numerator and denominator. The Minnaert correction introduces a constant k, which represents a non-Lambertian surface [23]. The statistical-empirical correction uses a digital terrain model to determine the relationship between the terrain and the illumination of the object [24].
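For orientation, the standard forms of these corrections can be summarized as follows (the notation is ours, not that of the cited papers: $\rho_T$ is the observed reflectance of a pixel on tilted terrain, $\rho_H$ its horizontal-equivalent value, $\theta_z$ the solar zenith angle, $i$ the local incidence angle, $\alpha$ the terrain slope, and $C$ and $k$ are empirically fitted constants):

$$\rho_H = \rho_T\,\frac{\cos\theta_z}{\cos i}\;\;\text{(cosine)}, \qquad \rho_H = \rho_T\,\frac{\cos\theta_z + C}{\cos i + C}\;\;\text{(C-correction)},$$

$$\rho_H = \rho_T\,\frac{\cos\alpha\,\cos\theta_z}{\cos i}\;\;\text{(SCS)}, \qquad \rho_H = \rho_T\left(\frac{\cos\theta_z}{\cos i}\right)^{k}\;\;\text{(Minnaert)}.$$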
However, these models have been developed to process images obtained from a high altitude and give good results when processing such data. In such cases, a large area is imaged, so the slope of the terrain and the position of the Sun cause uneven lighting of the scene and introduce significant changes in the radiometry of the image (especially within shadows). When obtaining data from a UAV, one image represents a very small area, so the impact of the terrain and of the height of the Sun is negligible. It is reasonable to perform topographic corrections for mountainous areas, because there, even over a small area, differences in terrain height can be significant and cause uneven lighting in the image. So far, the proposed solutions for solar and topographic corrections have been based on approaches dedicated to satellite imagery [25]. Another approach suggests using tensor decomposition for solar corrections; however, this method focuses mainly on reducing cloud distortion and variable solar irradiance conditions [26]. In this manuscript, we argue that solar correction of low-altitude images requires a different approach. As mentioned, such images are less sensitive to the effects of the topography (especially when flat terrain is imaged), but at the same time they are much more sensitive to platform tilt angles than images from a satellite sensor. Therefore, in order to maintain the highest accuracy in remote sensing analyses, one should consider the impact of the platform angles and the position of the Sun. Tilting the camera placed on an unmanned platform causes errors in the exposure of the image, and the direction of this error depends on the position of the Sun during image recording. Tilting the camera causes more light to reach the section of the detector array that is farther away from the terrain, compared with the opposite section. Therefore, in this article, we examine the impact of the azimuth of the Sun and the platform tilt angles on the quality of images obtained from a low altitude, and we offer a solution to correct the changes caused by these factors.

2. Materials

Data were obtained during three photogrammetric missions carried out over flat terrain to avoid additional errors caused by the effects of the topography. Two datasets were acquired by the same sensor, and the third dataset was obtained by another sensor. The first mission was carried out in Opatów (Poland) on July 13th at 10:00 UTC and the second mission was carried out in Warsaw (Poland) on April 11th at 12:50 UTC. Both of these sets were obtained at a height of 75 m using a Trimble UX5 unmanned platform equipped with a Sony Nex 5T camera, recording imagery in the RGB layout but in the green, red and near-infrared bands. About 1500 images were obtained during the first mission and about 350 images during the second. During the third photogrammetric mission, 50 images were obtained using a Parrot Sequoia multispectral camera carried by a Parrot Disco AG unmanned system. This mission was carried out on October 17th at 8:00 UTC at a height of 100 m above flat terrain (a meadow) near Warsaw (Figure 1). The basic parameters of the sensors and flights are summarized in Table 1.
The UX5 is a fixed-wing UAV, made largely of lightweight EPP (expanded polypropylene) foam and equipped with an electric drive. Take-off is performed exclusively using a special mechanical launcher. The system can operate at wind speeds not exceeding 18 m/s and in conditions not worse than light rain. The Sony NEX-5T mounted on this platform is a compact digital camera equipped with a CMOS array with a resolution of 16.1 megapixels (maximum image resolution of 4912 × 3264 pixels). The camera is equipped with a Voigtlander lens with a fixed focal length of 15 mm and a maximum aperture of f/4.5. The NEX-5T was modified to allow imagery to be acquired in the full range of the array's sensitivity: the filter located directly in front of the camera array, whose task was to cut off the electromagnetic wave range above 690 nm, was removed. The maximum wavelength that can be registered by the sensor is about 1050 nm. Images were obtained using the NIR-enabled camera with a black IR-only longpass filter with a cut-on at 695 nm. With this filter, the imagery contains NIR only: the blue pixels (band 3) record nothing but pure NIR (roughly 800–1050 nm, shaded area in Figure 2), while the red band (band 1) in fact records the red edge, roughly 690–770 nm. All acquired images were saved with an 8-bit radiometric resolution [27,28].
The obtained series of images represented a non-urbanized area covered with arable fields. In the center of this area there was a lake, which was of great importance for this research: on some images, water covered the entire frame or a substantial portion of it. In such an instance, on an image acquired in ideal conditions, the reflection of light should be constant, so the pixel values across the whole image should be similar. In addition, many images showed a homogeneous area covered with grass or arable land. Such images were very well suited for examining whether the image was evenly exposed.
The Parrot Sequoia is a sensor dedicated to remote sensing analyses, especially for examining changes in vegetation and agriculture. The system includes four detectors recording images in the green (550 nm), red (660 nm), red edge (735 nm), and near infrared (790 nm) ranges. It has a lighting sensor and a system that compensates for variable lighting during image acquisition. The camera is equipped with a CMOS sensor array with a resolution of 16 megapixels (maximum image resolution is 4608 × 3456 pixels) [29].
All images were recorded in good weather conditions. There is no visible haze on the images, there are no cloud shadows, nor are there any significant differences in terrain height that could disturb the amount of light reaching the detector array. Both systems were equipped with a GPS/INS (global positioning system/inertial navigation system), thanks to which it is possible to perform a completely autonomous flight at a given altitude with a given longitudinal and transverse overlap of the images, and to register approximate exterior orientation elements for each image. Knowledge of these elements is key to the research described in this article. The UX5 platform was equipped with a single-frequency GPS receiver recording data at a frequency of 10 Hz. Position accuracy was ±3 m in both the horizontal and vertical planes. The accuracy of angle determination was 0.5° for the pitch and roll angles and 1.0° for the yaw angle. The GPS/INS system carried with the Parrot Sequoia was characterized by a similar accuracy.

3. Methods

Methods for performing solar and topographic corrections mainly refer to the angle of incidence of sunlight and terrain slope. These parameters are sufficient for the correction of satellite imagery, however, to process images from UAVs, we suggest introducing additional corrections due to the platform tilt angles and its position relative to the Sun. When the sensor is tilted, the light reaches the detectors in the array in unequal quantities. The detectors further away from the terrain are more exposed (Figure 3).
Exposure differences are seen in the form of asymmetrical brightness of the picture. The nadir image should be evenly or symmetrically (if vignetting occurs) illuminated. Figure 4 shows examples of three photos from the same mission with different platform angles. In each case, the brightest point is not in the center of the image, and this shift depends mainly on the pitch and roll angles.
For comparative analyses on the same series of images (relative analysis between images), a pixel correction based on the general Equation (1) can be performed, where the new pixel value is the sum of the original value and the adjustment:
$$L_{new}(i,j) = L(i,j) + v_L(i,j) \tag{1}$$
where $L(i,j)$ and $L_{new}(i,j)$ are, respectively, the original and corrected values of the pixel with coordinates (i,j), and $v_L(i,j)$ is the adjustment determined for that pixel.
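As a minimal illustration (a sketch under our assumptions, not the authors' implementation), Equation (1) reduces to a per-pixel addition; `image` and `v_L` are assumed to be NumPy arrays of the same shape:

```python
import numpy as np

def apply_exposure_adjustment(image: np.ndarray, v_L: np.ndarray) -> np.ndarray:
    """Equation (1): add the per-pixel adjustment v_L to the original values L.

    The result is clipped to the 8-bit range in which both datasets were saved.
    """
    corrected = image.astype(np.float64) + v_L
    return np.clip(corrected, 0, 255).astype(np.uint8)
```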
To determine the adjustments for the pixel values, a matrix showing differences in image exposure was generated. For this purpose, a three-dimensional transformation based on the rotation angles of the system was used. It was assumed that the image is a three-dimensional system in which the x axis is aligned with the direction of flight of the platform, the y axis is perpendicular to it, running transverse to the direction of flight, and the z axis is perpendicular to the XY plane. The center of this system is at the center of the image. Each pixel can therefore be described with three coordinates, x, y, z, which determine the distance of the pixel from the center of the system. Figure 5 shows the adopted coordinate system:
Each pixel in the image (with an x, y location) corresponds to a certain value of the amount of light registered by the dedicated array element. The amount of light reaching the array depends on the reflective properties of the imaged object, but can be further increased or decreased due to uneven lighting of the array. We can assume that the z axis presents the additional amount of light reaching the array. The additional amount of light in this article is understood as the amount of light that deviates from the amount of light reaching each detector in the event of even lighting of the array. Considering the case of an image acquired in ideal conditions and an ideal nadir, the value of each pixel is 0 (no pixel is additionally exposed or underexposed).
In practice, it is not possible to obtain perfectly nadir images from a UAV. Firstly, a sudden gust of wind is enough to make the platform swing. Secondly, fixed-wing UAVs are very often used in large-scale measurement campaigns, and in such constructions the cameras are usually devoid of gyroscopic stabilization and are installed rigidly in the UAV platform. Thanks to INS systems it is possible to record approximate platform tilt angles (yaw, pitch, roll). If the image is recorded at an angle, it is unevenly exposed, and the values on the z axis are no longer equal to 0. To determine them, the actual position of the image during recording must be calculated. This can be done by means of a simple transformation of the 3D system based on the rotation matrix $A_{\omega\varphi\kappa}$, whose elements are calculated according to Equation (2):
$$A_{\omega\varphi\kappa} = \begin{bmatrix} \cos\varphi\cos\kappa + \sin\varphi\sin\omega\sin\kappa & -\cos\varphi\sin\kappa + \sin\varphi\sin\omega\cos\kappa & \sin\varphi\cos\omega \\ -\sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa & \sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa & \cos\varphi\cos\omega \\ \cos\omega\sin\kappa & \cos\omega\cos\kappa & -\sin\omega \end{bmatrix} \tag{2}$$
The angles ω, φ, κ are the rotation angles of the system known from the INS measurement. The angle ω is the angle of rotation around the x axis and for a UAV it is designated as roll. The angle φ is the angle of rotation around the y axis and for a UAV it is designated as pitch. The angle κ is the angle of rotation around the z axis; for UAVs it is designated as yaw and it determines the azimuth of the platform at the time of exposure. Sometimes the direction of the platform compass and the yaw angle differ, and then the yaw angle must be modified by this difference. In our research the difference was 0, so the yaw angle can be taken as the azimuth of the platform. In this study, however, the difference between the azimuth of the Sun and the yaw angle is more important (designated here as the angle κ), because it dictates the direction of changes in the exposure of the image.
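A sketch of how Equation (2) can be assembled is given below; κ is taken as the difference between the Sun azimuth and yaw, as described above. The composition order and sign convention are our assumptions, since conventions for the ω-φ-κ matrix vary between photogrammetric texts:

```python
import numpy as np

def rotation_matrix(roll_deg: float, pitch_deg: float,
                    yaw_deg: float, sun_azimuth_deg: float) -> np.ndarray:
    """Rotation matrix A of Equation (2).

    omega = roll (rotation about x), phi = pitch (rotation about y),
    kappa = Sun azimuth minus yaw (rotation about z), per the text.
    """
    w = np.radians(roll_deg)                    # omega
    p = np.radians(pitch_deg)                   # phi
    k = np.radians(sun_azimuth_deg - yaw_deg)   # kappa

    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(w), -np.sin(w)],
                   [0.0, np.sin(w),  np.cos(w)]])
    Ry = np.array([[ np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    Rz = np.array([[np.cos(k), -np.sin(k), 0.0],
                   [np.sin(k),  np.cos(k), 0.0],
                   [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz   # one possible omega-phi-kappa composition
```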
Based on the determined rotation parameters, new pixel coordinates of the image were calculated, corresponding to the image acquisition conditions. For platform inclination angles other than 0, the z coordinate values are also different from 0 and correspond to the vL adjustments. An adjustment matrix was generated for each image and is shown as an image in which the pixel values indicate the size of the adjustment (Figure 6).
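Under the same assumptions, the adjustment matrix can be generated by rotating the pixel grid and keeping the resulting z coordinates, along the lines of this sketch:

```python
import numpy as np

def adjustment_z(width: int, height: int, A: np.ndarray) -> np.ndarray:
    """z coordinates of the rotated pixel grid (the exposure-imbalance image).

    The system of Figure 5 is assumed: origin at the image center, x along
    the flight direction, y across it, z = 0 for every pixel before rotation.
    """
    x = np.arange(width) - (width - 1) / 2.0
    y = np.arange(height) - (height - 1) / 2.0
    X, Y = np.meshgrid(x, y)                 # shape (height, width)
    # For points (x, y, 0), the rotated z comes from the third row of A.
    return A[2, 0] * X + A[2, 1] * Y
```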
Figure 6 shows the adjustments in binary form, where the line along which the adjustments change from negative to positive is clearly visible (referred to in this article as the adjustment axis). It is an axis perpendicular to the platform tilt axis at the moment of image acquisition. The adjustment image in greyscale, in turn, shows the irregularity of the lighting of the detector array. In this case, the azimuth of the Sun (As) was 162° and the yaw angle was 359°. Dark pixels in the adjustment image (negative values) mean that the image was overexposed there, and conversely, bright pixels (positive values) mean that the image was underexposed there. These results are consistent with the azimuth of the Sun: on the side illuminated by the Sun, the pixel values should be corrected by subtracting the value corresponding to the excess light.
The direction of uneven lighting is mainly influenced by the difference between the azimuth of the Sun and the yaw angle, but this direction also changes depending on the pitch and roll angles, and the amount of exposure correction depends on the values of these angles. The z values are determined in the XYZ image system, so they depend on the image size and cannot be directly translated into the amount of excess or deficiency of light. To correctly determine the value of the vL adjustment, the z coordinate should be multiplied by a factor k calculated for each image (Equation (3)).
$$v_L(i,j) = z(i,j) \cdot k \tag{3}$$
where $z(i,j)$ is the z coordinate value for the pixel with coordinates (i,j) in the original coordinate system of the image, and k is a coefficient determining the weight of the adjustment. The coefficient k describes the relationship between the unevenness of the illumination along the x and y axes. If the image is evenly illuminated in both directions, then k = 0 and the image pixels do not require any modification; in this case, the azimuth of the inflection line in the vL adjustment image is a multiple of 45°.
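Putting Equations (1)–(3) together, a hypothetical end-to-end correction of a single image could look as follows (the angle values are illustrative, `k` is a placeholder for the value derived in the next subsection, and the functions are those from the sketches above):

```python
import numpy as np

# Dummy 8-bit image standing in for a real frame (Sony Nex 5T resolution).
image = np.full((3264, 4912), 128, dtype=np.uint8)

# Illustrative platform angles and Sun azimuth (not taken from the datasets).
A = rotation_matrix(roll_deg=-2.9, pitch_deg=3.8, yaw_deg=6.8,
                    sun_azimuth_deg=162.6)
z = adjustment_z(width=image.shape[1], height=image.shape[0], A=A)

k = 0.2                   # placeholder weight; Equation (4) gives the real value
v_L = z * k               # Equation (3)
corrected = apply_exposure_adjustment(image, v_L)   # Equation (1)
```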

The k Coefficient

The effect of the difference between the azimuth of the Sun and the yaw angle on the image exposure was examined, assuming the same pitch and roll values. No distortion occurs only when all angles are zero; if any of the angles differs from 0, uneven illumination of the array can be expected. As Table 2 shows, the direction of the uneven illumination of the array depends on the difference between the azimuth of the Sun and the yaw angle. If the difference is an odd multiple of 45°, the adjustments are distributed evenly and symmetrically along the x or y axis. However, if the difference is 0° or a multiple of 90°, the adjustment axis is inclined to the edge of the image at an angle of 45° and symmetrical adjustments are located in opposite corners of the image (Table 2).
The above considered the case where the pitch and roll angles are equal. In reality, however, this is very rare. Table 3 presents a study of changes in the inclination angle of the adjustment axis depending on the pitch/roll ratio.
Based on the above analysis, knowing the yaw angle and the azimuth of the Sun, one can deduce the slope of the adjustment axis under the assumption that the pitch/roll ratio equals 1 (the theoretical slope). Then, by comparing the actual slope of the adjustment axis in the processed image with the theoretical one, the value of the k coefficient can be determined (Equation (4)).
$$k = \frac{|\alpha_t| - |\alpha|}{|\alpha_t|} \tag{4}$$
where $\alpha_t$ is the theoretical slope and $\alpha$ is the actual slope of the adjustment axis, calculated according to Equation (5):
$$\alpha = \arctan\left(\frac{x_P}{y_P}\right) \tag{5}$$
where $x_P$, $y_P$ are the coordinates of the point P in the original XYZ system of the image (before the 3D transformation). Point P is the intersection of the adjustment axis with the edge of the image, as shown in Figure 7:
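A direct transcription of Equations (4) and (5) follows (the handling of the arctangent quadrant is not specified in the text, so the naive form is used here):

```python
import numpy as np

def k_coefficient(alpha_theoretical_deg: float, x_P: float, y_P: float) -> float:
    """Equations (4) and (5): weight of the exposure adjustment.

    alpha_theoretical_deg is the adjustment-axis slope expected for a
    pitch/roll ratio of 1; (x_P, y_P) is point P of Figure 7, where the
    actual adjustment axis crosses the image edge.
    """
    alpha_actual = np.degrees(np.arctan(x_P / y_P))   # Equation (5)
    return (abs(alpha_theoretical_deg) - abs(alpha_actual)) / abs(alpha_theoretical_deg)
```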

4. Results

The method was tested for image blocks acquired by two different UAV systems. Section 4.1 shows the results of testing for the UX5 system, and Section 4.2 contains the results for the Parrot Sequoia system.

4.1. UX5 System (Sony Nex 5T Camera)

Figure 8 shows the results for several different cases for dataset 1. Case 1: yaw = 6.75°, pitch = 3.77°, roll = −2.90°. Case 2: yaw = 358.17°, pitch = −0.22°, roll = −2.97°. Case 3: yaw = 358.96°, pitch = 12.20°, roll = −1.21°. Case 4: yaw = 173.39°, pitch = 8.60°, roll = 2.61°. The images come from one photogrammetric mission and were all obtained at the same time, i.e., on July 13 at 11:03, when the azimuth of the Sun (As) was 162.58°.
After applying the proposed exposure adjustment, the image brightness has a symmetrical distribution. The magnitude of the changes depends on the value of the vL adjustments. The changes can be seen most clearly in the image depicting the water. Most often, however, these are very small changes, difficult to notice with the naked eye, but of great importance when performing analyses of the amount of light reflected from imaged objects. Therefore, image profiles (Figure 9) showing the distribution of pixel values were used for more detailed analyses. Thanks to the image profiles, it is possible to examine the uniformity of the illumination of the image at the time of exposure, along the x axis, i.e., along the flight path of the platform (longitudinal sections), and along the y axis, perpendicular to the direction of flight (cross sections). Images showing a homogeneous area were used for the analyses. In case 1, the image for the most part depicted a lake. Water is a smooth surface with low light scattering, which is why, once vignetting is taken into account, the profile has a parabolic shape. The most light reaches the central part of the array, where the influence of the tilt angles is negligible. The image of the water's surface, due to its homogeneity, is a good example for analyzing the impact of the UAV's tilt angles on the uniformity of image exposure. In the ideal case (without platform tilt), the profiles should have a parabolic shape. The tilt of the platform means that more light reaches the section of the array which is further away from the terrain, making the image brighter in this area and the graph skewed.
The method used makes it possible to adjust the image exposure. In the case where water was being imaged, the profiles of the adjusted images have a parabolic shape. The more varied the structure of the imaged area, the more the profiles resemble a straight line, which under the influence of uneven lighting can be tilted at a certain angle. The adjustment minimizes this angle. The pixel values' rate of change grows with the distance from the center of the image. In extreme cases (like case 1), the values changed by up to 20, which constituted over 20% of the original value. The smaller the pitch and roll angle ratio, the smaller the changes; for some images they may be only 1%–2% in the regions of greatest change.
The research was verified by testing the proposed method on another dataset obtained by the UX5 with a Sony Nex 5T camera in similar conditions. Figure 10 shows the results for several different cases. Case 1: yaw = 273.50°, pitch = 5.33°, roll = 2.27°. Case 2: yaw = 277.55°, pitch = 12.51°, roll = −3.42°. Case 3: yaw = 89.32°, pitch = 5.16°, roll = −4.62°. Case 4: yaw = 269.68°, pitch = 7.75°, roll = 1.23°. The azimuth of the Sun (As) was 223.76°.
Three images show the meadow area covered with grass. One image shows a forested area. In all cases the vignetting effect is noticeable, but before correction the most exposed place is not always in the center of the image. The edges are not symmetrically exposed. After correction, the brightness of the images has changed and has a symmetrical distribution. To better visualize the changes, the image cross-sections and longitudinal section are shown (Figure 11).
The results are similar to those for dataset 1. The sections confirm the improvement of image quality and the equalization of exposure. The shape of the graphs changes significantly and they become symmetrical, which means that the edges of the images are equally illuminated. The slope of the graphs changes by a maximum of about 30°, and this magnitude depends on the angles of the platform, the Sun azimuth, and the variation in land cover in the photo.

4.2. Parrot Sequoia

Similar tests were performed for the dataset obtained by the Parrot Sequoia (PS). The results for several different cases are presented below. Case 1: yaw = 143.65°, pitch = 4.60°, roll = 2.17°, As = 127.90°. Case 2: yaw = 72.99°, pitch = 5.14°, roll = −1.63°, As = 130.15°. Case 3: yaw = −67.50°, pitch = 5.58°, roll = 0.74°, As = 134.05°. Case 4: yaw = 144.23°, pitch = 5.55°, roll = −0.80°, As = 127.90°. According to the manufacturer's estimates, the pitch and roll angles were obtained with a precision of 0.5° and the yaw angle with a precision of 1.0°. The images come from one photogrammetric mission carried out on October 17 around 8:00 UTC, when the azimuth of the Sun was close to 130° (Figure 12).
For a series of images acquired by Parrot Sequoia, an exposure correction was also performed in accordance with the proposed approach. The calculated vL adjustments caused a symmetrical lightening and darkening of the image with an intensity depending on the parameters of the camera position during the exposure. The figures show four examples that represent the entire data sample. As can be seen in this series of measurements, in many cases the total influence of factors is not as strong as in the case of measurements from the UX5. Thanks to systems which compensate for solar variability, the images acquired by the Sequoia sensor are already pre-corrected. Therefore, the profiles (Figure 13) do not show such a clear slope as in the previous examples. When processing images which had already been subjected to earlier lighting compensation, it may be necessary to determine an additional parameter scaling the k factor. This parameter should have a unique value for each camera, as it would depend on the compensation systems used. Such a parameter should be determined empirically, based on a series of acquired images, taking into account different pitch/roll ratios.
Figure 13 shows examples of images and their graphs. The adjustments were performed for a whole series of images. In order to verify the correctness of the obtained results, an analysis of changes in cross-sections and longitudinal sections was carried out for each image. Each profile shows the pixel values of the central row or column of the image. The images were not taken in the laboratory but in real conditions, so the imaged area is not completely homogeneous. Different spectral properties of the imaged objects (for example, interpenetrating bare soil and grass) cause the values on the graphs to fluctuate. However, the profiles shown in the graphs can be approximated using a polynomial function, and the gradient of the approximating function indicates uneven lighting. When the detector array is evenly illuminated, the profile should be approximated by a horizontal straight line or a symmetrical parabola whose start and end points have the same values. Thus, the smaller the gradient of the function, the more even the illumination. For each image, the angle of the gradient line of the approximating function was calculated before and after correction, and Table 4 summarizes the statistical analysis for the whole set of images.
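This check can be prototyped as below. A first-degree fit is used for the trend line, since the degree of the approximating polynomial is not stated in the text; the absolute angle also depends on the axis scaling, which is likewise an assumption here:

```python
import numpy as np

def profile_gradient_deg(image: np.ndarray, along_flight: bool = True) -> float:
    """Angle (in degrees) of the straight line fitted to the central image
    profile; an evenly illuminated image yields a gradient near 0 degrees.
    """
    if along_flight:
        profile = image[image.shape[0] // 2, :].astype(np.float64)  # central row
    else:
        profile = image[:, image.shape[1] // 2].astype(np.float64)  # central column
    x = np.arange(profile.size, dtype=np.float64)
    slope, _intercept = np.polyfit(x, profile, 1)   # linear trend of the profile
    return float(np.degrees(np.arctan(slope)))
```

Comparing the returned angle before and after applying the vL adjustment reproduces the kind of statistics collected in Table 4.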
In all datasets, the maximum gradient of the approximating functions was about 50° and was reduced by about 10° after correction. Exceptions occur only for dataset 2, because these data were obtained in stronger winds, resulting in larger camera tilt angles. The proposed method makes it possible to significantly reduce these errors for data obtained by the UX5. In each set of images obtained by a UAV there are also images acquired without tilting the camera, or with a tilt that still guarantees uniform lighting; the approximating function is then a horizontal line (with a gradient of 0°). In this case, the coefficient k takes values that minimize the corrections, and after correction the gradient of the profile is still 0°. Comparing all datasets, the main noticeable difference lies in the average values, which are higher for the data obtained by the Parrot Sequoia. High average values before the adjustment suggest that the profiles were skewed. In most cases, the tilt was in the same direction (regardless of the camera tilt angles), suggesting a systematic error due to the camera's properties; such a systematic error can be caused by the calibration of the lighting compensation. This problem especially applies to the initial photos of a mission, which for small blocks significantly affects the statistics of the entire dataset. Figure 12 shows that regardless of the azimuth of the Sun and the pitch and roll angles, the images are darker on the right. For this reason, the method does not reduce distortions as effectively as in the case of data from the UX5 platform, and it may require an additional parameter that takes into account the characteristics of cameras with lighting compensation. Considering the average values, it can be stated that the data blocks from the Sony Nex 5T camera improved by about 60%, while the block from the Parrot Sequoia system improved by about 24%.

5. Discussion

The problem presented in this manuscript concerns the impact of platform inclination angles and the position of the Sun on the quality of images acquired by unmanned aerial systems. Usually, the quality correction of images obtained from a low altitude relates to sensor errors and the influence of the atmosphere on the final image [6,30,31].
The problem of uneven exposure of UAV images is also discussed in [26]; however, that method focuses mainly on reducing distortions caused by cloud cover and variable solar irradiance conditions. It is a good method for reducing the uneven lighting of images in a photogrammetric block when it is caused by shadows and external conditions during image acquisition. However, uneven lighting of the sensor array appears even if the images are obtained in very good weather conditions, in the absence of cloud cover or with uniform cloud cover. As we have shown in our research, the cause of uneven lighting may be the sensor's tilt at the time of exposure, and the method we propose makes it possible to correct this distortion. The article shows how the direction of the illumination of the array changes depending on the difference between the yaw angle and the Sun azimuth. Including these parameters in UAV image correction is necessary because, as shown by other studies, the solar irradiance is influenced not only by the height of the Sun but also by its azimuth [32].
The impact of the platform's instability has often been overlooked because it is minimized when processing an entire image block. However, processing an entire block is not always the goal; for example, when monitoring a selected object or structure, one image (or several images) from a series of many is important. Individual images are, however, processed to correct for vignetting [31,33], which also equalizes an image's illumination but does not remove deformations caused by tilt angles and the effect of the Sun's position.
The method is recommended primarily for processing images acquired by fixed-wing UAV systems, since these are particularly vulnerable to the effects of wind, and in this case the influence of the tilt angles will be significant. At present, many unmanned platforms are equipped with camera tilt compensation systems. However, minimizing the tilt angles does not guarantee perfectly nadir imagery, and in many detailed analyses a change in pixel values of even a few percent may be important in further stages of image processing, where the aim is to determine the reflectivity of the object. In addition, correcting the exposure of images obtained for the same object or structure will allow for a more accurate analysis of changes based on a sequence of images obtained at certain time intervals. The method will work similarly for video images if the platform tilt angles for selected frames are known.
Processing time can be a limitation of this method, because each image is analyzed separately and, with large blocks, this process may be time consuming. In our experiment, the proposed method was implemented in MATLAB R2019a on a PC with Windows 10 (hardware configuration: 2.8 GHz Intel i7 quad-core CPU, 16 GB RAM and a GeForce graphics card with CUDA technology). Table 5 shows the average per-image computation time for each dataset.
Processing time depends strictly on the original resolution of the input images. At the moment, this approach cannot be implemented inside the camera system because of hardware restrictions. However, in the future, along with the miniaturization and falling cost of GPUs that can be installed on UAVs, it will become possible to implement our method on board, which would significantly improve the radiometric quality of images already at the stage of their acquisition.
However, making individual adjustments to each image before combining them will increase the spectral quality of the final mosaic. When mosaicking uncorrected images, artificially lightened or darkened areas may appear, which will have a negative effect on remote sensing analyses in these areas. In addition, the proposed approach cannot be separated from other aspects of the radiometric correction of images acquired by UAVs. It is recommended that the images first be corrected to reduce noise and eliminate registration errors. Also, if the effects of atmospheric factors such as fog or cloudiness are noticeable in the images, a separate correction is necessary to minimize these errors, independently of the correction due to the tilt angles and the Sun azimuth. The approach proposed here should be considered as only one aspect of the complex process of radiometric correction of images acquired by UAVs. Moreover, the Parrot Sequoia includes an up-looking irradiance sensor whose measurements are included in the calculation. The up-looking sensor's zenith and azimuth angles are also affected by roll, pitch and yaw, so the same method could be used to correct the irradiance.
In recent years, the use of consumer cameras with automatic exposure has become popular. From the point of view of obtaining images from UAVs, this is a negative phenomenon: images obtained as part of one campaign should not have different exposure parameters, as this would result in errors in the development of the data block. Therefore, when planning a photogrammetric mission from a low altitude, the exposure parameters are kept constant. Very often they are conditioned only by setting the shutter speed before the UAV starts, which depends on the current lighting conditions. Consequently, during one UAV mission the acquired images are characterized by the same exposure parameters. The method we propose considers each photo individually, so it will also be effective when automatic exposure is used. However, if the final goal is to combine all the photos from the block, it is recommended to use fixed exposure parameters.

6. Conclusions

The research focused on determining the effect of the Sun position and platform tilt angles on the quality of images acquired by unmanned aerial vehicles (UAVs), and on proposing a solution to correct this impact. Tilting the camera placed on an unmanned platform leads to incorrect exposure of the image, and the direction of this distortion depends on the position of the Sun during image acquisition. Considering image exposure as its third dimension, a three-dimensional transformation by rotation is used to determine a matrix of adjustments that improves image quality. The corrections depend on the angles of the platform and the difference between the direction of flight and the position of the Sun. An additional factor regulates the value of the adjustment depending on the ratio of the pitch and roll angles. The tests were carried out for two sets of data obtained with different unmanned systems (a UX5 with a Sony Nex 5T camera and a Parrot Sequoia). The correction method can improve the block exposure by up to 60%. It gives the best results for simple systems not equipped with lighting compensation. This solution is especially applicable when processing imagery obtained from fixed-wing systems, since these are particularly vulnerable to the effects of the wind, and in this case the platform angles have a large impact on image quality. The proposed approach should be considered as only one aspect of the complex process of radiometric correction of images acquired by UAVs, and it does not exclude other aspects of correction, such as noise reduction, vignetting correction, and atmospheric correction.

Author Contributions

All authors contributed to the experimental design and participated in the collection of UAV data. All authors provided editorial advice and participated in the review process. Conceptualization, M.K.; methodology, A.S.; data analysis, D.W.; data acquisition, M.K. and A.S. All authors interpreted the results and wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of National Defence, Republic of Poland, Grant No. GB/1/2018/205/2018/DA-990.

Acknowledgments

This paper has been supported by the Military University of Technology, the Faculty of Civil Engineering and Geodesy, Institute of Geospatial Engineering and Geodesy.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039.
  2. Qiu, Z.; Feng, Z.; Wang, M.; Li, Z.; Lu, C. Application of UAV Photogrammetric System for Monitoring Ancient Tree Communities in Beijing. Forests 2018, 9, 735.
  3. Vivoni, E.R.; Rango, A.; Anderson, C.A.; Pierini, N.A.; Schreiner-McGraw, A.; Saripalli, S.; Laliberte, A.S. Ecohydrology with unmanned aerial vehicles. Ecosphere 2014, 5, 1–14.
  4. Jakomulska, A.; Sobczak, M. Radiometric Correction of Satellite Images—Methodology and Exemplification; Teledetekcja Środowiska: Warsaw, Poland, 2001.
  5. Clemens, S.R. Procedures for Correcting Digital Camera Imagery Acquired by the AggieAir Remote Sensing Platform; Utah State University: Logan, UT, USA, 2012.
  6. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256.
  7. Kedzierski, M.; Wierzbicki, D.; Sekrecka, A.; Fryskowska, A.; Walczykowski, P.; Siewert, J. Influence of Lower Atmosphere on the Radiometric Quality of Unmanned Aerial Vehicle Imagery. Remote Sens. 2019, 11, 1214.
  8. Yu, X.; Liu, Q.; Liu, X.; Liu, X.; Wang, Y. A physical-based atmospheric correction algorithm of unmanned aerial vehicles images and its utility analysis. Int. J. Remote Sens. 2016, 38, 3101–3112.
  9. Yoon, I.; Jeong, S.; Jeong, J.; Seo, D.; Paik, J. Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images. Sensors 2015, 15, 6633–6651.
  10. Huang, Y.; Ding, W.; Li, H. Haze removal for UAV reconnaissance images using layered scattering model. Chin. J. Aeronaut. 2016, 29, 502–511.
  11. Honkavaara, E.; Hakala, T.; Markelin, L.; Rosnell, T.; Saari, H.; Mäkynen, J. A Process for Radiometric Correction of UAV Image Blocks. Photogramm. Fernerkund. Geoinf. 2012, 2012, 115–127.
  12. Hakala, T.; Honkavaara, E.; Saari, H.; Mäkynen, J.; Kaivosoja, J.; Pesonen, L.; Pölönen, I. Spectral Imaging from UAVs under Varying Illumination Conditions. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 189–194.
  13. Schowengerdt, R.A. Remote Sensing: Models and Methods for Image Processing, 3rd ed.; Academic Press: San Diego, CA, USA, 2007.
  14. Beisl, U.; Woodhouse, N. Correction of atmospheric and bidirectional effects in multispectral ADS40 images for mapping purposes. Int. Arch. Photogramm. Remote Sens. 2004.
  15. Teillet, P.; Guindon, B.; Goodenough, D. On the Slope-Aspect Correction of Multispectral Scanner Data. Can. J. Remote Sens. 1982, 8, 84–106.
  16. Smith, J.A.; Lin, T.L.; Ranson, K.J. The Lambertian assumption and Landsat data. Photogramm. Eng. Remote Sens. 1980, 46, 1183–1189.
  17. Cavayas, F. Modelling and Correction of Topographic Effect Using Multi-Temporal Satellite Images. Can. J. Remote Sens. 1987, 13, 49–67.
  18. Soenen, S.A.; Peddle, D.; Coburn, C. SCS+C: A modified Sun-canopy-sensor topographic correction in forested terrain. IEEE Trans. Geosci. Remote Sens. 2005, 43, 2148–2159.
  19. Proy, C.; Tanre, D.; Deschamps, P. Evaluation of topographic effects in remotely sensed data. Remote Sens. Environ. 1989, 30, 21–32.
  20. Kimes, D.S.; Kirchner, J.A. Modeling the Effects of Various Radiant Transfers in Mountainous Terrain on Sensor Response. IEEE Trans. Geosci. Remote Sens. 1981, GE-19, 100–108.
  21. Conese, C.; Gilabert, M.A.; Maselli, F.; Bottai, L. Topographic normalization of TM scenes through the use of an atmospheric correction method and digital terrain models. Photogramm. Eng. Remote Sens. 1993, 59, 1745–1753.
  22. Gu, D.; Gillespie, A. Topographic Normalization of Landsat TM Images of Forest Based on Subpixel Sun–Canopy–Sensor Geometry. Remote Sens. Environ. 1998, 64, 166–175.
  23. Minnaert, M. The reciprocity principle in lunar photometry. Astrophys. J. 1941, 93, 403.
  24. Meyer, P.; Itten, K.I.; Kellenberger, T.; Sandmeier, S.; Sandmeier, R. Radiometric corrections of topographically induced effects on Landsat TM data in an alpine environment. ISPRS J. Photogramm. Remote Sens. 1993, 48, 17–28.
  25. Lorenz, S.; Zimmermann, R.; Gloaguen, R. The Need for Accurate Geometric and Radiometric Corrections of Drone-Borne Hyperspectral Data for Mineral Exploration: MEPHySTo—A Toolbox for Pre-Processing Drone-Borne Hyperspectral Data. Remote Sens. 2017, 9, 88.
  26. Wang, S.; Baum, A.; Zarco-Tejada, P.J.; Dam-Hansen, C.; Thorseth, A.; Bauer-Gottwein, P.; Bandini, F.; Garcia, M. Unmanned Aerial System multispectral mapping for low and variable solar irradiance conditions: Potential of tensor decomposition. ISPRS J. Photogramm. Remote Sens. 2019, 155, 58–71.
  27. Wierzbicki, D.; Fryskowska, A.; Kedzierski, M.; Wojtkowska, M.; Deliś, P. Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle. J. Appl. Remote Sens. 2018, 12, 1.
  28. Trimble UAS. Trimble UX5 Aerial Imaging Solution Vegetation Monitoring Frequently Asked Questions. 2013. Available online: http://surveypartners.trimble.com (accessed on 27 December 2019).
  29. Parrot. Available online: https://www.parrot.com/business-solutions-us/parrot-professional/parrot-sequoia (accessed on 20 December 2019).
  30. Kelcey, J.; Lucieer, A. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sens. 2012, 4, 1462–1493.
  31. Minařík, R.; Langhammer, J.; Hanuš, J. Radiometric and Atmospheric Corrections of Multispectral μMCA Camera for UAV Spectroscopy. Remote Sens. 2019, 11, 2428.
  32. Rajendran, P.; Smith, H. Modelling of solar irradiance and daylight duration for solar-powered UAV sizing. Energy Explor. Exploit. 2016, 34, 235–243.
  33. Hakala, T.; Suomalainen, J.; Peltoniemi, J. Acquisition of Bidirectional Reflectance Factor Dataset Using a Micro Unmanned Aerial Vehicle and a Consumer Camera. Remote Sens. 2010, 2, 819–832.
Figure 1. Location of where the Unmanned Aerial Vehicle missions were performed.
Figure 2. Sony NEX-5T spectral response functions [28].
Figure 3. Photo recording with (a) nadir camera; (b) tilted camera.
Figure 4. Examples of three photos from the same mission with different platform angles.
Figure 5. (a) Assumed three-dimensional coordinate system on the image; (b) projection of the system into the two-dimensional image space.
Figure 6. Adjustment image vL shown in binary and greyscale forms.
Figure 7. Point P—intersection of the adjustment axis with the edge of the image.
Figure 8. Images before adjustment (left) and after adjustment (right) and images of their adjustments (bottom) in binary and grayscale form for four sample images (a) case 1: yaw = 6.75°, pitch = 3.77°, roll = −2.90° (b) case 2: yaw = 358.17°, pitch = −0.22°, roll = −2.97° (c) case 3: yaw = 358.96°, pitch = 12.20°, roll = −1.21° (d) case 4: yaw = 173.39°, pitch = 8.60°, roll = 2.61°.
Figure 9. Longitudinal section (above) and cross section (below) for images before adjustment (left) and after adjustment (right) for four sample images (a) case 1: yaw = 6.75°, pitch = 3.77°, roll = −2.90° (b) case 2: yaw = 358.17°, pitch = −0.22°, roll = −2.97° (c) case 3: yaw = 358.96°, pitch = 12.20°, roll = −1.21° (d) case 4: yaw = 173.39°, pitch = 8.60°, roll = 2.61°.
Figure 10. Images before adjustment (left) and after adjustment (right) and images of their adjustments (bottom) in binary and grayscale form for four sample images (a) case 1: yaw = 273.50°, pitch = 5.33°, roll = 2.27° (b) case 2: yaw = 277.55°, pitch = 12.51°, roll = −3.42° (c) case 3: yaw = 89.32°, pitch = 5.16°, roll = −4.62° (d) case 4: yaw = 269.68°, pitch = 7.75°, roll = 1.23°.
Figure 11. Longitudinal section (above) and cross section (below) for images before adjustment (left) and after adjustment (right) for four sample images (a) case 1: yaw = 273.50°, pitch = 5.33°, roll = 2.27° (b) case 2: yaw = 277.55°, pitch = 12.51°, roll = −3.42° (c) case 3: yaw = 89.32°, pitch = 5.16°, roll = −4.62° (d) case 4: yaw = 269.68°, pitch = 7.75°, roll = 1.23°.
Figure 12. Images before adjustment (left) and after adjustment (right) and images of their adjustments (bottom) in binary and grayscale form for four sample images (a) case 1: yaw = 143.65°, pitch = 4.60°, roll = 2.17°, As = 127.90° (b) case 2: yaw = 72.99°, pitch = 5.14°, roll = −1.63°, As = 130.15° (c) case 3: yaw = −67.50°, pitch = 5.58°, roll = 0.74°, As = 134.05° (d) case 4: yaw = 144.23°, pitch = 5.55°, roll = −0.80°, As = 127.90°.
Figure 13. Longitudinal section (above) and cross section (below) for images before adjustment (left) and after adjustment (right) for four sample images (a) case 1: yaw = 143.65°, pitch = 4.60°, roll = 2.17°, As = 127.90° (b) case 2: yaw = 72.99°, pitch = 5.14°, roll = −1.63°, As = 130.15° (c) case 3: yaw = −67.50°, pitch = 5.58°, roll = 0.74°, As = 134.05° (d) case 4: yaw = 144.23°, pitch = 5.55°, roll = −0.80°, As = 127.90°.
Table 1. The basic parameters of sensors and flights.

                                Dataset 1      Dataset 2      Dataset 3
UAV platform                    UX5            UX5            Parrot Disco AG
Camera                          Sony Nex 5T    Sony Nex 5T    Parrot Sequoia
Altitude [m]                    75             75             100
GSD [m]                         0.024          0.024          0.130
FOV [°]                         110            110            74
Forward and side overlap [%]    80             80             80
Cover area [ha]                 40             50             1
Flight time [min]               30             20             15
Table 2. The impact of the difference between the azimuth of the Sun and the yaw angle on image exposure, assuming the same pitch and roll values.
[Adjustment images in binary and greyscale form for azimuth differences of 0°, 45°, 90° and 135° (top rows) and 180°, 225°, 270° and 315° (bottom rows).]
Table 3. Changes in the angle of inclination of the correction axis depending on the pitch/roll ratio.
[Adjustment images for pitch/roll ratios of 5°/1°, 5°/2°, 5°/3°, 5°/4°, 5°/6°, 5°/7°, 5°/8° and 5°/15°.]
Table 4. Statistics of the gradient lines' angles of approximating functions for the cross-sections and longitudinal sections of images obtained by a UAV before and after exposure adjustment.

                                UX5 (Sony Nex 5T)     UX5 (Sony Nex 5T)     Parrot Sequoia
                                Pre      Post         Pre      Post         Pre      Post
Cross section          max      50°      39°          65°      30°          53°      25°
                       min      18°
                       mean     16°      12°          23°      10°          34°      14°
                       std      19°      14°          15°      11°          16°      10°
Longitudinal section   max      43°      17°          55°      25°          36°      42°
                       min      15°
                       mean     18°      5°           27°      10°          21°      26°
                       std      18°      5°           17°      12°          16°      11°
Table 5. Comparison of the average per-image computation time (seconds).

             Our Method,               Our Method,               Our Method,
             Sony Nex 5T (Dataset 1)   Sony Nex 5T (Dataset 2)   Parrot Sequoia (Dataset 3)
Time [s]     2.43                      2.38                      3.34
