Article

Feasibility of Ground-Based Sky-Camera HDR Imagery to Determine Solar Irradiance and Sky Radiance over Different Geometries and Sky Conditions

by Pedro C. Valdelomar *, José L. Gómez-Amo, Caterina Peris-Ferrús, Francesco Scarlatti and María Pilar Utrillas
Departament de Física de la Terra i Termodinàmica, Universitat de València, 46100 Burjassot, Spain
*
Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(24), 5157; https://doi.org/10.3390/rs13245157
Submission received: 11 November 2021 / Revised: 12 December 2021 / Accepted: 14 December 2021 / Published: 19 December 2021

Abstract: We propose a methodological approach to provide accurate and calibrated measurements of sky radiance and broadband solar irradiance using the High Dynamic Range (HDR) images of a sky-camera. This approach is based on a detailed instrumental characterization of a SONA sky-camera in terms of image acquisition and processing, as well as geometric and radiometric calibrations. As a result, a 1 min time resolution database of geometrically and radiometrically calibrated HDR images has been created and has been available since February 2020, with daily updates. An extensive validation of our radiometric retrievals has been performed in all sky conditions. Our results show very good agreement with the independent measurements of the AERONET almucantar for sky radiance and of pyranometers for broadband retrievals. The SONA sky radiance shows an RMBD below 10%, while the broadband diffuse radiation shows differences of 2% and 5% over a horizontal plane and arbitrarily oriented surfaces, respectively. These results support the developed methodology and reveal the great potential of sky-cameras to carry out accurate measurements of sky radiance and solar radiation components. Thus, the remote sensing techniques described here will undoubtedly be of great help for solar and atmospheric research.

Graphical Abstract

1. Introduction

In recent years, interest in accurate measurements of solar radiation at the Earth’s surface has been renewed. It was mainly triggered by the large increase in solar energy plants deployed all over the world as part of the ambitious strategies defined by different international administrations to mitigate climate change [1,2]. Therefore, there is now a need to improve the solar energy projections of these plants over different time scales. This is especially important for photovoltaic and thermosolar plants, due to their rapid development and prospects over the coming years [3,4,5]. The success of these projections highly depends on improving solar radiation measurements, both in accuracy and in worldwide availability. For short time scales, the solar energy forecast is strongly related to the atmospheric conditions, and especially to the amount of clouds and their temporal evolution [6]. Consequently, different instruments and measurement techniques have been developed.
Measurements of solar radiation at the Earth’s surface usually rely on broadband radiometers covering the entire solar spectrum (280–3000 nm). These radiometers are commonly installed on solar trackers, and the different components of the solar radiation are measured with moderate-to-high temporal resolution (~1–15 min). This instrumental setup is widespread, especially in the most populated areas of the world [7,8]. In addition, several methods to estimate the cloud cover, cloud optical depth and aerosol optical depth (AOD) have been developed from the combination of different components of broadband solar radiation [9,10,11]. Unfortunately, in most ground-based stations, only the global horizontal irradiance (GHI) component is measured, and the other components may only be estimated by considering other atmospheric factors (e.g., clearness index) and using different parametric, physical or statistical methods [12,13]. This induces additional limitations on the spatial resolution of accurate ground-based measurements of all solar radiation components.
Conversely, geostationary satellites have been used to overcome these limitations, because they provide global coverage with sufficient spatial resolution (~2–5 km) and moderately high temporal resolution (~10–30 min), which allows the determination of some cloud properties (cloud cover, cloud type and top height), their daily evolution and, consequently, the solar radiation throughout the day. The retrievals of solar radiation components at the surface rely on broadband satellite reflectance measurements and radiative transfer modelling. On the other hand, spectral measurements of solar radiation from the ground and from satellites have been used to determine the columnar content and properties of different atmospheric components (aerosols, clouds and gases) that significantly affect solar radiation at the Earth’s surface. Most of these measurements are also used to constrain the solar radiation databases and forecasts provided by physical and statistical models [14,15,16]. The satellite and modelling estimations of solar radiation agree with ground-based measurements to within ±20%, depending on the component [16,17,18,19]. The best results have been obtained in clear sky situations and, therefore, there is room for improvement of these estimations, especially when different cloud layers are present.
Sky-cameras combine the characteristics of broadband instruments, since they measure the incident sunlight in three RGB channels with bandwidths varying from 20 to 100 nm, depending on the sensors and filters used. These cameras use array sensors and fisheye lenses that allow individual pixels to observe portions of the atmosphere with a narrow field of view (FOV). Thanks to their panoramic view of the sky-dome, with around a 180° FOV, they were originally designed to determine the cloud cover. However, much progress has been made recently in instrumental development and data processing techniques. Consequently, different properties of clouds (cloud amount, type and optical depth) have been obtained [20,21,22], as well as calibrated radiances for each channel [23]. In addition, aerosol optical properties have also been inferred using different techniques [24,25,26,27].
The hemispherical FOV and high angular resolution make sky-cameras an interesting instrument for obtaining the solar radiation over different geometries, with moderate spectral resolution. The attempts to obtain and forecast the broadband solar radiation are usually based on statistical correlations and machine learning methods that frequently determine the GHI and direct normal irradiance (DNI) components [19,28,29,30]. For solar energy applications, especially in photovoltaic plants with fixed tilt angles in which the diffuse horizontal irradiance (DHI) is also important, the solar radiation components additionally need to be projected on the plane of the array.
Therefore, the main objective of this work is to provide accurate and calibrated measurements of sky radiance and broadband solar irradiance using High Dynamic Range (HDR) images from a sky-camera and, by taking advantage of its high angular resolution and spectral characteristics, to develop a methodology to derive the sky radiance and the solar DHI component under arbitrary geometries. These methodologies and retrievals may be of great interest to the solar energy community, since they allow the monitoring and forecasting of the solar radiation incident on an arbitrary plane with a single, affordable instrument under all sky conditions.
The paper is organized as follows: the site and instrumentation are detailed in Section 2; an exhaustive characterization of the sky-camera is tackled in Section 3; Section 4 describes the methodological approach to derive the solar radiation over arbitrary geometries; we discuss our results in Section 5; and the main outcomes of this work are summarized in the conclusion.

2. Site Measurements and Data

The data used in this work were provided by the instrumentation deployed in the Burjassot Atmospheric Station (BASS), which is maintained by the Solar Radiation Group of the University of Valencia. It is located northwest of the city of Valencia, within its metropolitan area, and around 10 km away from the Mediterranean Sea. The BASS is devoted to measuring and monitoring radiation, aerosols, clouds and some other atmospheric components by means of a large set of remote sensing and in situ instrumentation [31,32,33,34,35]. It also forms part of different international measurement networks, such as the aerosol robotic network (AERONET) [36], European Skynet Radiometers [36], e-profile and EARLINET, and belongs to the ACTRIS research infrastructure.
The main instrument used in this work was a SONA-201D sky-camera, integrated by SIELTEC Canarias S.L. and originally designed to determine the cloud cover using all-sky images. The system was composed of a camera body based on a color CMOS sensor and a fisheye lens with a 180° FOV, encapsulated in an IP67 (dust-tight and water-immersion resistant up to 1 m depth) enclosure with a borosilicate dome, and thermally stabilized to maintain the internal temperature at 45 °C regardless of environmental conditions. The camera sensor, made by e2v (model EV76C570), had a 1600 × 1200 pixel resolution, mounted an RGB filter with an infrared protection window and provided an exposure time range from 16 µs to 1 s. This sensor had several light-blocked lines that were used to implement an adaptive dark current filter, conferring a sensor signal-to-noise ratio (SNR) of about 38 dB. Its sensitivity and SNR were adequate for obtaining night images with a low dark noise by using a long integration time, while also being able to register day images with a short integration time without risk to the sensor integrity. Although the raw images had a 10-bit depth, they were truncated to 8 bits. The images were scheduled and stored through a SIELTEC proprietary server connection, based on a PC platform, using an Ethernet network and the GigE protocol.
We used an automatic Cimel CE-318 sun photometer that used a double collimator with a 1.2° FOV to measure the direct sun and sky radiance in different bands, from UV to near infrared spectral regions. The bandwidth for the UV channels was 2 nm, and 10 nm for the others. The direct sun intensity was measured at 340, 380, 440, 500, 675, 870, 940, 1020 and 1640 nm, and was used to determine the spectral aerosol optical depth (AOD), Angström exponent (AE) and water vapor content using the 940 nm channel. Sky radiance was also measured in seven channels (380, 440, 500, 675, 870, 1020 and 1640 nm) over different geometries. The combination of both measurement schedules was used to determine the aerosol’s microphysical and intensive radiative properties [37,38].
The measurements of the broadband solar radiation were carried out by Kipp and Zonen radiometers mounted on a Kipp and Zonen Solys-2 solar tracker: two CMP21 pyranometers for global and diffuse horizontal components and a CHP1 pyrheliometer for the direct. Additionally, the measurements of the broadband down- and upward radiation were performed in solar and thermal spectral regions by CM11 pyranometers and CGR4 pyrgeometers, respectively. Another CM11 pyranometer measured the solar broadband radiation on a 30° tilted surface, oriented to the South.
In this work, we have used data from February 2020 to June 2021, in which our sky-camera was set up to provide HDR images every minute. Broadband radiation was also stored at a 1-min resolution with two Campbell CR-1000 data-loggers. The Cimel CE-318 was configured to perform the direct sun measurements with ~10-min frequency while the sky radiance schedule was more variable, with a typical temporal resolution of 30 min, depending on the solar zenith angle (SZA).

3. Characterization of the SONA Sky-Camera

3.1. Geometric Calibration

Our sky-camera was set horizontally in a fixed location, so each pixel always received light from the same portion of the sky-dome. Therefore, several empirical methods were applied to properly characterize the fisheye lens of the sky-camera, in order to establish the observation geometry. This implied projecting the celestial coordinates onto the images in order to obtain the polar coordinates of every pixel and the position of the optical center, and to study the possible distortion pattern of the fisheye lens.
To geometrically characterize the sky-camera lens, we developed a Celestial Bodies Tracker (CBT) software. The CBT automatically identified a multitude of celestial bodies with well-known trajectories visible under the FOV of our camera and tracked them during the several days of observation. This allowed us to obtain enough samples to establish the relationship between the sky coordinates of the celestial bodies, taken from an ephemerides database [39], and their relative coordinates in the sky-camera polar coordinate system. The software provided the projection type, optical center and focal length, as well as the coefficients that related to the polar and Cartesian coordinates of our sky-camera.
The pixel zenith angle (PZA, θ) varied with the radius, measured from the optical center to each pixel (Figure 1a), and several fisheye projections [40] were used to correlate them by a regression fit. We chose the equidistant projection, since it showed the best correlation coefficient (Table 1). The optical center of the image was located at the pixel coordinates (X = 577, Y = 584) and the focal length was f = 0.84 µm. The Pixel Azimuth Angle (PAA, φ) increased counterclockwise, with the North of the image (φ = 0°) at the top. There was an indication of a misalignment of 5.8° towards the East (left side) with respect to geographic North (Figure 1b).
We performed additional accuracy tests of the CBT projections, which resulted in sky-coordinates position uncertainties of ±2 pixels.
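As an illustration of the equidistant projection adopted above, the mapping from image coordinates to (PZA, PAA) can be sketched as follows. The pixels-per-radian scale, the image-axis conventions and the function name are assumptions made for this sketch; the actual calibration reports a focal length and was derived with the CBT software.

```python
import numpy as np

# Optical center from Section 3.1; the scale K (pixels per radian of
# zenith angle) is an assumed placeholder, not a calibrated value.
XC, YC = 577, 584
K = 600 / (np.pi / 2)  # assume ~600 px of image radius spans 0-90 deg

def pixel_to_sky(x, y, north_offset_deg=5.8):
    """Equidistant fisheye projection r = K * theta: map image
    coordinates to (PZA, PAA) in degrees, correcting the reported
    5.8 deg misalignment of image North with respect to geographic
    North (axis conventions here are illustrative)."""
    dx, dy = x - XC, y - YC
    r = np.hypot(dx, dy)
    pza = np.degrees(r / K)                 # zenith angle grows with radius
    paa = (np.degrees(np.arctan2(-dx, -dy)) + north_offset_deg) % 360.0
    return pza, paa
```

Inverting this mapping gives the Cartesian pixel position for any desired viewing direction, which is how specific sky geometries are later extracted from the images.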
Once the PZA and PAA were calculated, the pixel solid angle (PSA, Ω) could be obtained from Equation (1). The integration limits were derived for every pixel of the image from the PZA and PAA matrices (Figure 1a,b).
\[ \Omega = \frac{dS}{r^{2}} = \int_{\theta_1}^{\theta_2} \int_{\varphi_1}^{\varphi_2} \sin(\theta)\, d\theta\, d\varphi \qquad (1) \]
The PSA slightly decreased with the radius and showed small values in the range of [6.5, 10] × 10−6 sr (Figure 1c). Thus, each pixel of the camera observed a different portion of the sky-dome with a narrow FOV.
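Because the integrand in Equation (1) is separable, each pixel's solid angle has a closed form. A minimal sketch (the function name is ours):

```python
import numpy as np

def pixel_solid_angle(theta1, theta2, phi1, phi2):
    """Solid angle (sr) of Equation (1) between zenith angles
    theta1..theta2 and azimuths phi1..phi2 (radians): the double
    integral of sin(theta) evaluates to
    (cos(theta1) - cos(theta2)) * (phi2 - phi1)."""
    return (np.cos(theta1) - np.cos(theta2)) * (phi2 - phi1)

# A pixel spanning ~0.1 deg in each angle near a 45 deg zenith angle
d = np.radians(0.1)
t = np.radians(45.0)
omega = pixel_solid_angle(t - d / 2, t + d / 2, 0.0, d)  # ~sin(t) * d**2
```

Summing over the whole hemisphere recovers 2π sr, which is a quick sanity check of the per-pixel values.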

3.2. Sensor Characterization

Our sky-camera sensor was a Focal Plane Array (FPA) based on CMOS technology, which meant that each cell of the matrix was independent from the surrounding cells. Under illumination, each cell stored a charge, which was proportional to the irradiance received and created an electric current that could be measured by an analog-to-digital converter.
The main sources of uncertainty in this kind of sensor were the non-uniformity effects due to the manufacturing process, dead and hot pixels, which worsen with the aging of the sensor, and the thermal noise and dark current due to the physics of the device [26]. To estimate these effects, a set of measurements was collected in a dark environment. As expected, the thermal noise and dark current effects were negligible (results not shown), thanks to the thermal stabilization and the dark current correction loop built into the sensor. The non-uniformity effect was also negligible, mainly due to the intrinsic characteristics of the sensor and its production process. The sensor did not present dead or hot pixels at exposure times below 25 ms. Considering that the maximum exposure time we used during the day was about 36 µs, this issue only became apparent in nighttime measurements, which were not used in this study.

3.3. Camera Response Function

The camera response function, or “characteristic curve”, represented the sensor response to variations in the exposure, the exposure being the product of the exposure time and the irradiance [41].
We empirically obtained the response function, as part of the HDR construction method, following the methodology proposed in [41], which is based on the physical property of reciprocity. This method used a sequence of images to obtain the relationship between the exposure and its corresponding digital counts (DCs). Assuming that this function was monotonically increasing, and capturing a static scene at different exposure times, we obtained different fragments of the curve (Figure 2a). Finally, a linear optimization process based on mean squared error minimization was used to fit these fragments together (Figure 2b).
With this technique, precise knowledge of the exposure times was critical to obtain the response function and, especially, to build an HDR image properly. In our sky-camera, the nominal exposure time differed from the real exposure time. Thus, we characterized this non-linear effect by comparing the nominal and real exposure time ratios of two images taken with different exposure times. By doing this iteratively while varying the exposure times, we finally obtained the function that relates the nominal to the real exposure times [42,43]. To improve the performance of the response function under any conditions, we recalculated it as the average of the response functions obtained in different sky conditions, according to their corresponding SZA, and considering different exposure times in each sequence (not shown).
The original algorithm [41] proposed a triangular weighting function to minimize the non-linear effects of the sensor when low or high DCs were measured, emphasizing the smoothness and fitting terms toward the middle of the curve at around 128 DC (Figure 3). In practical terms, we did not observe a significant dependency of the resulting response function with different weighting functions. However, the use and the choice of the appropriate weighting function were decisive in constructing the HDR images accurately (Section 3.4).
While the triangular function enhanced the contrast of the image by giving more weight to the middle of the dynamic range, we instead used the weighting function w(z) in Equation (2). It gave the same weight within the entire linear range of the sensor (50 < DC < 200), reducing the weight in the non-linear regions (DC < 50 and DC > 200) (Figure 3).
\[ w(z) = 1 - \left( \frac{2z}{2^{b}} - 1 \right)^{a} \qquad (2) \]
In Equation (2), z referred to the measured DCs, while b represented the number of bits of each sample (8 bits in our case) and a was an even factor greater than 2. The a coefficient modified the width of the plateau and the slope of the non-linear region of the function; the appropriate values for our camera were between 12 and 24 (Figure 3). To use this function properly, it had to be normalized with respect to its maximum absolute value and its offset adjusted, to finally obtain a function that was 1 in the plateau and 0 in the tails.
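A minimal sketch of this plateau weighting follows, based on our reconstruction of Equation (2); the default a value is one of the reported suitable range, and the final normalization/offset step mentioned above is omitted for brevity:

```python
import numpy as np

def w(z, b=8, a=16):
    """Plateau weighting of Equation (2): close to 1 over the linear
    range of a b-bit sensor and falling toward 0 in the non-linear
    tails; the even exponent a (> 2) widens the plateau."""
    return 1.0 - (2.0 * z / 2.0**b - 1.0) ** a

z = np.arange(256)
weights = w(z)   # w(0) = 0, w(128) = 1, high weight across the mid-range
```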
Looking for a good compromise between computational complexity and smoothness of the result, we selected 900 pixel locations per sequence, where each sequence had different pixel locations and, consequently, its own response function. The pixel locations were chosen randomly, but with restrictions based on our experience to guarantee the convergence of the fitting algorithm, thereby avoiding “anomalous” pixels. These restrictions were that the pixel value had to be monotonically increasing (with at least a 33% margin) and that the distance between pixels had to be greater than 16 pixels. If not enough pixels were suitable, the sequence was discarded.
To obtain the response function, we used 1000 images, with exposure times between 0.016 ms and 1 s, taken over several days under different illumination conditions, in terms of SZA, and environmental conditions. The response function for each of the three channels (RGB) of our sky-camera was then the average of all of them (Figure 4). The resulting average response function presented a non-linear behavior in the lowest and highest DC ranges, specifically from 0 to 40 DCs for all channels and above 227, 211 and 215 DCs for the blue, green and red channels, respectively. The shape of the sensor response function did not depend on the number of pixels chosen to construct it, nor on their location. On the contrary, the use of anomalous pixels, such as hot or dead pixels, significantly affected the shape of the curve (bumpy curve), and the use of insufficient pixel locations resulted in a poorly defined curve.

3.4. Constructing the High Dynamic Range Imagery

The response function g(Zi,j) of the sky-camera could be used to convert the pixel values (in DCs) of any image into relative irradiance values Eri (in arbitrary units), knowing its exposure time Δtj and following Equation (3) [41].
\[ \ln E_{r_i} = g(Z_{i,j}) - \ln \Delta t_{j} \qquad (3) \]
Therefore, the HDR image could be constructed by a superposition of several images taken at different exposure times. The relative irradiance Eri could be obtained by Equation (4) [41], where the sub-index i represented the position of each pixel in the corresponding image j within the sequence, while w(Zi,j) was the weighting function.
\[ \ln E_{r_i} = \frac{\sum_{j} w(Z_{i,j}) \left( g(Z_{i,j}) - \ln \Delta t_{j} \right)}{\sum_{j} w(Z_{i,j})} \qquad (4) \]
For daylight scenes, we took six images with exposure times between 16 µs and 36 µs, using 8 bits of bit-depth per channel. In addition, we used our weighting function (Section 3.3) to guarantee the same weight for all pixel values within the linear range of every image of the sequence.
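The per-pixel merge of Equations (3) and (4) can be sketched as follows, assuming the response function g and the weighting function w are available as precomputed 256-entry lookup tables (the function and variable names are our assumptions):

```python
import numpy as np

def hdr_relative_irradiance(images, dts, g, w):
    """Weighted average of Equation (4): combine 8-bit images of one
    scene, taken at exposure times dts, into a relative irradiance map.
    g: 256-entry table of the response function g(Z);
    w: 256-entry table of the weighting function."""
    num = np.zeros(images[0].shape)
    den = np.zeros(images[0].shape)
    for img, dt in zip(images, dts):
        num += w[img] * (g[img] - np.log(dt))   # Equation (3) per image
        den += w[img]
    return np.exp(num / np.maximum(den, 1e-12))  # E_r, arbitrary units
```

For a toy linear sensor with g(Z) = ln Z, an image taken at double the exposure time with doubled counts yields the same relative irradiance, as reciprocity requires.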
The main advantages of using our HDR images were the largely expanded dynamic range within the scene and the extremely linear behavior achieved thanks to our improvements. In addition, we also obtained a significant reduction in the background noise, improving the response and the signal-to-noise ratio.

3.5. Radiometric Calibration

The radiometric calibration of the sky-camera was performed by comparing the relative irradiance from our HDR imagery against radiative transfer simulations carried out with the libRadtran model [44], following the methodology proposed in [23,26]. It was based on monochromatic simulations of the sky radiance, using the effective wavelength of each channel of the camera [45]. The effective wavelength (λeff) was spectrally weighted by the illumination conditions (i.e., the atmospheric irradiance), following Equation (5).
\[ \lambda_{eff} = \frac{\int_{\lambda_1}^{\lambda_2} \lambda \, E(\lambda)\, S(\lambda)\, d\lambda}{\int_{\lambda_1}^{\lambda_2} E(\lambda)\, S(\lambda)\, d\lambda} \qquad (5) \]
where λ was the wavelength, E(λ) was the spectral irradiance reaching the instrument and S(λ) was the spectral response of the channel (Figure 5). The integration interval was from λ1 = 400 nm to λ2 = 998 nm. To calculate λeff, a set of 200 libRadtran simulations for different spectral irradiances, corresponding to different clear sky conditions, was carried out by changing the aerosol Angström parameters and the SZA [23]. The Henyey–Greenstein phase function [46] and fixed values of the asymmetry parameter, single scattering albedo (SSA) and total ozone column (TOC) were used in all simulations.
The average values of the effective wavelengths for each channel of the sky-camera are shown in Table 2.
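The discrete counterpart of Equation (5) is simply a response- and irradiance-weighted mean wavelength. A minimal sketch with a hypothetical Gaussian channel response (the profiles below are illustrative, not the SONA ones):

```python
import numpy as np

def effective_wavelength(lam, E, S):
    """Equation (5), discretized on a uniform wavelength grid:
    weighted mean of lam with weights E(lam) * S(lam)."""
    wgt = E * S
    return np.sum(lam * wgt) / np.sum(wgt)

lam = np.linspace(400.0, 998.0, 600)             # integration interval (nm)
E = np.ones_like(lam)                            # flat toy irradiance
S = np.exp(-0.5 * ((lam - 541.0) / 30.0) ** 2)   # toy green-channel response
```

With a non-flat irradiance spectrum, λeff shifts toward the better-illuminated side of the channel, which is why it must be recomputed for different atmospheric conditions.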
The radiometric calibration established a relationship between the relative radiance provided by the HDR images (in arbitrary units) for each pixel (i.e., observation geometry) and channel (RGB) and the corresponding radiance values obtained from the RTM simulations in radiometric units (W m−2 sr−1 nm−1). To calibrate our HDR images, we used 800 clear sky images from February to October of 2020, accounting for a large number of cases under different atmospheric conditions and sun locations.
We used monochromatic simulations at the λeff values (Table 2), with an angular resolution of 1° for the viewing geometry (PZA and PAA), at the geographical coordinates of the BASS. The ranges of variation of the most important parameters used in the RTM simulations are provided in the Supplementary Material. A standard mid-latitude atmosphere and the surface albedo were kept constant in all simulations. The surface albedo did not show a significant seasonal variation, since the BASS is within the metropolitan area and the vegetation in the surroundings does not change. Therefore, it was obtained as the 5-year average over the period 2015–2020 from the AERONET data. The concentrations of the atmospheric components (TOC and column water vapor, CWV) and the aerosol properties (AOD, SSA, asymmetry parameter and phase function) were obtained from the AERONET database, as well as the solar geometry, which was varied in the simulations in correspondence with the selected images. The AERONET-derived aerosol phase function was converted to 200 Legendre moments with the pmom tool in the libRadtran package before running the simulations. All AERONET spectral values were interpolated to the λeff of our sky-camera.
We obtained a calibration factor kλn for each channel (sub-index λ) and each image (super-index n) by fitting the simulated radiance Lλn(i, j) by linear regression to the relative irradiance Erλn(i, j), following Equation (6), where Ω(i, j) was the sky-camera PSA and i and j were the matrix indices.
\[ L_{\lambda}^{n}(i,j) = k_{\lambda}^{n} \cdot \frac{E_{r_{\lambda}}^{n}(i,j)}{\Omega(i,j)} \qquad (6) \]
Figure 6 shows the probability density function of the clear sky imagery calibration factors resulting from each camera channel. Finally, the calibration coefficient (Kλ) was obtained as the average of the whole sample of every kλn and ± 1σλ was considered its associated uncertainty (Table 3). The relative uncertainty of the calibration coefficients for the blue and green channels was 7%, and increased to 10% in the red channel.
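For each image and channel, the calibration factor of Equation (6) is the slope of a through-the-origin linear regression of the simulated radiance against Er/Ω. A minimal sketch (names are ours):

```python
import numpy as np

def calibration_factor(L_sim, E_rel, omega):
    """Least-squares slope through the origin for Equation (6):
    L_sim = k * E_rel / omega, solved as sum(x*y) / sum(x*x)."""
    x = (E_rel / omega).ravel()
    y = L_sim.ravel()
    return np.sum(x * y) / np.sum(x * x)
```

Averaging the per-image factors over the whole clear-sky sample and taking ±1σ as the uncertainty then mirrors the procedure used to obtain the coefficients of Table 3.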

4. Methods to Obtain Solar Radiation

4.1. Determination of the Sky-Radiance

Once the camera was calibrated, we knew the irradiance reaching every pixel of an image for the three RGB channels, Eλ(θ, φ), and it was straightforward to determine the sky radiance by considering the solid angle, following Equation (7).
\[ L_{\lambda}(\theta, \varphi) = \frac{E_{\lambda}(\theta, \varphi)}{\Omega(\theta, \varphi)} \qquad (7) \]
The radiance from any point of the sky-dome could be extracted from an image by choosing the appropriate viewing angles (θ, φ) and translating them into the Cartesian coordinates (x, y) of the image.
Once the sun coordinates (θs, φs) in an image were known, the radiance in the almucantar plane of the sun could be obtained by simply fixing θ = θs and varying φ. The radiance in the principal plane could be extracted by fixing φ = φs or φ = φs + π for any θ. In addition, the radiance at different isolines of the scattering angle could also be extracted from the images, using the relationship between the pixel scattering angle (Θ), the PZA and the PAA for any given sun position, following Equation (8) [47,48].
\[ \cos(\Theta) = \cos(\theta_{s}) \cos(\theta) + \sin(\theta_{s}) \sin(\theta) \cos(|\varphi_{s} - \varphi|) \qquad (8) \]
Figure 7 shows an example of the representation of these three sky geometries in a SONA image. Due to the viewing geometry described in Section 3, the almucantar is represented by a circle at a fixed θ = θs, while the principal plane is a straight line at fixed φ = φs (and φs + π) that passes through the optical center and through the sun position.

4.2. Broadband Diffuse Irradiance on a Horizontal Plane

Our sky-camera did not use any shadowing element to block the sun. Therefore, the circumsolar area in most of the images remained saturated in some or all RGB channels, even when using the shortest available exposure time. The size of this saturated region of the images was highly dependent on the atmospheric conditions, being smaller in clear sky situations and increasing with the amount and size of atmospheric scatterers (aerosols and clouds). Consequently, only the diffuse radiation could be obtained using our sky-camera.
The broadband diffuse horizontal irradiance of each channel (DHIch) could be obtained from the spectral and angular integration of the product between the radiance projected from every viewing direction to the horizontal plane Lch(θ, φ)·cosθ and the suitable spectral response of the sensor Sch(λ), following Equation (9).
\[ DHI_{ch} = \int_{0}^{\pi/2} \int_{0}^{2\pi} \int_{\lambda_1}^{\lambda_2} P(\Theta)\, S_{ch}(\lambda)\, L_{ch}(\theta, \varphi) \cos\theta \, \sin\theta \; d\lambda \, d\varphi \, d\theta \qquad (9) \]

\[ P(\Theta) = \begin{cases} 0, & \Theta \le 5^{\circ} \\ 1, & \Theta > 5^{\circ} \end{cases} \qquad (10) \]
In Equation (9), ch referred to R, G and B channels, and λ1 and λ2 were the spectral integration limits of each channel. P(Θ) was the logical NOT of the penumbra function [49,50] reformulated in terms of the scattering angle (Equation (10)) to subtract the circumsolar area. We chose a limit angle of 5 degrees to mimic the FOV of the shadow ball used in the measurement of the horizontal diffuse irradiance by the pyranometer deployed in the solar tracker.
The numerical integration in Equation (9) was calculated by uncoupling and discretizing the spectral and angular terms (Equations (11) and (12)), considering the calibrated irradiance Ech(i, j), PZA(i, j) and scattering angle Θ(i, j) matrices at the pixel positions (i, j). DHIch is thus the sum over all pixels of an irradiance-calibrated sky-camera image with a scattering angle greater than 5° and a zenith angle below 90°, as shown in Figure 8.
\[ DHI_{ch} = \sum_{i=1}^{N} \sum_{j=1}^{N} Q_{ch}\, P(\Theta_{i,j})\, E_{ch}(i,j) \cos(\theta_{i,j}) \qquad (11) \]

\[ Q_{ch} = \sum_{k=1}^{M-1} S_{ch}(k) \cdot (\lambda_{k+1} - \lambda_{k}) \qquad (12) \]

\[ DHI_{SONA} = DHI_{R} + DHI_{G} + DHI_{B} \qquad (13) \]
Qch was the numerical integration of the spectral response Sch(k) of each channel with a 1-nm wavelength step, where k = [1, M − 1] corresponded to λ = [300, 1100] nm. The broadband diffuse horizontal irradiance (DHI) was obtained by adding up the DHIch of each sky-camera channel, following Equation (13).
In practical terms, P(Θi,j) was a mask matrix, in which pixels with a scattering angle below 5° or a zenith angle above 90° were zero and the rest were one (Figure 8a). Multiplying this mask by the irradiance matrices (Figure 8b–d) and summing the resulting product over all pixels, we finally obtained the DHIch and then the sky-camera DHI (DHISONA).
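The masking and summation of Equations (10) and (11) can be sketched per channel as below (the array and function names are ours; the Qch factor would come from Equation (12)):

```python
import numpy as np

def dhi_channel(E_cal, pza, theta_scat, q_ch, limit=5.0):
    """Equations (10)-(11): zero out the circumsolar region
    (scattering angle <= limit) and pixels beyond the horizon
    (PZA >= 90 deg), then sum the calibrated irradiances projected
    onto the horizontal plane; angles in degrees."""
    mask = (theta_scat > limit) & (pza < 90.0)   # P(Theta) and horizon mask
    return q_ch * np.sum(E_cal[mask] * np.cos(np.radians(pza[mask])))
```

Repeating this for the R, G and B channels and adding the three results gives DHISONA, as in Equation (13).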

4.3. Broadband Diffuse Irradiance on Arbitrary Planes

A similar approach to that described in Section 4.2 could be used to determine the broadband diffuse tilted irradiance (DTI) on an arbitrary plane, DTIβ,ϕs, with tilt angle β and orientation angle ϕs, measured from the South (positive to the West) (Figure 9). In practice, it reduced to constructing a mask that only considered the light directions reaching the tilted plane, obtaining the equivalent pixel zenith angle (PZA’, θ’) and appropriately changing the limits of the angular integration in Equation (9).
In our PAA matrix, the North direction corresponded to φ = 0° and φ increased counterclockwise; thus, naming the PAA matrix on the rotated plane PAA’, φ’ is the difference between the φ and ϕs angles (Equation (14)).
The PZA’ on the tilted plane depended on the tilt angle β and varied with φ’ (Figure 9). For pixels above the horizontal plane, θ’ was |θ − γ|, and for pixels below the horizontal plane, θ’ was |θ + γ|, as can be seen in Equations (14)–(16).
\[ \varphi' = \varphi - \phi_{s} \qquad (14) \]

\[ r = \sqrt{x^{2} + y^{2}}; \qquad y = r \cos(\varphi'); \qquad \gamma = \arcsin\!\left( \frac{y \sin(\beta)}{r} \right) \qquad (15) \]

\[ \text{if } 90^{\circ} < \varphi' < 270^{\circ} \Rightarrow \tau = 1, \text{ else } \tau = 0; \qquad \theta' = \left| \theta + (-1)^{\tau} \gamma \right| \qquad (16) \]
To obtain the DTIβ,ϕs, we projected onto the normal direction by multiplying by cos(θ’) and applied the same spectral correction as in the horizontal plane. Note that the G(θ’i,j) function limited the pixels that had to be considered to calculate the DTIβ,ϕs.
\[ DTI_{\beta,\phi_{s}} = \sum_{i=1}^{N} \sum_{j=1}^{N} Q_{ch}\, P(\Theta_{i,j})\, G(\theta'_{i,j})\, E_{ch}(i,j) \cos(\theta'_{i,j}) \qquad (17) \]

\[ G(\theta') = \begin{cases} 1, & \theta' \le 90^{\circ} \\ 0, & \theta' > 90^{\circ} \end{cases} \qquad (18) \]
Figure 10 shows an example of a PZA’ matrix obtained with the method described above for a tilted plane with β = 30° and ϕs = 0° (South oriented).
Note that the PZA′ matrix represents the zenith viewing angles of the sensor pixels on the tilted plane. In Figure 10, the zenith angles between 70° and 90° in the North direction are shaded by the plane itself, whereas the South direction is not shaded. On the tilted plane, the pixel viewing angles depend on the γ angle, derived from the Euler angles for the rotation of the coordinate axes. The heart-shaped effect obtained depends on the tilt angle of the plane: the steeper the plane, the greater the effect.

5. Results

Different sets of RGB sky radiance in the almucantar plane, DHI and DTIβ,ϕ were extracted from the SONA HDR images. These sets were used to test the reliability of our retrievals against independent and well-calibrated measurements.

5.1. Validation of Sky Radiance

The validation of the RGB sky radiance obtained from HDR imagery was carried out from February to October 2020 by comparing the sky-camera radiance results to the AERONET almucantar radiance dataset at 440, 500 and 675 nm (the closest wavelengths to the SONA λeff). To ensure that both instruments observed the same portion of the sky, the SONA almucantar radiance was extracted from the images by averaging within the FOV of the Cimel (1.5°). Note that we did not apply any spectral band shift or correction to account for the difference between the Cimel (10 nm spectral bands at 440, 500 and 675 nm) and our camera (~300 nm wide bands centered at 480, 541 and 618 nm).
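A minimal sketch of how such an almucantar extraction could look, assuming per-pixel zenith (PZA) and azimuth (PAA) matrices in degrees and averaging over pixels within the 1.5° Cimel field of view; the helper and its signature are hypothetical, and the FOV is interpreted as a full angle.

```python
import numpy as np

def almucantar_radiance(radiance, pza, paa, sza, azimuths, fov=1.5):
    """Average the camera radiance within the Cimel FOV around each
    almucantar direction (zenith angle = SZA, varying azimuth).
    All angle inputs are in degrees."""
    out = []
    for az in azimuths:
        # great-circle angular distance between each pixel and the almucantar point
        cos_d = (np.cos(np.radians(pza)) * np.cos(np.radians(sza))
                 + np.sin(np.radians(pza)) * np.sin(np.radians(sza))
                 * np.cos(np.radians(paa - az)))
        d = np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0)))
        out.append(radiance[d <= fov / 2.0].mean())
    return np.array(out)
```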
For scattering angles below 10°, we observed camera sensor saturation, different for each of the RGB channels, which increased with the AOD and with the presence of some types of clouds surrounding the sun. Furthermore, in clear sky conditions with an AOD below 0.1, we had no saturated pixels, but a glare around the sun appeared, probably due to undesirable optical effects. To avoid these issues, measurements with a scattering angle below 10° or taken with a dirty dome were rejected from the validation.
This validation was carried out for all sky conditions, which were classified as clear, partially cloudy and overcast attending to the symmetry with respect to the principal plane and the average blue-to-red ratio (BRR) of each image. To determine how symmetrical an image was, we used the mean square deviation (MSD) between the left and right sides of the image with respect to the principal plane. Partially cloudy skies did not present any symmetry, while clear and overcast skies did; thus, the BRR was used as a second criterion to distinguish between the latter two. We empirically determined that HDR images with an MSD < 30 guaranteed symmetry, and that such images corresponded to clear sky conditions when the average BRR > 2.8 as well.
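The two-step classification could be sketched as follows. The half-image mirroring, the RGB channel order and the assumption that the principal plane is aligned with the vertical image axis are illustrative simplifications, while the MSD < 30 and BRR > 2.8 thresholds come from the text.

```python
import numpy as np

def classify_sky(image_rgb, msd_threshold=30.0, brr_threshold=2.8):
    """Classify an HDR sky image: symmetry about the principal plane
    (MSD between mirrored halves), then the average blue-to-red ratio.
    Assumes channel order R, G, B along the last axis and that the image
    has been rotated so the principal plane is the vertical axis."""
    h, w, _ = image_rgb.shape
    left = image_rgb[:, : w // 2, :]
    right = image_rgb[:, w - w // 2 :, :][:, ::-1, :]   # mirrored right half
    msd = np.mean((left - right) ** 2)
    if msd >= msd_threshold:
        return "partially cloudy"                        # asymmetric scene
    brr = np.mean(image_rgb[..., 2] / np.maximum(image_rgb[..., 0], 1e-9))
    return "clear" if brr > brr_threshold else "overcast"
```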
Note that AERONET only provided almucantar radiance measurements in clear sky and partially cloudy conditions when the sun was not covered. This implied additional restrictions to our validation dataset.
We used the Relative Mean Bias Deviation (RMBD) to evaluate the differences between the sky-camera and the AERONET measurements, and the confidence level (CL) of the sky-camera measurements was determined by the percentage of cases with an RMBD of below 10% (CL10).
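This excerpt does not spell out the RMBD formula, so the sketch below assumes a common definition (mean relative deviation with respect to the reference); CL10 is then taken as the percentage of individual cases whose absolute relative deviation is below 10%.

```python
import numpy as np

def rmbd(measured, reference):
    """Relative mean bias deviation in percent, under the assumed
    definition: mean of (measured - reference) / reference."""
    measured = np.asarray(measured, float)
    reference = np.asarray(reference, float)
    return 100.0 * np.mean((measured - reference) / reference)

def cl10(measured, reference):
    """Confidence level CL10: percentage of cases whose absolute
    relative deviation from the reference is below 10%."""
    rel = np.abs((np.asarray(measured, float) - np.asarray(reference, float))
                 / np.asarray(reference, float))
    return 100.0 * np.mean(rel < 0.10)
```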
A significant agreement between AERONET and SONA was observed in all cases, with correlation coefficients of more than 0.90 in most of them (Table 4). However, this agreement was different with respect to sky conditions and spectral dependence.
In clear sky conditions, which represented 25% of the cases, a very good agreement was observed. The results indicate a high degree of correlation (R > 0.96) for all channels, with an RMBD < 10%. The performance was best for the blue channel and degraded toward the red, with RMBDs of 4%, 6% and 9% at 440, 500 and 675 nm, respectively. In addition, most of the cases showed an RMBD smaller than 10%, with high values of CL10 (91%, 85% and 91% for 440, 500 and 675 nm, respectively).
Under cloudy sky conditions, the remaining 75% of cases also showed high correlation coefficients, but with a larger RMBD compared to the clear sky conditions. A notable difference was observed between the red channel (RMBD = 28%) and the blue and green channels (both with an RMBD = 10%). In addition, low CL10 values, especially for the red channel, indicated a larger spread of the data with respect to the clear sky conditions. We suspect that at least part of these differences was due to a disparity in the base time between the AERONET and SONA measurements. The whole AERONET almucantar measurement sequence takes a few minutes (~5–15 min, depending on the SZA) and carries only one timestamp for the entire almucantar. In contrast, we extracted the complete almucantar from a single HDR image of the sky-camera, stored every minute. Note that under changing sky conditions, any given position in the sky can change from clear to cloudy, and vice versa, in just a few seconds. This could be the origin of the large differences between the results and may be especially important in partially cloudy conditions.
The results obtained for all scenarios combine the characteristics of the clear and partially cloudy sky conditions. They therefore behave as a weighted average over the proportion of cases and are closer to the results shown for partially cloudy conditions, since those represented 75% of the cases.
Our results for clear sky conditions were similar to those of [43] and demonstrated an improved performance relative to [23,26]. However, to our knowledge, this is the first time that sky radiance has been measured and validated in all atmospheric conditions using a sky-camera. Moreover, despite all the comparison issues discussed above, our results show a good agreement with those of AERONET.

5.2. Determination of Broadband Horizontal Diffuse Irradiance

We obtained the DHI from our radiometric HDR imagery dataset, from February 2020 to March 2021, following the methodology explained in Section 4.2 and applying a 5° scattering angle mask to compute the DHISONA. Hereafter in this section, "saturated pixels" refers to those not already excluded by the mask. The DHISONA was determined following two different approaches in terms of how the saturated pixels within the images were treated. In the first approach, saturated pixels were not used to compute the DHISONA. In the second, saturated pixels were assigned the maximum radiative value that the camera can measure. We therefore have two sets of DHISONA covering all kinds of sky scenarios. Both sets of sky-camera DHI results were validated against those acquired by a CMP21 pyranometer mounted on a sun tracker with a shadow ball (Figure 11). In addition, a third dataset, using only images with no saturated pixels, was included in the comparison (grey points in Figure 11).
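The two treatments of saturated pixels can be illustrated schematically; the helper below is hypothetical, with `sat_max` standing for the maximum radiative value the camera can measure.

```python
import numpy as np

def dhi_with_saturation(irradiance, saturated, sat_max, mask, Q, use_max=True):
    """Channel DHI under the two treatments described in the text:
    either drop saturated pixels (use_max=False) or replace them with
    the camera's maximum measurable value (use_max=True).
    `saturated` is a boolean matrix flagging saturated pixels."""
    E = irradiance.copy()
    if use_max:
        E[saturated] = sat_max       # second approach: clamp to the maximum
    else:
        E[saturated] = 0.0           # first approach: exclude from the sum
    return Q * np.sum(mask * E)
```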
The SONA and pyranometer DHI show a high degree of correlation in all cases, with R > 0.98 independently of the way that saturated pixels were considered to compute DHISONA (Figure 11a,b). In general terms, all fittings produced a very similar slope of around two (Table 5). This slope largely reflects the different spectral ranges covered by the instruments and their different spectral responses. A slope of this magnitude is expected, since the SONA sky-camera is a three-channel broadband sensor covering the 300–1100 nm spectral range (Figure 5), which represents approximately half of the quasi-flat spectral response of the pyranometer over the 340–2800 nm range. This was also supported by a simple test: the ratio between the pyranometer spectral response and the sum of the spectral responses of the sky-camera channels, with both convolved with the ASTM G173-03 solar irradiance model at an air mass of 1.5, was 2.06 (not shown).
No substantial differences in the fit parameters were observed with respect to the way saturated pixels were used to obtain the DHISONA, mainly because the images containing saturated pixels represent only a small fraction of the considered images. However, the best fit is obtained using the images with no saturated pixels, since the offset and RMSE were significantly reduced with respect to the other two cases (Table 5). These images correspond to a pyranometer DHI lower than 500 Wm−2 (grey points in Figure 11).
In spite of that, DHISONA presents a notable dependence on the percentage of saturated pixels, especially at higher values of DHI (>400 Wm−2) measured by the pyranometer (Figure 11a,b). In fact, an increasing underestimation of DHISONA with respect to the pyranometer values was observed as the fraction of saturated pixels increased. As expected, this is especially relevant when the saturated pixels are not used in the computations (Figure 11a,c), since a substantial part of the sky radiance may not have been considered in the retrieval. This underestimation effect is more evident for images with more than 3.5% saturated pixels, for which most of the DHI results fall outside the 95% confidence interval for the predictions (±50 Wm−2) (Figure 11c).
Conversely, the dependence on the percentage of saturated pixels was significantly reduced when the DHISONA was obtained following our second approach (Figure 11b,d). In this case, the 95% confidence interval was reduced to ±40 Wm−2. In spite of the inherent uncertainty in the radiative value assigned to the saturated pixels, accounting for them significantly reduced the underestimation of DHISONA, even for low saturated pixel percentages.
Usually, a large occurrence of saturated pixels is associated with the presence of a high aerosol load that produces a strong scattering of sun light in the circumsolar area or under partially cloudy sky conditions, in which very reflective clouds do not cover the sun. Both atmospheric conditions produced an increase in DHI that was not captured well by the SONA.
The best comparison results were obtained when the images were free of saturated pixels, even though the data used in this model did not cover all possible DHI values. This may be an issue when attempting to obtain the DHISONA in all sky conditions, since the underestimation effect of the saturated pixels appears at higher DHI values. Therefore, for operational purposes, we chose the model that assigns saturated pixels the maximum measurable value. It allows the SONA to determine the DHI more reliably, accounting for the missing spectral region and the saturated pixels, with less dependence on the sky conditions (not shown).
Figure 12 shows the DHISONA predictions obtained by applying the spectral response correction factor and the offset provided by the chosen fit model, compared to the pyranometer DHI from 28 March to 27 July 2021 (data not used in Figure 11). The comparison indicates a high degree of agreement, with a slope close to one and an offset, relative to the maximum irradiance, well below the pyranometer uncertainty (~5%). The maximum difference between the pyranometer and DHISONA is 2% considering all the sky conditions in the validation. In addition, 95% of the images (±2 standard deviations) show DHI values within ±50 Wm−2. These results suggest an optimal capability of our methodology to determine the DHI from a well-calibrated sky-camera.
Our validation results are satisfactory, since they improve upon those reported by [19], which showed lower correlations against pyranometer measurements using a sky-camera and a different methodological approach.

5.3. Validation of Broadband Diffuse on an Arbitrarily Oriented Plane

The broadband diffuse radiation on an arbitrarily oriented plane, DTIβ,ϕ, can be determined from sky-cameras. In our case, we used β = 30° and ϕ = 0° to determine DTI30,0 from SONA, since the validation was carried out against the measurements of a CM11 pyranometer set up on a 30° tilted, South-oriented plane from 4 August to 5 September 2021. As discussed in Section 5.2, the DHISONA was corrected by a spectral response range factor to match the pyranometer values, assigning saturated pixels the maximum possible measured radiative value.
The CM11 pyranometer measured the global irradiance on the selected tilted plane, GTIβ,ϕ. To carry out the comparison, we therefore determined the DTIβ,ϕ from the CM11 by subtracting the projection of the DNI onto the tilted plane (DNI·cos ω), with the DNI measured by a pyrheliometer on a solar tracker, following Equations (19) and (20).
DTI_{β,ϕ} = GTI_{β,ϕ} − DNI·cos ω        (19)
cos ω = cos θ_s·cos β + sin θ_s·sin β·cos(π − φ_s − ϕ)        (20)
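As a sketch, Equations (19) and (20) translate directly into code; the azimuth term (π − φs − ϕ) is transcribed as written, the azimuth convention follows the paper (orientation measured from South, positive to the West), and all angles are taken in degrees.

```python
import numpy as np

def dti_from_gti(gti, dni, sza, saa, beta, phi):
    """Diffuse tilted irradiance: subtract the projected direct beam
    (Eq. (20)) from the global tilted measurement (Eq. (19)).
    sza/saa are the solar zenith/azimuth angles; beta/phi the plane
    tilt and orientation. All angles in degrees."""
    sza_r, beta_r = np.radians(sza), np.radians(beta)
    cos_w = (np.cos(sza_r) * np.cos(beta_r)
             + np.sin(sza_r) * np.sin(beta_r)
             * np.cos(np.radians(180.0 - saa - phi)))   # Eq. (20)
    return gti - dni * cos_w                            # Eq. (19)
```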
The results of this comparison are shown in Figure 13. The SONA retrieval of DTIβ,ϕ showed an extremely high correlation with the independent pyranometer measurements (R = 0.99). The slope deviated 5% from unity, indicating a slight overestimation of the SONA DTIβ,ϕ with respect to the pyranometer. Moreover, 95% of the data used in the comparison fall within a ±50 Wm−2 interval around the fit line. No particular differences were observed with respect to the percentage of saturated pixels.
After the necessary processing and corrections, the sky-camera may be a useful instrument for measuring the solar radiation in different geometries.

6. Conclusions

In this work, we have presented a methodological approach that is able to provide accurate and calibrated measurements of sky radiance and broadband solar irradiance using HDR images from a sky-camera.
Firstly, we performed a very detailed characterization of the SONA, in terms of electronics and image processing, and a significant improvement of the dynamic range of the images, conferring on them an extremely linear behavior under different illumination scenarios. In addition, we developed a methodology to automatically perform the geometric calibration of the camera. In this way, an optical characterization of the camera was carried out, obtaining the center of the image, the focal length of the fisheye lens and, consequently, the observation geometry (in polar coordinates of the sky dome). Then, the HDR images were calibrated using a radiative transfer model to obtain the sky radiance of each pixel, with an uncertainty smaller than 10%. Finally, we constructed a calibrated HDR imagery database with 1 min temporal resolution, which has been available since February 2020 and is updated once a day.
The resulting sky radiance from these images was compared to the AERONET spectral radiances in the almucantar plane for different sun geometries and sky conditions. The results of this validation presented a very good agreement between our estimations and the AERONET measurements in all sky conditions. This agreement is especially relevant in clear sky cases, showing correlation coefficients higher than 0.96 and an RMBD < 10% for all spectral channels.
In addition, we developed methods to determine the shortwave diffuse radiation over arbitrary geometries and planes, starting from the calibrated sky radiance images obtained from our sky-camera. First, we obtained the diffuse radiation over a horizontal plane, DHISONA, which showed a very good correlation with independent pyranometer measurements in all sky conditions. The results of this comparison were used to correct the DHISONA retrievals for two factors: the limited spectral range of the SONA sky-camera, which does not cover the whole solar spectrum, and the treatment of the saturated pixels in the retrieval. When these corrections were applied, the difference between the pyranometer and our DHISONA retrieval was 2% and, for 95% of the images (±2 standard deviations), the DHI values were within ±50 Wm−2.
Finally, we tested our methodology to estimate the diffuse irradiance over an arbitrarily tilted and oriented plane from SONA by comparing it to measurements collected by a pyranometer tilted 30° and South-oriented. Our results showed an extremely high correlation between both estimates (R = 0.99) in all sky conditions, with a slight overestimation of 5% of the SONA DTIβ,ϕ with respect to the pyranometer.
All results suggest an optimal capability of our methodologies to determine the sky radiance and the diffuse irradiance over arbitrary geometries from a well-calibrated sky-camera. In turn, they represent a significant contribution to the improvement of the capabilities of sky-cameras to provide accurate and well-calibrated radiometric images of the whole sky, making them suitable for solar and atmospheric research. In addition, the algorithms and retrievals developed in this work may be of great interest for both solar energy producers and energy network operators, allowing them to monitor and estimate the solar radiation incident on an arbitrary plane with a single, affordable instrument in a fixed setup and in all sky conditions.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs13245157/s1.

Author Contributions

Conceptualization, P.C.V. and J.L.G.-A.; Methodology, P.C.V., J.L.G.-A., C.P.-F. and F.S.; Software, data processing and validation, P.C.V. and C.P.-F.; Formal analysis, P.C.V. and J.L.G.-A.; Original draft preparation, P.C.V., J.L.G.-A. and C.P.-F.; Review and editing, P.C.V., J.L.G.-A., C.P.-F., F.S. and M.P.U.; Project supervision, J.L.G.-A. and M.P.U.; Funding acquisition, J.L.G.-A. and M.P.U. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financed jointly by the Spanish Ministry of Economy and Competitiveness (MINECO) and the European Regional Development Fund (FEDER) through RTI2018-096548-B-I00 and by the Valencia Autonomous Government through project AICO/2021/341. The participation of F. Scarlatti was supported by a PhD grant, PRE2019-089986, funded by the Spanish Ministry of Economy and Competitiveness.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. IEA. World Energy Outlook 2017; International Energy Agency and Organisation for Economic Co-operation and Development OECD: Paris, France, 2017; ISBN 978-92-64-28230-8. [Google Scholar]
  2. Tsiropoulos, I.; Nijs, W.; Tarvydas, D.; Ruiz Castello, P. Towards Net-Zero Emissions in the EU Energy System by 2050; EUR 29981 EN; Publications Office of the European Union: Luxembourg, 2020; ISBN 978-92-76-13096-3. [Google Scholar] [CrossRef]
  3. Crook, J.A.; Jones, L.A.; Forster, P.M.; Crook, R. Climate change impacts on future photovoltaic and concentrated solar power energy output. Energy Environ. Sci. 2011, 4, 3101–3109. [Google Scholar] [CrossRef]
  4. Gaetani, M.; Huld, T.; Vignati, E.; Monforti-Ferrario, F.; Dosio, A.; Raes, F. The near future availability of photovoltaic energy in Europe and Africa in climate-aerosol modeling experiments. Renew. Sustain. Energy Rev. 2014, 38, 706–716. [Google Scholar] [CrossRef]
  5. Nieto, J.; Carpintero, O.; Lobejón, L.F.; Miguel, L.J. An ecological macroeconomics model: The energy transition in the EU. Energy Policy 2020, 145, 111726. [Google Scholar] [CrossRef]
  6. Alonso-Montesinos, J.; Batlles, F.; Portillo, C. Solar irradiance forecasting at one-minute intervals for different sky conditions using sky camera images. Energy Convers. Manag. 2015, 105, 1166–1177. [Google Scholar] [CrossRef]
  7. Ohmura, A.; Gilgen, H.; Hegner, H.; Müller, G.; Wild, M.; Dutton, E.G.; Forgan, B.; Fröhlich, C.; Philipona, R.; Heimo, A.; et al. Baseline Surface Radiation Network (BSRN/WCRP): New Precision Radiometry for Climate Research. Bull. Am. Meteorol. Soc. 1998, 79, 2115–2136. [Google Scholar] [CrossRef] [Green Version]
  8. Nowak, D.; Vuilleumier, L.; Long, C.N.; Ohmura, A. Solar irradiance computations compared with observations at the Baseline Surface Radiation Network Payerne site. J. Geophys. Res. Space Phys. 2008, 113. [Google Scholar] [CrossRef] [Green Version]
  9. Long, C.N.; Ackerman, T.P. Identification of clear skies from broadband pyranometer measurements and calculation of downwelling shortwave cloud effects. J. Geophys. Res. Space Phys. 2000, 105, 15609–15626. [Google Scholar] [CrossRef]
  10. Barnard, J.C.; Long, C.N. A Simple Empirical Equation to Calculate Cloud Optical Thickness Using Shortwave Broadband Measurements. J. Appl. Meteorol. 2004, 43, 1057–1066. [Google Scholar] [CrossRef] [Green Version]
  11. Long, C.N.; Ackerman, T.P.; Gaustad, K.L.; Cole, J.N.S. Estimation of fractional sky cover from broadband shortwave radiometer measurements. J. Geophys. Res. Space Phys. 2006, 111. [Google Scholar] [CrossRef] [Green Version]
  12. Ridley, B.; Boland, J.; Lauret, P. Modelling of diffuse solar fraction with multiple predictors. Renew. Energy 2010, 35, 478–483. [Google Scholar] [CrossRef]
  13. Aler, R.; Galvan, I.M.; Ruiz-Arias, J.A.; Gueymard, C.A. Improving the separation of direct and diffuse solar radiation components using machine learning by gradient boosting. Sol. Energy 2017, 150, 558–569. [Google Scholar] [CrossRef]
  14. Perez, R.; Lorenz, E.; Pelland, S.; Beauharnois, M.; Van Knowe, G.; Hemker, K.; Heinemann, D.; Remund, J.; Müller, S.C.; Traunmüller, W.; et al. Comparison of numerical weather prediction solar irradiance forecasts in the US, Canada and Europe. Sol. Energy 2013, 94, 305–326. [Google Scholar] [CrossRef]
  15. Camargo, L.R.; Dorner, W. Comparison of satellite imagery based data, reanalysis data and statistical methods for mapping global solar radiation in the Lerma Valley (Salta, Argentina). Renew. Energy 2016, 99, 57–68. [Google Scholar] [CrossRef]
  16. Kim, C.K.; Kim, H.-G.; Kang, Y.-H.; Yun, C.-Y.; Lee, Y.G. Intercomparison of Satellite-Derived Solar Irradiance from the GEO-KOMSAT-2A and HIMAWARI-8/9 Satellites by the Evaluation with Ground Observations. Remote Sens. 2020, 12, 2149. [Google Scholar] [CrossRef]
  17. Cristóbal, J.; Anderson, M.C. Validation of a Meteosat Second Generation solar radiation dataset over the northeastern Iberian Peninsula. Hydrol. Earth Syst. Sci. 2013, 17, 163–175. [Google Scholar] [CrossRef] [Green Version]
  18. Greuell, W.; Meirink, J.F.; Wang, P. Retrieval and validation of global, direct, and diffuse irradiance derived from SEVIRI satellite observations. J. Geophys. Res. Atmos. 2013, 118, 2340–2361. [Google Scholar] [CrossRef]
  19. Alonso-Montesinos, J.; Batlles, F. The use of a sky camera for solar radiation estimation based on digital image processing. Energy 2015, 90, 377–386. [Google Scholar] [CrossRef]
  20. Mejia, F.A.; Kurtz, B.; Murray, K.; Hinkelman, L.M.; Sengupta, M.; Xie, Y.; Kleissl, J. Coupling sky images with radiative transfer models: A new method to estimate cloud optical depth. Atmos. Meas. Tech. 2016, 9, 4151–4165. [Google Scholar] [CrossRef] [Green Version]
  21. Lothon, M.; Barnéoud, P.; Gabella, O.; Lohou, F.; Derrien, S.; Rondi, S.; Chiriaco, M.; Bastin, S.; Dupont, J.C.; Haeffelin, M.; et al. ELIFAN, an algorithm for the estimation of cloud cover from sky imagers. Atmos. Meas. Tech. 2019, 12, 5519–5534. [Google Scholar] [CrossRef] [Green Version]
  22. Xie, W.; Liu, D.; Yang, M.; Chen, S.; Wang, B.; Wang, Z.; Xia, Y.; Liu, Y.; Wang, Y.; Zhang, C. SegCloud: A novel cloud image segmentation model using a deep convolutional neural network for ground-based all-sky-view camera observation. Atmos. Meas. Tech. 2020, 13, 1953–1961. [Google Scholar] [CrossRef] [Green Version]
  23. Román, R.; Antón, M.; Cazorla, A.; de Miguel, A.; Olmo, F.J.; Bilbao, J.; Alados-Arboledas, L. Calibration of an all-sky camera for obtaining sky radiance at three wavelengths. Atmos. Meas. Tech. 2012, 5, 2013–2024. [Google Scholar] [CrossRef] [Green Version]
  24. Cazorla, A.; Olmo, F.; Alados-Arboledas, L.; Reyes, F.J.O. Using a Sky Imager for aerosol characterization. Atmos. Environ. 2008, 42, 2739–2745. [Google Scholar] [CrossRef]
  25. Ghonima, M.S.; Urquhart, B.; Chow, C.W.; Shields, J.E.; Cazorla, A.; Kleissl, J. A method for cloud detection and opacity classification based on ground based sky imagery. Atmos. Meas. Tech. 2012, 5, 2881–2892. [Google Scholar] [CrossRef] [Green Version]
  26. Román, R.; Torres, B.; Fuertes, S.; Cachorro, V.E.; Dubovik, O.; Toledano, C.; Cazorla, A.; Barreto, A.; Bosch, J.L.; Lapyonok, T.; et al. Remote sensing of lunar aureole with a sky camera: Adding information in the nocturnal retrieval of aerosol properties with GRASP code. Remote Sens. Environ. 2017, 196, 238–252. [Google Scholar] [CrossRef] [Green Version]
  27. Román, R.; Antuña-Sánchez, J.C.; Cachorro, V.E.; Toledano, C.; Torres, B.; Mateos, D.; Fuertes, D.; López, C.; González, R.; Lapionok, T.; et al. Retrieval of aerosol properties using relative radiance measurements from an all-sky camera. Atmos. Meas. Tech. Discuss. 2021. [Google Scholar] [CrossRef]
  28. Chu, Y.; Pedro, H.; Li, M.; Coimbra, C.F. Real-time forecasting of solar irradiance ramps with smart image processing. Sol. Energy 2015, 114, 91–104. [Google Scholar] [CrossRef]
  29. Kurtz, B.; Kleissl, J. Measuring diffuse, direct, and global irradiance using a sky imager. Sol. Energy 2017, 141, 311–322. [Google Scholar] [CrossRef]
  30. Scolari, E.; Sossan, F.; Haure-Touzé, M.; Paolone, M. Local estimation of the global horizontal irradiance using an all-sky camera. Sol. Energy 2018, 173, 1225–1235. [Google Scholar] [CrossRef]
  31. Estellés, V.; Martínez-Lozano, J.A.; Utrillas, M.P.; Campanelli, M. Columnar aerosol properties in Valencia (Spain) by ground-based Sun photometry. J. Geophys. Res. 2007, 112, D11201. [Google Scholar] [CrossRef] [Green Version]
  32. Segura, S.; Estellés, V.; Esteve, A.; Marcos, C.; Utrillas, M.; Martínez-Lozano, J. Multiyear in-situ measurements of atmospheric aerosol absorption properties at an urban coastal site in western Mediterranean. Atmos. Environ. 2016, 129, 18–26. [Google Scholar] [CrossRef]
  33. Marcos, C.R.; Gómez-Amo, J.L.; Péris, C.; Pedrós, R.; Utrillas, M.P.; Martínez-Lozano, J.A. Analysis of four years of ceilometer-derived aerosol backscatter profiles in a coastal site of the western Mediterranean. Atmos. Res. 2018, 213, 331–345. [Google Scholar] [CrossRef]
  34. Gómez-Amo, J.L.; Estellés, V.; Marcos, C.; Segura, S.; Esteve, A.R.; Pedrós, R.; Utrillas, M.P.; Martínez-Lozano, J.A. Impact of dust and smoke mixing on column-integrated aerosol properties from observations during a severe wildfire episode over Valencia (Spain). Sci. Total Environ. 2017, 599–600, 2121–2134. [Google Scholar] [CrossRef]
  35. Gómez-Amo, J.; Freile-Aranda, M.; Camarasa, J.; Estellés, V.; Utrillas, M.; Martínez-Lozano, J. Empirical estimates of the radiative impact of an unusually extreme dust and wildfire episode on the performance of a photovoltaic plant in Western Mediterranean. Appl. Energy 2018, 235, 1226–1234. [Google Scholar] [CrossRef]
  36. Holben, B.N.; Eck, T.F.; Slutsker, I.; Tanré, D.; Buis, J.P.; Setzer, A.; Vermote, E.; Reagan, J.A.; Kaufman, Y.J.; Nakajima, T.; et al. AERONET—A Federated Instrument Network and Data Archive for Aerosol Characterization. Remote Sens. Environ. 1998, 66, 1–16. [Google Scholar] [CrossRef]
  37. Estellés, V.; Campanelli, M.; Smyth, T.J.; Utrillas, M.P.; Martínez-Lozano, J.A. AERONET and ESR sun direct products comparison performed on Cimel CE318 and Prede POM01 solar radiometers. Atmos. Chem. Phys. 2012, 12, 11619–11630. [Google Scholar] [CrossRef] [Green Version]
  38. Dubovik, O.; Holben, B.; Eck, T.F.; Smirnov, A.; Kaufman, Y.J.; King, M.D.; Tanré, D.; Slutsker, I. Variability of absorption and optical properties of key aerosol types observed in worldwide locations. J. Atmos. Sci. 2002, 59, 590–608. [Google Scholar] [CrossRef]
  39. Ma, C.; Arias, E.F.; Eubanks, T.M.; Fey, A.L.; Gontier, A.-M.; Jacobs, C.S.; Sovers, O.J.; Archinal, B.A.; Charlot, P. The International Celestial Reference Frame as Realized by Very Long Baseline Interferometry. Astron. J. 1998, 116, 516–546. [Google Scholar] [CrossRef]
  40. Cox, W. The theory of perspective. Br. J. Photogr. 1969, 116, 628–659. [Google Scholar]
  41. Debevec, P.E.; Malik, J. Recovering high dynamic range radiance maps from photographs. SIGGRAPH 2003, 1, 374–380. [Google Scholar] [CrossRef]
  42. Mitsunaga, T.; Nayar, S. Radiometric self calibration. In Proceedings of the 1999 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Cat. No PR00149), Fort Collins, CO, USA, 23–25 June 1999; Volume 1, pp. 374–380. [Google Scholar] [CrossRef]
  43. Antuña-Sánchez, J.C.; Román, R.; Cachorro, V.E.; Toledano, C.; López, C.; González, R.; Mateos, D.; Calle, A.; de Frutos, Á.M. Relative sky-radiance from multi-exposure all-sky camera images. Atmos. Meas. Tech. 2021, 14, 2201–2217. [Google Scholar] [CrossRef]
  44. Emde, C.; Buras-Schnell, R.; Kylling, A.; Mayer, B.; Gasteiger, J.; Hamann, U.; Kylling, J.; Richter, B.; Pause, C.; Dowling, T.; et al. The libRadtran software package for radiative transfer calculations (version 2.0.1). Geosci. Model Dev. 2016, 9, 1647–1672. [Google Scholar] [CrossRef] [Green Version]
  45. Kholopov, G.K. Calculation of the effective wavelength of a measuring system. J. Appl. Spectrosc. 1975, 23, 1146–1147. [Google Scholar] [CrossRef]
  46. Henyey, L.C.; Greenstein, J.L. Diffuse radiation in the Galaxy. Astrophys. J. 1941, 93, 70–83. [Google Scholar] [CrossRef]
  47. Vermeulen, A. Caractérisation des Aérosols á Partir de Mesures Optiques Passives au Sol: Apport des Luminances Totale et Polarisée Mesurées dans le Plan Principal. Ph.D. Thesis, Université de Lille, Lille, France, 1996. [Google Scholar]
  48. Torres, B.; Toledano, C.; Berjón, A.; Fuertes, D.; Molina, V.; González, R.; Canini, M.; Cachorro, V.E.; Goloub, P.; Podvin, T.; et al. Measurements on pointing error and field of view of Cimel-318 Sun photometers in the scope of AERONET. Atmos. Meas. Tech. 2013, 6, 2207–2220. [Google Scholar] [CrossRef] [Green Version]
  49. Blanc, P.; Espinar, B.; Geuder, N.; Gueymard, C.; Meyer, R.; Pitz-Paal, R.; Reinhardt, B.; Renné, D.; Sengupta, M.; Wald, L.; et al. Direct normal irradiance related definitions and applications: The circumsolar issue. Sol. Energy 2014, 110, 561–577. [Google Scholar] [CrossRef]
  50. Major, G. Estimation of the error caused by the circumsolar radiation when measuring global radiation as a sum of direct and diffuse radiation. Sol. Energy 1992, 48, 249–252. [Google Scholar] [CrossRef]
Figure 1. Sky dome projection on our SONA sky-camera: (a) Pixel Zenith Angle; (b) Pixel Azimuth Angle; (c) Pixel Solid Angle.
Figure 2. Example of the determination of the sensor response function for the blue channel: (a) fragments of the characteristic curve at nine exposure times; (b) the fitted characteristic curve.
Figure 3. Comparison of weighting functions. Original triangular function [41] and our own function w(z), with b = 8 and a = 12 and 24.
Figure 4. Sky-camera response function obtained for the three channels (RGB) using the weighting function in Equation (2).
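Once the response function of Figure 4 is known, an HDR radiance map is fused from the multi-exposure frames by the weighted average of Debevec and Malik (the source of the triangular weight in Figure 3, ref. [41]). The sketch below uses that triangular weight and a hypothetical log-linear response as illustrative stand-ins; the camera's own smooth weight w(z) is not reproduced here.

```python
import numpy as np

# Weighted-average HDR fusion (Debevec & Malik scheme, ref. [41]):
#   ln E_i = sum_j w(z_ij) * (g(z_ij) - ln dt_j) / sum_j w(z_ij)
Z_MIN, Z_MAX = 0, 255

def w_triangular(z):
    """Original triangular weight: favours mid-range counts, zeroes extremes."""
    z = np.asarray(z, dtype=float)
    return np.where(z <= 0.5 * (Z_MIN + Z_MAX), z - Z_MIN, Z_MAX - z)

def fuse_hdr(frames, exposures, g):
    """Fuse 8-bit frames (H x W arrays) taken at the given exposure times (s)."""
    num = np.zeros(frames[0].shape)
    den = np.zeros(frames[0].shape)
    for z, dt in zip(frames, exposures):
        wz = w_triangular(z)
        num += wz * (g(z) - np.log(dt))
        den += wz
    return np.exp(num / np.maximum(den, 1e-9))   # relative radiance map

# Hypothetical linear sensor, so g(z) = ln(z) up to saturation:
g = lambda z: np.log(np.clip(np.asarray(z, dtype=float), 1.0, None))
frames = [np.array([[10.0, 200.0]]), np.array([[80.0, 255.0]])]
E = fuse_hdr(frames, [0.01, 0.08], g)   # consistent exposures give E = z / dt
```

Note how the saturated count (255) receives zero weight, so the fused radiance there comes entirely from the shorter exposure.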
Figure 5. Normalized spectral response of the BFLY-PGE-20E4C-CS Point Grey camera for the RGB channels. (From the Point Grey Imaging Performance Specification v7.0).
Figure 6. Probability density function of the calibration factor for each channel at its effective wavelength: 480 nm, 541 nm and 615 nm for the blue, green and red channels, respectively.
Figure 7. Example of the representation of the sun's almucantar (red dots) and principal plane (green dots) in an image on 27 June 2021 at 07:30. Isolines of the scattering angle are also shown in 10° steps.
Figure 8. (a) Mask used to obtain the DHI. (b) Red calibrated irradiance in µW cm−2. (c) Green calibrated irradiance in µW cm−2. (d) Blue calibrated irradiance in µW cm−2. From a calibrated HDR sky-camera image on 27 June 2021 at 07:30 under clear-sky conditions.
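The DHI of Figure 8 is the discrete integral of the calibrated radiance over the unmasked sky dome. A minimal sketch, with the isotropic-sky sanity check that unit radiance must integrate to π:

```python
import numpy as np

# DHI as the discrete integral of sky radiance over the dome:
#   DHI = sum_i L_i * cos(PZA_i) * dOmega_i
# with L the per-pixel radiance, PZA the pixel zenith angle, dOmega the pixel
# solid angle and `mask` True for valid sky pixels (sun region excluded).
def dhi_from_image(radiance, pza, omega, mask):
    return float(np.sum(radiance[mask] * np.cos(pza[mask]) * omega[mask]))

# Sanity check with an isotropic sky of unit radiance, for which the exact
# result is DHI = pi: discretize the hemisphere into zenith-angle rings.
n = 2000
theta = (np.arange(n) + 0.5) * (np.pi / 2) / n       # ring zenith angles (rad)
omega = 2 * np.pi * np.sin(theta) * (np.pi / 2) / n  # ring solid angles (sr)
dhi = dhi_from_image(np.ones(n), theta, omega, np.ones(n, bool))  # ~ pi
```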
Figure 9. Sketch of an arbitrarily tilted and oriented plane and the transposition of the coordinate reference system.
Figure 10. Example of a PZA’ matrix for a tilted plane of 30° and oriented to geographical South with an azimuthal misalignment of 4.58°.
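The PZA' matrix of Figure 10 follows from the standard incidence-angle relation for a tilted surface. A sketch with defaults reproducing the Figure 10 case (30° tilt, geographic South taken as 180° from North, plus the 4.58° azimuthal misalignment):

```python
import numpy as np

# Zenith angle PZA' of each pixel direction relative to a plane tilted by
# `beta` and oriented to azimuth `gamma` (degrees):
#   cos(PZA') = cos(PZA)cos(beta) + sin(PZA)sin(beta)cos(PAA - gamma)
def tilted_pza(pza_deg, paa_deg, beta=30.0, gamma=180.0 + 4.58):
    t = np.radians(np.asarray(pza_deg, dtype=float))
    p = np.radians(np.asarray(paa_deg, dtype=float))
    b, g = np.radians(beta), np.radians(gamma)
    cos_tp = np.cos(t) * np.cos(b) + np.sin(t) * np.sin(b) * np.cos(p - g)
    return np.degrees(np.arccos(np.clip(cos_tp, -1.0, 1.0)))

# Pixels with PZA' > 90 deg lie behind the tilted plane and are excluded
# when integrating the diffuse irradiance on that surface.
```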
Figure 11. Comparison of pyranometer DHI vs. SONA DHI from 12 February 2020 to 26 March 2021: (a) scatter plot, not considering saturated pixels; (b) scatter plot including saturated pixels; (c) residuals, not considering saturated pixels; and (d) residuals including saturated pixels. The color bar indicates the percentage of saturated pixels in each image, while grey marks images with no saturated pixels. f(x) is the fit line for all images (red) and g(x) is the fit line using only images without saturated pixels (green). Two red dashed lines mark the 95% confidence interval of the residuals. The residual is defined as pyranometer DHI minus SONA DHI.
Figure 12. Comparison of the pyranometer DHI vs. SONA DHI from 28 March to 27 July 2021. SONA DHI was computed by applying the spectral response correction obtained in the comparison against the pyranometer, treating the saturated pixels as the maximum radiance. The color bar indicates the percentage of saturated pixels in each image, while grey marks images with no saturated pixels. f(x) is the fit line for all images (red) and g(x) is the fit line using only images without saturated pixels (green). Two red dashed lines mark the 95% confidence interval of the residuals.
Figure 13. Comparison of the pyranometer DHI on a 30° tilted plane oriented to South and the DHI retrieval from 4 to 5 September 2021. The samples are shaded according to their percentage of saturated pixels. f(x) is the fit line (red). Two red dashed lines mark the 95% confidence interval of the residuals.
Table 1. Regression fit results for different fisheye projections, where r represents the radius from the pixel to the optical center, f is the focal length and θ is the PZA.
Fisheye Projection | Equation | f (µm) | Centre X ± 2 | Centre Y ± 2 | R²
Equidistant | r = f·θ | 0.84 | 577 | 584 | 1.0
Orthographic | r = 2·f·sin(θ/2) | 0.01 | 594 | 572 | 0.8
Stereographic | r = 2·f·tan(θ/2) | 0.02 | 576 | 589 | 0.9
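Each projection model in Table 1 relates the radial pixel distance r to the pixel zenith angle θ through a single scale, so it reduces to a linear least-squares fit once matched (θ, r) pairs are available from the geometric calibration. A sketch on synthetic data (illustrative values only, not the paper's calibration points):

```python
import numpy as np

# The three fisheye models of Table 1, r = f * phi(theta), theta in radians.
models = {
    "equidistant":   lambda th: th,                    # r = f * theta
    "orthographic":  lambda th: 2.0 * np.sin(th / 2),  # r = 2 f sin(theta/2)
    "stereographic": lambda th: 2.0 * np.tan(th / 2),  # r = 2 f tan(theta/2)
}

def fit_focal(theta, r, model):
    """Closed-form least-squares scale f minimising |r - f * phi(theta)|^2."""
    phi = models[model](theta)
    return float(np.dot(phi, r) / np.dot(phi, phi))

# A synthetic equidistant camera with scale 360 px/rad:
theta = np.linspace(0.05, np.pi / 2, 50)
r = 360.0 * theta
f_eq = fit_focal(theta, r, "equidistant")   # recovers the true scale
```

Fitting the wrong model to the same data shifts the scale and degrades the residuals, which is how the table discriminates between projections.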
Table 2. Effective wavelengths calculated for each channel of our sky-camera. The ± 1 standard deviation is assumed as the uncertainty.
Channel | λeff ± 1σ (nm)
Blue | 480 ± 6
Green | 541 ± 5
Red | 615 ± 6
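A common definition of a channel's effective wavelength is the response-weighted mean wavelength. Whether an additional reference spectrum enters the weighting is our assumption (a flat spectrum is used when it is omitted); the Gaussian response below is a toy stand-in for the curves of Figure 5.

```python
import numpy as np

# Effective wavelength as the mean wavelength weighted by the channel
# spectral response S (optionally also by a reference spectrum E).
def effective_wavelength(wl, S, E=None):
    """wl in nm on a uniform grid; S (and optional E) on the same grid."""
    w = S if E is None else S * E
    return float(np.sum(wl * w) / np.sum(w))

# Toy Gaussian response centred at the blue channel's 480 nm (Table 2):
wl = np.linspace(400.0, 560.0, 801)
S = np.exp(-0.5 * ((wl - 480.0) / 25.0) ** 2)
lam_eff = effective_wavelength(wl, S)   # ~480 nm for a symmetric response
```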
Table 3. Calibration coefficients (Kλ) of each channel of the sky-camera and its absolute and relative uncertainty.
λ (nm) | Kλ ± σλ (µW cm−2 nm−1) | Relative Uncertainty (%)
480 | 0.26 ± 0.02 | 7
541 | 0.34 ± 0.02 | 7
615 | 0.50 ± 0.05 | 10
Table 4. Comparison between AERONET almucantar and SONA estimated radiances from 12 February to 31 October 2020. N represents the number of samples for each scenario and R is the correlation factor.
λ (nm) | RMBD (%) | CL10 (%) | N (#) | R
Clear sky
675 | 9 | 91 | 12,680 | 0.96
500 | 6 | 85 | 8720 | 0.98
440 | 4 | 91 | 12,320 | 0.98
Partially cloudy sky
675 | 28 | 42 | 36,279 | 0.81
500 | 10 | 75 | 23,116 | 0.92
440 | 10 | 76 | 35,513 | 0.90
All scenarios
675 | 23 | 51 | 48,959 | 0.82
500 | 8 | 79 | 31,836 | 0.93
440 | 8 | 81 | 47,833 | 0.91
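The Table 4 statistics can be sketched as follows, under our reading that RMBD is the relative mean bias difference and CL10 the percentage of samples whose absolute relative difference from the reference stays below 10%; neither acronym is expanded in this excerpt, so treat both interpretations as assumptions.

```python
import numpy as np

# Assumed definitions of the Table 4 comparison statistics.
def rmbd(reference, estimate):
    """Relative mean bias difference, in %."""
    ref = np.asarray(reference, dtype=float)
    est = np.asarray(estimate, dtype=float)
    return float(100.0 * np.mean(est - ref) / np.mean(ref))

def cl10(reference, estimate, threshold=10.0):
    """Percentage of samples within +-threshold % of the reference."""
    ref = np.asarray(reference, dtype=float)
    est = np.asarray(estimate, dtype=float)
    rel = 100.0 * np.abs(est - ref) / ref
    return float(100.0 * np.mean(rel < threshold))
```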
Table 5. Linear fit parameters obtained from the validation of the broadband diffuse irradiance.
Saturated Pixels for DHI Calculation | Sky Conditions | Slope | Offset | R | RMSE | N
Not considered | All | 2.10 | −20.60 | 0.98 | 21.40 | 287,398
Considered as max irradiance | All | 2.04 | −19.61 | 0.98 | 19.89 | 287,398
Not present | All | 1.97 | −7.28 | 1.00 | 6.84 | 86,108
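The slopes near 2 in Table 5 are consistent with the camera's RGB channels sampling only part of the broadband spectrum, so the fitted line effectively acts as a spectral-response correction. A sketch of applying the saturation-free fit (slope 1.97, offset −7.28) as such a correction; this is our reading of the validation, not a procedure prescribed by the paper:

```python
import numpy as np

# Hypothetical application of the Table 5 saturation-free linear fit to map
# camera-derived DHI onto pyranometer-equivalent broadband DHI.
SLOPE, OFFSET = 1.97, -7.28

def broadband_dhi(sona_dhi):
    """Map camera-derived DHI to pyranometer-equivalent broadband DHI."""
    return SLOPE * np.asarray(sona_dhi, dtype=float) + OFFSET
```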
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
