Article

Comparative Study on Shading Database Construction for Urban Roads Using 3D Models and Fisheye Images for Efficient Operation of Solar-Powered Electric Vehicles

1 Energy Resources Institute, Pukyong National University, Busan 48513, Korea
2 Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
* Author to whom correspondence should be addressed.
Energies 2022, 15(21), 8228; https://doi.org/10.3390/en15218228
Submission received: 11 October 2022 / Revised: 31 October 2022 / Accepted: 2 November 2022 / Published: 4 November 2022

Abstract
Accounting for shadows on urban roads is a complex task in the operation of solar-powered electric vehicles. There have been few opportunities to compare the methods and tools for constructing an effective shading database for urban roads. This study quantitatively investigated and compared shading matrices generated from 3D models and fisheye images. Skymaps were formed considering the geometry of nearby shading obstructions. Sun-path diagrams tracking the position of the sun by time and season were overlaid on the skymaps, and month-by-hour shading matrices were calculated. The mean squared error (MSE) was used to quantify the differences between the shading matrices. The survey points were divided into cases A, B, and C according to the presence of buildings and trees around them. Under case A (trees), case B (buildings and trees), and case C (buildings), the average MSEs between the matrices were 24.5%, 23.9%, and 2.1%, respectively. Shading matrices generated from either 3D models or fisheye images accurately captured the shading effects of buildings, whereas the shading effects of trees were analyzed more accurately using fisheye images. The findings of this study provide a background for constructing shading databases of urban road environments for the optimal operation of solar-powered electric vehicles.

1. Introduction

The Paris Agreement adopted at the 21st Conference of the Parties (COP21) in 2015 established goals to mitigate global warming [1]. In this agreement, countries defined nationally determined contributions (NDCs), which are greenhouse gas (GHG) reduction targets that reflect their specific circumstances. Moreover, they agreed to implement global stocktaking every five years [2]. To achieve the GHG reduction goals of the 2030 NDCs and net-zero emissions by 2050, countries have been defining government-level policies and legal regulations, and they have been implementing detailed action plans for social and industrial sectors. Companies worldwide are making efforts to reduce GHG emissions, such as limiting fossil fuel use, constructing new and renewable energy facilities, and adopting carbon capture, utilization, and storage (CCUS) technology.
To reach net-zero emissions, the automobile industry is actively developing and distributing eco-friendly electric vehicles (EVs) as an alternative to internal combustion engine (ICE) vehicles. For example, the German automaker Volkswagen announced a “New Auto” strategy, which aims to launch 70 new EV models by 2028 and convert 50% of its global sales to battery-electric vehicles (BEVs) by 2040 [3]. Meanwhile, Volvo (Swedish automaker) announced its goal to achieve net-zero GHG emissions by 2040 by converting half of its global sales to fully electric cars and the other half to hybrid electric vehicles (HEVs) by 2025 [4]. Motivated by the continuous launches of EV models and subsidy policies, the global EV market is experiencing rapid growth. Despite the relatively low global vehicle sales owing to the COVID-19 pandemic, approximately 6.3 million EVs were sold in 2021, which represented a 102% increase from 2020 [5]. The International Energy Agency (IEA) estimated that EV sales in 2030 will reach approximately 20 million, and the average annual growth rate will exceed 30% [6].
Automakers and startups worldwide have launched solar-powered electric vehicles (SPEVs), which are EVs equipped with solar panels [7]. SPEVs convert incident solar energy into electrical energy, which charges the vehicle battery [8]. Electricity can be charged while driving or parked, and the charged electricity is used to increase the driving range or operate the air conditioning system [9]. Many factors have led to the rapid rise of SPEVs, including the long and frequent charging required by EV batteries, improved solar cell efficiency, declining global solar cell prices, and the growth of the global EV market. The Ford Motor Company (Dearborn, MI, USA) was the first to reveal the C-MAX Solar Energi Concept at the International Consumer Electronics Show (CES) in 2014 [10], and, since then, companies have continuously announced diverse SPEV concepts and models. Representative SPEVs include Lightyear One (Lightyear) [11], Solar Roof (Hyundai Motor Group) [12], Luna (Aptera Motors) [13], Sion (Sono Motors) [14], Vision EQXX (Mercedes-Benz) [15], and bZ4X (Toyota Motor Corporation) [16]. Many researchers have evaluated the feasibility of SPEVs [17,18,19,20]. Meanwhile, several researchers have argued that the introduction of SPEVs is premature because PV power generation is sensitive to regional climate conditions [21] and because static and low-concentration concentrator PVs (CPVs) installed on a horizontal surface suffer reduced efficiency [22]. Therefore, continuous development of and research on SPEVs are required.
Many studies have been conducted to evaluate the potential of SPEVs, with two main research topics: solar irradiance estimation and PV power output estimation. Studies have focused on various types of vehicles, such as trucks, vans, buses, and trains, as well as passenger cars. Araki et al. [23,24,25] developed a 3D sunshine irradiation model for 3D curved solar panels. Lodi et al. [26] predicted solar irradiance under driving and parking conditions considering the shading effect of physical obstacles. Ota et al. [27,28,29] measured the solar irradiance incident on a car-roof PV using a mobile multi-pyranometer under different weather conditions. Wetzel et al. [30,31] and Ng et al. [32] measured continuous solar irradiance along roads and inferred the shading losses on the car-roof PV from the measurement data. Kutter et al. [33] investigated solar ranges and average annual mileage for different vehicle types (trucks and vans) and operating scenarios using hourly global horizontal irradiance (GHI) data for European cities. Oh et al. [34] estimated the PV power of public solar buses considering solar irradiance and shadow effects with high spatial and temporal resolution. Kim et al. [35] estimated the power generation of solar trains considering solar irradiance and shadow effects along the railroad using geographical information system (GIS) data.
As various SPEV models have been launched in the EV market, software technologies to efficiently operate them are being actively developed. SPEV software technologies include two main types. The first is navigation technology, which predicts energy consumption and photovoltaic (PV) generation to analyze the optimal driving route for maximum energy efficiency. The second analyzes optimal parking locations to maximize PV power generation. Many researchers have estimated the energy consumption and PV power generation of SPEVs and analyzed optimal driving routes while considering solar insolation, shading effects from objects around the road, driving time, and traffic flow [36,37,38,39,40,41]. For instance, a previous study selected optimal parking lot locations considering the shadows of buildings and trees [42].
Unlike fixed PV systems, which are installed at large sites such as building rooftops, mountain slopes, water bodies, or parking lots, the solar panels in SPEVs are expected to generate PV power while moving along road networks. Therefore, the shadows in urban environments represent a major influence on the PV power generation of SPEVs. Buildings, roadside trees, telephone poles, and the terrain around roads shadow the solar panels and reduce the PV power generation efficiency. To improve the PV power generation of SPEVs, driving routes with no shading are preferred. However, accounting for shadows on urban roads is a complex task. The shape and size of shading obstructions and of shadows differ at each point on the road. Moreover, because the sun’s position changes with time and season, the shape of the shadows also changes over time, even at the same point. Therefore, predicting the SPEV’s total PV generation and analyzing optimal driving routes requires the analysis of the shape of shadows according to season and time at each point of the road and the construction of a shading database.
Constructing a shading database for an urban road involves determining a shading matrix at numerous points along the road. When constructing a shading database for an urban road network environment, it is necessary to consider the accuracy and cost of analyzing shadow shapes and calculating the shading matrix using either 3D models or fisheye images. The use of 3D models enables the quick analysis of shadow shapes at many points. In contrast, the use of fisheye images reflects the shape of the actual objects in the photo, which enables an accurate analysis of shadow effects while considering all shading obstructions. Previous studies have mainly used these two tools to estimate solar irradiance and predict the potential power generation of SPEVs [23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42]. However, to the best of our knowledge, no study has discussed specific methods for constructing a large-scale urban road shading database to analyze optimal driving routes and parking lot locations for SPEVs. Furthermore, although previous studies have estimated the PV power generation of SPEVs using 3D models and fisheye images [23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42], no study has quantitatively compared these two methods and their applicability for urban roads.
The purpose of this study was to quantitatively compare the shading matrices formed from 3D models and fisheye images in order to construct a shading database for urban roads. Figure 1 outlines the methodology of this study. Using 3D models and fisheye images, skymaps were generated to capture the open sky and the geometry of shading obstructions around the survey points. The sun-path diagram was overlaid on each skymap to calculate the monthly and hourly shading factors. Shading matrices were defined at each point of interest, the mean squared error (MSE) between the two shading matrices was calculated, and the quantitative difference was estimated. Twelve survey points at the Pukyong National University Daeyeon Campus were evaluated, and the quantitative differences between the shading matrices were analyzed. The advantages and disadvantages of the two methods are discussed, and specific applications for the construction of a shading database for large-scale urban areas are presented.
This paper is organized as follows. Section 2 reviews the literature on SPEVs. Section 3 presents the methods used to compare the shading matrices derived from 3D models and fisheye images. Section 4 presents the results of the quantitative comparison of the shading matrices generated at the survey points in the three cases, followed by the discussion and conclusions in Section 5 and Section 6.

2. Literature Review

2.1. Commercialized Solar-Powered Electric Vehicles

This section introduces SPEV models launched by various automakers and startups, along with performance test cases. The list of SPEV models released from January 2014 to June 2022 is presented in Appendix A. In collaboration with SunPower Co. and the Georgia Institute of Technology, Ford installed a 300 W solar panel with an area of 1.5 m2 on the C-MAX plug-in hybrid electric vehicle (PHEV) model and confirmed a driving range of approximately 33 km through PV power generation [10]. In the same year, Mahindra and Mahindra developed the Reva e2o prototype. In 2017, the Toyota Motor Corporation collaborated with Panasonic to develop a prototype Prius Prime PHEV model with a 180 W solar panel installed on the roof [43]. Afterwards, they collaborated with NEDO and the Sharp Corporation to launch a new prototype with 860 W solar panels installed on the roof, hood, and rear hatch door, and experiments demonstrated a driving range of approximately 44 km with electricity charged during the day [44]. They then added a solar roof option to the bZ4X EV model and began the full-scale production of SPEVs [16]. The Dutch startup Lightyear presented the Lightyear One concept car at the International CES 2018 and was awarded the Climate Change Innovator Award in recognition of its environmental contributions [11]. Lightyear One recently completed its final performance tests and is expected to enter full-scale production in mid-2022. Since 2019, the Hyundai Motor Group has been producing a solar roof model of the Sonata HEV, in which a 205 W solar panel is installed on the roof [12]. The electricity generated by the solar panel directly charges the starter and drive batteries rather than a separate solar battery, which improves energy efficiency.
In 2021, Eindhoven University developed a solar-powered family car named Stella Vita. An 8.8 m2 fixed solar panel and 17.5 m2 unfolding solar panel were installed on the roof of the vehicle [45]. In October 2021, the vehicle completed a month-long trip of 3000 km from Eindhoven (Netherlands) to Tarifa (south Spain) using only battery power charged from solar energy, thus demonstrating the effectiveness of SPEVs. In 2021, Aptera Motors released the Luna model, which was the third alpha version prototype, and conducted a beta test [13]. In the same year, Sono Motors introduced the Sion prototype installed with 6 m2 of solar panels on the roof, hood, rear hatch door, and both side doors [14]. In 2022, they also developed a solar bus trailer installed with 2000 W of solar panels covering 12 m2, which has entered pilot operation [46]. The charged electricity is used to operate the heating, ventilation, and air conditioning (HVAC) system of the bus and assist the steering system. Mercedes-Benz announced the SPEV project Maybach in 2021 [47] and the Vision EQXX concept car in 2022 [15].

2.2. SPEV Operating Software Technologies

A variety of software technologies are being developed for efficient SPEV operation. Table 1 summarizes the research cases on developed SPEV operation software technologies and the parameters considered in each case. SPEV operation technologies include speed planning on road segments, optimal route planning, solar PV energy estimation on urban roads, and optimal parking location analysis. The speed planning of SPEVs involves the analysis of speed in each road section to minimize driving time while considering the EV energy consumption and PV power generation. Lv et al. [36] developed models of the solar PV generation and EV energy consumption of SPEVs and analyzed the optimal speed in sections of both illuminated and shaded roads through dynamic programming. Liu et al. [37] generated maps to predict the PV power generation in each urban road section considering solar insolation, shadows, road gradients, and traffic flow. Route planning involves the analysis of driving routes that maximize energy efficiency while considering the EV energy consumption and potential PV generation. Researchers have overlaid maps of seasonal insolation, traffic flow, and sky view factor to create a PV power generation prediction map and identify road routes with the highest PV power generation potential. Hasicic et al. [38] developed the Solar Car Optimized Route Estimation (SCORE) method for solar vehicles, which considers solar irradiance and target distance using Dijkstra's algorithm; mobile sensors installed at selected roadside locations measured solar irradiance in real time to analyze the optimal route. However, they could not consider the shading effects of buildings and trees in the route optimization. Jiang et al. [39] proposed SunChase, a route planning method for SPEVs based on a multi-label correcting algorithm that balances energy harvesting and consumption. They estimated PV power generation considering solar irradiance, shading effects, travel time, traffic flow, and the energy consumption of SPEVs. To identify the shaded and illuminated road segments on the map, they utilized a shadow analysis tool based on a 3D model. Schuss et al. [40] developed a route planning model for SPEVs based on experimental measurements of PV power output while driving. Zhou et al. [41] designed an autonomous vehicle system of SPEVs for smart cities and analyzed the optimal vehicle route using the traveling salesman problem (TSP). This method compared the energy to be harvested and consumed while driving the candidate paths and selected the optimal path, taking into account the SPEVs' residual energy. To enhance the efficiency of PV power generation when parked, the selection of parking locations with high solar insolation and no shadows is also crucial. Choi et al. [42] used a fisheye lens to analyze monthly solar access considering the shadows from buildings and trees in off-street and on-street parking lots and identified optimal parking lots for SPEVs, aiming to maximize PV power generation.
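For illustration, the following minimal Python sketch shows how a Dijkstra-based planner in the spirit of SCORE [38] could select a route by net energy, i.e., driving consumption minus expected PV harvest per road segment. The road graph, energy values, and function names are hypothetical, and clamping edge weights to non-negative values is a simplifying assumption rather than part of any cited method.

```python
import heapq

def min_net_energy_route(graph, source, target):
    """Illustrative Dijkstra variant: each edge weight is the net energy (Wh),
    i.e., driving consumption minus expected PV harvest on that segment.
    Weights are clamped to >= 0 so that Dijkstra's algorithm remains valid."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            break
        for nxt, consumption_wh, pv_harvest_wh in graph.get(node, []):
            w = max(consumption_wh - pv_harvest_wh, 0.0)  # non-negative edge weight
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    # Reconstruct the selected path from the predecessor map
    path, node = [], target
    while node in prev or node == source:
        path.append(node)
        if node == source:
            break
        node = prev[node]
    return list(reversed(path)), dist.get(target, float("inf"))

# Hypothetical road graph: node -> [(neighbor, consumption_Wh, pv_harvest_Wh), ...]
roads = {"A": [("B", 120, 30), ("C", 90, 5)],
         "B": [("D", 80, 60)],
         "C": [("D", 150, 100)]}
print(min_net_energy_route(roads, "A", "D"))  # -> (['A', 'B', 'D'], 110.0)
```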

3. Methods

3.1. Skymap Generation

The skymaps display the shading obstructions and visible sky seen in the upward direction from the point of interest as a circular image. Skymaps were formed from 3D models or fisheye images. When applying the 3D models, we first produced 3D models of shading obstructions such as buildings, trees, and telephone poles. The 3D shape and dimensions of the shading obstructions were identified from digital surface models, orthomosaic images, and land use maps. The skymap was determined using the hemispherical viewshed technique [48]. Viewshed analysis is a spatial analysis that identifies the visibility of all points on the terrain surface and displays the visible area on a raster image [49]. The hemispherical viewshed identifies shading obstructions in all 360° of azimuth around the survey point and represents the horizon angles between the survey point and shading obstructions in hemispherical coordinates. Figure 2 illustrates the hemispherical viewshed analysis. A line of sight (LOS), i.e., the straight line to an object within the observer's field of view, was formed from the survey point in an arbitrary azimuth direction (Figure 2a). The angle between the LOS and the horizon was measured to calculate the horizon angle (Figure 2b). The horizon angle was calculated at regular intervals for azimuth angles from 0° to 360°, and it was 0 when there were no shading obstructions. The calculated horizon angle was expressed in hemispherical coordinates (zenith, azimuth, radial distance). The skymap was projected onto a 2D plane through an equiangular hemispherical projection. Figure 3a illustrates the skymap formed through the hemispherical viewshed technique. Gray areas indicate shading obstructions; larger horizon angles are closer to the center of the image. White areas indicate open sky.
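As an illustration of the horizon-angle scan that underlies the hemispherical viewshed, the following Python sketch walks outward from a survey point on a gridded digital surface model and records the maximum elevation angle per azimuth. It is a simplified stand-in for GIS viewshed tools, and the DSM, cell size, and sampling parameters are hypothetical.

```python
import numpy as np

def horizon_angles(dsm, cell_size, row, col, observer_height=1.0,
                   n_azimuths=360, max_dist=500.0, step=1.0):
    """Minimal horizon scan: for each azimuth, walk outward from the survey
    point and keep the maximum elevation angle subtended by the surface
    (0 deg if nothing rises above the observer). Returns angles in degrees."""
    nrows, ncols = dsm.shape
    z0 = dsm[row, col] + observer_height
    azimuths = np.deg2rad(np.arange(0, 360, 360 / n_azimuths))
    horizons = np.zeros(n_azimuths)
    for i, az in enumerate(azimuths):
        best = 0.0
        d = step
        while d <= max_dist:
            # Azimuth 0 deg = north, clockwise; rows increase southward
            r = row - int(round(d * np.cos(az) / cell_size))
            c = col + int(round(d * np.sin(az) / cell_size))
            if not (0 <= r < nrows and 0 <= c < ncols):
                break
            angle = np.degrees(np.arctan2(dsm[r, c] - z0, d))
            best = max(best, angle)
            d += step
        horizons[i] = best
    return horizons

# Hypothetical 4 m resolution DSM with a 30 m block ("building") east of the point
dsm = np.zeros((200, 200))
dsm[95:105, 120:130] = 30.0
print(horizon_angles(dsm, cell_size=4.0, row=100, col=100, n_azimuths=36)[:12])
```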
To form the other type of skymap, a camera equipped with a fisheye lens was placed in an upward-looking direction at the survey point, and a fisheye image was obtained (Figure 3b). Survey points are preferably selected at equal intervals along the driving road. A fisheye lens is an ultra-wide-angle lens with 180° vertical, horizontal, and diagonal angles of view. It captures images with a short focal length, so the perspective and depth of the subject are distorted. The fisheye lens was used to capture a wide hemispherical view of the surroundings at the survey point.
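The fisheye skymap is later classified into open sky and shading obstructions (Section 3.2). The following simplistic Python sketch illustrates one possible brightness/blue-channel threshold classification; the thresholds and file name are hypothetical, and dedicated tools such as the SunEye use calibrated, more robust processing.

```python
import numpy as np
from PIL import Image

def sky_mask_from_fisheye(path, blue_threshold=140, brightness_threshold=170):
    """Label each pixel of an upward-looking fisheye photo as open sky (True)
    or obstruction (False): pixels that are bright or distinctly blue are
    treated as sky; everything else (buildings, trunks, foliage) is obstruction."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brightness = rgb.mean(axis=-1)
    sky = (brightness > brightness_threshold) | ((b > blue_threshold) & (b > r) & (b > g))
    return sky  # 2D boolean mask aligned with the image pixels

# Example usage (hypothetical file name):
# mask = sky_mask_from_fisheye("survey_point_A3.jpg")
# open_sky_fraction = mask.mean()
```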

3.2. Shading Matrix Calculation

The sun-path diagram was determined by tracking the sun's trajectory over the 365 days of the year. The sun-path diagram expresses the sun's position according to the date and season at regular time intervals through discrete time sectors, which were set to 15 min, 30 min, or 1 h. The position of the sun was calculated from the latitude, day of year, and time of day, and was expressed by the zenith angle (θ), azimuth angle (α), and altitude angle (h). Figure 4 shows the position of the sun on the celestial sphere. Equation (1) was used to calculate the zenith angle, where ϕ is the latitude, which ranges from −90° (southern hemisphere) to +90° (northern hemisphere), and H is the hour angle (−180° to 180°), where 0° corresponds to solar noon and +15° to 13:00. Equation (2) was used to calculate the declination angle (δ), in which n is the day of the year, e.g., n = 365 for 31 December. Equations (3) and (4) were used to calculate the altitude (0° to 90°) and azimuth (−180° to 180°) angles. The solar zenith and azimuth angles of the 3D hemispherical coordinate system were expressed on a 2D plane to form the sun-path diagram. Two sun-path diagrams were formed: one for the period between the winter and summer solstices (22 December to 22 June) and one for the remaining period (22 June to 22 December).
cos θ = sin ϕ sin δ + cos ϕ cos δ cos H    (1)
δ = 23.45° sin[360° (284 + n)/365]    (2)
sin h = sin δ sin ϕ + cos δ cos ϕ cos H    (3)
cos α = (sin h sin ϕ − sin δ)/(cos h cos ϕ)    (4)
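A minimal Python implementation of Equations (1)–(4) is sketched below; the sign convention for the azimuth (negative in the morning, positive in the afternoon) and the use of local solar time are assumptions made for illustration.

```python
import numpy as np

def sun_position(latitude_deg, day_of_year, hour):
    """Solar geometry following Equations (1)-(4): declination delta,
    hour angle H (0 deg at solar noon, +15 deg per hour), altitude h,
    zenith theta, and azimuth alpha, all returned in degrees."""
    phi = np.radians(latitude_deg)
    delta = np.radians(23.45 * np.sin(np.radians(360.0 * (284 + day_of_year) / 365.0)))
    H = np.radians(15.0 * (hour - 12.0))
    sin_h = np.sin(delta) * np.sin(phi) + np.cos(delta) * np.cos(phi) * np.cos(H)
    h = np.arcsin(np.clip(sin_h, -1.0, 1.0))
    theta = np.pi / 2 - h                       # zenith angle
    cos_alpha = (np.sin(h) * np.sin(phi) - np.sin(delta)) / (np.cos(h) * np.cos(phi))
    alpha = np.degrees(np.arccos(np.clip(cos_alpha, -1.0, 1.0)))
    alpha = alpha if hour >= 12 else -alpha     # assumed sign: morning negative
    return np.degrees(h), np.degrees(theta), alpha

# Example: Busan (35.13 deg N), 21 June (n = 172), 13:00 local solar time
print(sun_position(35.13, 172, 13))
```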
Monthly and hourly shading matrices were calculated using the sun-path diagrams and skymaps. The sun-path diagram was overlaid on the skymap (Figure 5), and the shaded areas overlapping the sun-path diagram were analyzed. The skymap formed from the fisheye image was classified into shading obstruction and open sky within the sun-path diagram area via automatic image processing. Table 2 shows the equations used to calculate the shading factors and matrices. SI(day, hour, DTS) is the shading index calculated for every discrete time sector (DTS) on the sun-path diagram. SI(day, hour, DTS) is 0 if the discrete time sector is shaded by obstructions or occurs before sunrise or after sunset; otherwise, it is 1. N_DTS indicates the number of discrete time sectors per hour, and I_DTS indicates the interval (h) of the discrete time sectors in the sun-path diagram. If the discrete time sectors are formed at 15-min intervals, N_DTS is 4. SF(day, hour) is the average of the shading index over 1 h. For example, among the four discrete time sectors (13:15, 13:30, 13:45, 14:00) for 13:00 on 31 December, if two time sectors are not shaded, SF(365, 13) is 2/4 = 0.5. SF(month, hour) is the average of the hourly shading index over all days in the month, where N_days is the number of days in the month. For example, SF(Dec, 13) is the average hourly shading index at 13:00 from 1 to 31 December, i.e., (SF(335, 13) + … + SF(365, 13))/31. MSF(month, hour) is the month-by-hour shading matrix, expressed as shading factors in 12 rows (months) and 24 columns (hours).
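The averaging described above can be summarized in a short sketch: a minimal Python routine, assuming a precomputed shading index array SI[day, hour, DTS], that produces the 12 × 24 month-by-hour shading matrix.

```python
import calendar
import numpy as np

def month_by_hour_shading_matrix(shading_index, days_in_month):
    """Builds MSF(month, hour) from a shading index array SI[day, hour, dts]
    (1 = sunlit discrete time sector, 0 = shaded or sun below the horizon),
    following the averaging in Table 2. shading_index: shape (365, 24, N_DTS)."""
    sf_day_hour = shading_index.mean(axis=2)      # SF(day, hour): hourly mean of SI
    msf = np.zeros((12, 24))
    start = 0
    for m, n_days in enumerate(days_in_month):    # average SF over the days of each month
        msf[m] = sf_day_hour[start:start + n_days].mean(axis=0)
        start += n_days
    return msf                                    # 12 x 24 matrix MSF(month, hour)

# Example with synthetic data: 15-min sectors (N_DTS = 4), non-leap year
days = [calendar.monthrange(2022, m)[1] for m in range(1, 13)]
si = np.random.randint(0, 2, size=(365, 24, 4))   # placeholder shading index
msf = month_by_hour_shading_matrix(si, days)
print(msf.shape, msf[11, 13])                     # December, 13:00
```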
An example of the shading matrix is presented in Appendix A. This matrix was formed based on the skymap in Figure 5a. A value of 0 indicates a time sector without incident sunlight, whereas 1 indicates complete sunlight incidence. The shading matrix suggests that in winter mornings, sunlight is not incident owing to the southeastern building, and in summer and winter afternoons, sunlight is barely incident owing to the western building.

3.3. Comparison of Shading Matrices

The quantitative differences between the shading matrices formed using 3D models or fisheye images were calculated based on two factors, which were computed using the equations in Table 3. The first factor was the mean shading factor (MSF), which is the average of the shading factors of the month-by-hour shading matrix by month, season, and year. The MSF for each month, MSF(month), is the average shading factor during the possible duration of sunshine of that month. The possible duration of sunshine refers to the time between sunrise and sunset. N_PDS is the length of the average possible duration of sunshine, and it was calculated separately for each month. By calculating the MSF during the possible duration of sunshine, the shading factors before sunrise or after sunset can be excluded from the average. The MSF over one year, MSF(annual), is the average MSF from January to December. MSF(May–Oct) and MSF(Nov–Apr) are the average monthly MSFs for the periods from summer to autumn (May–October) and winter to spring (November–April), respectively. The second factor was the monthly and annual MSE of the two shading matrices. The squared error matrix, MSE(month, hour), which is the squared difference between the two shading matrices (MSF_1 and MSF_2), was calculated; SF_1(month, hour) and SF_2(month, hour) are the shading factors of MSF_1 and MSF_2, respectively. MSE(month) is the average of the squared errors during the possible duration of sunshine of each month, expressed as a percentage. MSE(annual) is the average of MSE(month) from January to December.
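A minimal sketch of the comparison in Table 3, assuming two 12 × 24 shading matrices and per-month sunrise and sunset hours, is given below; the example values are synthetic.

```python
import numpy as np

def monthly_mse(msf_1, msf_2, sunrise_hour, sunset_hour):
    """Monthly MSE (%) between two 12 x 24 shading matrices, averaged only
    over the possible duration of sunshine of each month, as in Table 3.
    sunrise_hour / sunset_hour: length-12 arrays of (fractional) hours."""
    squared_error = (msf_1 - msf_2) ** 2            # squared error per (month, hour)
    hours = np.arange(24)
    mse_month = np.zeros(12)
    for m in range(12):
        daylight = (hours >= sunrise_hour[m]) & (hours <= sunset_hour[m])
        mse_month[m] = 100.0 * squared_error[m, daylight].mean()
    return mse_month, mse_month.mean()              # MSE(month), MSE(annual)

# Example with placeholder matrices and hypothetical sunrise/sunset times
msf_3d, msf_fisheye = np.zeros((12, 24)), np.full((12, 24), 0.2)
sunrise, sunset = np.full(12, 7), np.full(12, 18)
per_month, annual = monthly_mse(msf_3d, msf_fisheye, sunrise, sunset)
print(annual)   # 4.0 (%) for this synthetic example
```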

3.4. Study Area and Materials

The Daeyeon Campus of Pukyong National University (35°08′02″ N, 129°06′10″ E), located in Nam-gu, Busan, South Korea, was selected as the study site. The campus lies on relatively flat terrain, with an elevation of 3–10 m above sea level and a total area of 359,509 m2. Hwangnyeong Mountain (427 m a.s.l.) and Yongho Bay are located north and southeast of the study area, respectively. The campus has 38 buildings, including a university administration building, lecture buildings, laboratory buildings, and dormitories, with 5 to 20 floors each. The campus roads have one or two lanes and a total length of approximately 6.3 km. There are trees alongside some parts of the roads.
In the study area, telephone poles are buried underground, and there are no particular man-made structures other than buildings. Therefore, only buildings and roadside trees were considered to shadow the roads for the calculation of shading matrices. Buildings usually maintain a fixed shape, whereas trees have irregular shapes that change with type and over time. Therefore, it was necessary to evaluate whether the shading effects of trees could be accurately analyzed when calculating the shading matrices using 3D models and fisheye images. This study set three cases according to the type of shading obstruction and selected four survey points for each case (total of 12 survey points). In cases A, B, and C, the roads are surrounded by only trees, trees and buildings, and only buildings, respectively. Figure 6 and Figure 7 show, respectively, the locations and surrounding views of the 12 survey points.

4. Results

The 3D models were developed considering the shape and dimensions of the 38 buildings on the campus (Figure 8). Tree models were excluded because of the difficulty of modeling them in detail. The fisheye images were obtained using Solmetric's SunEye 210 [50], a tool for evaluating shading effects in PV systems, which includes a digital camera equipped with a calibrated fisheye lens, an electronic inclinometer, a compass, and a GPS. Figure 9 shows the skymaps of the 12 survey points generated from the 3D models, in which the shape of the buildings around the survey points can be visualized. The buildings were classified as shading obstructions in the sun-path diagrams overlaid on the skymaps. Figure 10 shows the fisheye images of the 12 survey points, which capture the buildings and vegetation around the campus roads. In the sun-path diagrams of case A (Figure 10a–d), case B (Figure 10e–h), and case C (Figure 10i–l), only vegetation, vegetation and buildings, and only buildings were classified as shading obstructions (shown in green), respectively.
The shading matrices were calculated using 3D models and fisheye images of the 12 survey points in cases A, B, and C, and a squared error matrix between the shading matrices was calculated. Figure 11 shows the shading matrices at points A3 and A4. The shading matrices show the shading factors from 5:00 to 19:00 by month. In the fisheye image of A3 (Figure 10c), tree shadows were observed from 6:00 to 10:00 and 16:00 to 18:00 in all seasons. Therefore, most shading factors in the time sectors of the shading matrix were calculated as 1 (Figure 11c). Conversely, as the skymaps of the 3D models did not consider the shading effects of vegetation, they considered no shadows during the corresponding time sectors (Figure 11a). Consequently, large squared errors were observed between the two shading matrices under the corresponding time sectors (Figure 11e).
In the fisheye image of A4 (Figure 10d), tree shades were observed from 6:00 to 18:00 in most seasons, except part of the summer (May, June, July). In the shading matrix of the 3D model, the shading factors were close to 1 in the winter (November, December, January) owing to the shadows of high-rise buildings (Figure 11b). However, as tree shadows were not considered, the shading factors were 0 for most time sectors. In contrast, in the shading matrix of the fisheye image, shading factors were greater than 0 owing to the tree shadows observed in most time sectors, except in May, June, and July (Figure 11d). Large squared errors were observed between the two shading matrices in the morning hours of February, March, August, and September (Figure 11f).
Table 4 shows the sunrise time, sunset time, and possible duration of sunshine in the study area. The average possible duration of sunshine was approximately 12.1 h from May to October and approximately 9.8 h from November to April. MSF and MSE were calculated considering the possible duration of sunshine between sunrise and sunset. Figure 12 shows the monthly MSFs and MSEs of the two shading matrices calculated using 3D models and fisheye images for case A. At the four survey points, the monthly MSFs of the fisheye images were higher than those of the 3D models, particularly in seasons when trees were distributed on the sun-path diagrams. At point A1, because the trees formed shadows between 6:00 and 10:00 in all seasons except for May, June, and July, the average MSE between the two shading matrices was approximately 38% (Figure 12a). At A2, the average MSE from March to September was approximately 21%, which was consistent with the locations of trees and the sun’s trajectory (Figure 12b). At A3, where trees were distributed throughout all seasons, the average annual MSE was approximately 32% (Figure 12c). At A4, the average MSE in February–March and September–October was approximately 38% (Figure 12d).
Figure 13 shows the shading matrices and squared error matrices calculated for survey points B1 and B4, under case B. At B1, trees to the southeast and buildings to the southwest produce shading effects (Figure 10e). Particularly from January to March and September to December, trees formed shadows in the morning and buildings formed shadows in the afternoons. In this period, the shading matrices formed from the 3D models accurately reflected building shadows in the afternoons, with a shading factor of 1 (Figure 13a). However, as tree shadows in the mornings could not be reflected, the respective shading factors were close to 0. Conversely, the shading matrices formed from fisheye images for the same period reflected both building and tree shadows in the mornings and afternoons, and the respective shading factors were accurately calculated as 1 (Figure 13c). Therefore, squared errors at B1 were observed only for the mornings from January to March and September to December (Figure 13e).
At B4, tree shadows were observed from 6:00 to 10:00 in all seasons, whereas building shadows were observed from 14:00 to 18:00 in January–March and September–December (Figure 10h). The shading matrices formed from the 3D models accounted for these building shadows, and the respective shading factors were accurately calculated as 1 (Figure 13b). However, as tree shadows were not considered, the shading factors at 6:00–10:00 were 0 in all seasons. In contrast, the shading matrices formed from the fisheye images accounted for building and tree shadows, so the morning and afternoon shading factors were accurately calculated as 1 (Figure 13d). Therefore, squared errors between the two shading matrices were observed at 6:00–10:00 in all seasons when tree shadows formed (Figure 13f).
Figure 14 shows the monthly MSFs and MSEs of the two shading matrices at survey points B1, B2, B3, and B4. At B1, where there were no shading effects from April to August, the MSFs of the two shading matrices were nearly identical, and the MSE was close to 0. Conversely, when trees produced shading effects in January–March and September–December, the monthly MSE was high, with an average of approximately 21%. At B2, B3, and B4, where trees produced shading effects in all seasons, the average annual MSEs were approximately 25%, 24%, and 33%, respectively.
Figure 15 shows the shading matrices formed using 3D models and fisheye images at survey points C2 and C3. At C2, buildings produced shading effects throughout all seasons, and shadows formed at all times except 10:00 in May–July and 12:00 to 14:00 in all seasons (Figure 10j). The shading matrices estimated from the 3D models and fisheye images reflected the building shading effects and were nearly identical (Figure 15a,c). However, owing to the slight difference between the height and shape of a 3D building model and the actual building, a squared error close to 1 appeared at 11:00 and 15:00 in June (Figure 15e). At C3, the buildings formed shadows from 10:00 to 13:00 in all seasons (Figure 10k). As the shading effects of the buildings were accurately considered, the shading matrices were almost identical in all seasons (Figure 15b,d). The difference between the shading matrices was nearly 0 for all seasons and times (Figure 15f).
Figure 16 shows the monthly MSFs and MSEs of the shading matrices formed from 3D models and fisheye images at survey points C1, C2, C3, and C4. At these four points, the monthly MSFs of the two shading matrices were nearly identical. However, owing to the difference between the actual and modeled shapes of buildings, a low squared error was obtained in certain time sectors. At C1, the average monthly MSE from April to September was approximately 5%, and at C2, the MSE in June was approximately 8%.
Table 5 shows the annual MSEs and the annual, May–Oct, and Nov–April MSFs of the shading matrices formed with 3D models and fisheye images at the 12 survey points under cases A, B, and C. In cases A and B, the annual, May–Oct, and Nov–April MSFs of the shading matrices formed using fisheye images were higher than those of the matrices formed using 3D models at all survey points. This occurred because the shading factors of the 3D models did not consider the shading effects of trees, which resulted in high shading factors during seasons and time sectors when tree shadows were observed. Meanwhile, in case C, the 3D building models accurately reflected the shading effects, so the annual, May–Oct, and Nov–April MSFs of the shading matrices formed using 3D models were nearly identical to those formed using fisheye images. Unlike cases A and B, the annual MSE under case C was substantially lower (2.1%), which indicated that both methods for matrix calculation provided accurate analyses of the shading effects of buildings. These results also suggest that at survey points surrounded by buildings, similar shading matrices were obtained regardless of the method.
Shading matrices formed using 3D tree models were quantitatively compared with those formed with fisheye images. The tree models were produced only for survey points B1 and B4 in case B. Aerial photogrammetry was conducted using a UAV around the survey point, and the exact number, shape, and dimension of trees were surveyed from the 3D point cloud. Figure 17 shows the 3D buildings and tree models of survey points B1 and B4. Shading matrices were formed using the 3D models of buildings and trees, and the monthly MSFs were calculated. Figure 18 shows the monthly MSFs of the shading matrices formed using 3D models including buildings and trees and those formed with fisheye images. At survey points B1 and B4, the difference in the monthly MSFs was relatively small. Specifically, the difference in the monthly MSFs was greatly reduced compared to the results based only on 3D building models (Figure 14a,d). Table 6 shows the annual, May–Oct, and Nov–Apr MSEs of the two shading matrices with and without tree representation in the 3D models. When tree models were considered, the annual, May–Oct, and Nov–Apr MSEs between the two shading matrices decreased by 80.7%, 64.6%, and 87.4% at B1, and by 81.6%, 74.5%, and 87.6% at B4, respectively. This demonstrated that more accurate shading matrices can be formed using both 3D building models and 3D tree models simultaneously.

5. Discussion

5.1. Advantages and Disadvantages of Using 3D Models and Fisheye Images

Here, we discuss the advantages and disadvantages of using 3D models and fisheye images in an urban road environment. The advantages of using 3D models include the ability to generate the models and construct the shading database on a computer in an office, without the need to visit the site directly. Satellite images, orthographic images, topographic maps, and building blueprints of the study area can be referenced to determine the geometry and dimensions of the buildings and to create the 3D models without a prior field survey. A 3D model of a large urban area enables the calculation of shading matrices for many points at dense intervals along the roads. However, pre-modeling can be time-consuming because 3D models must be created for the entire urban area. Moreover, to obtain more precise 3D models and increase the accuracy of the shading matrices, high-resolution survey data are necessary to model irregularly shaped shading obstructions. In particular, satellite images cannot accurately express the geometry of thick tree trunks, so ground Light Detection and Ranging (LiDAR) surveys may be needed to model the geometry of trees. Furthermore, frequent and continuous surveys and modifications of the 3D models are necessary to express the geometry of trees throughout the seasons.
Fisheye images capture the actual shape of shading obstructions, which enables an accurate and precise shading matrix analysis. Moreover, fisheye images can be obtained immediately at the survey point, without the need to create large-scale 3D models beforehand. Even if a building is extended, a new shading matrix can be formed by capturing new photographs without a precise survey. Photos of trees taken in each season can provide information about their seasonal changes. This method therefore enables a flexible response to temporal and seasonal changes in shading obstructions. However, these surveys are time-consuming because all survey points must be photographed, and it is difficult to obtain precise photos where human access is limited. Owing to the optical limitations of fisheye lenses, the actual vertical field of view is less than 180°. Therefore, objects at an elevation of approximately 5 to 10° above the horizontal plane are not captured in the photo, which hinders the accurate accounting of shading obstructions. In addition, most fisheye cameras cannot perform real-time image processing to classify shading obstructions; in other words, it is not possible to simultaneously capture photos and perform the shading analysis at the survey point.
Shading databases from 3D models can be used in large urban areas. Unmanned aerial vehicles (UAVs) can be used to remotely survey the urban area and create a digital surface model, and the edge lines of buildings can be extracted to generate 3D models of the buildings. Survey points are then selected at dense intervals along urban roads and shading matrices are formed. To consider the shading effects of vegetation around the roads, fisheye images of roads with roadside trees can be obtained. Shading matrices based on fisheye images can be used to correct the matrices calculated from 3D models. Furthermore, to account for seasonal changes in vegetation, fisheye images can be obtained in each season to construct a shading database according to the season.

5.2. Comparative Analysis of This Study with Previous Studies on SPEVs

We analyzed and compared the methods for calculating the shading effects of roadside buildings and trees in this study and in previous studies. Several featured studies on SPEVs were selected for the comparative analysis. Table 7 lists the comparison between this study and previous studies in terms of how shading effects were considered. Most of the previous studies considered the shading effects of urban buildings and trees to estimate solar irradiance and PV power output. Four approaches were used to account for shading effects, namely probability models, empirical assumptions, fisheye images, and shading fractions derived from digital surface models. Araki et al. [23,24,25] assumed that buildings and other shading objects are randomly distributed along the road with random heights and calculated the shading angle using the accumulated distribution. To simplify the calculation, Lodi et al. [26] substituted the mean height of the obstacles, mean road width, and mean vehicle width into the shading fraction calculation. These two studies were unable to reflect the temporal and spatial variation in shading effects and the exact height of physical obstacles. Ota et al. [27,28,29] captured fisheye images in various road sections (open-air, high-rise, and urban sections) and calculated the mean effective shading angles from the fisheye images. However, they could not calculate the temporal variation in the effective shading angles using the fisheye images. Oh et al. [34] introduced a hemispherical VIEWMAP for each road section to represent the shadowed area around each position using a digital surface model (DSM). Kim et al. [35] calculated the shading fraction at each railroad position by considering the relationship between the relative solar position and the height of buildings, terrain, and trees around the railroad using a DSM. Although Oh et al. [34] and Kim et al. [35] calculated the temporal and spatial variation in shading effects considering urban buildings and trees, they did not verify the accuracy of the shading fractions derived from DSMs by comparing the results with those derived from fisheye images. DSMs with a low resolution cannot express the detailed geometry and seasonal variations of shading obstructions. Therefore, it is essential to compare 3D models and fisheye images to clarify which method calculates the shading effects accurately and can be effectively used to generate shading databases for urban roads.
In this study, we could consider the temporal and spatial variation in shading effects by calculating month-by-hour shading matrices of road sections using 3D models and fisheye images. However, we could not consider the shading effects of trees when generating shading matrices using 3D models. Most commercial 3D modeling software cannot express the detailed geometry of trees (i.e., branches, leaves, trunk), so it can only model tree geometries in a simplified shape (e.g., rounded or conical). This results in low modeling accuracy, so that shading matrices cannot be accurately calculated. Precise ground LiDAR surveys are necessary to enhance the precision of 3D tree models. However, a precision survey of trees in an entire urban area would be highly time-consuming and expensive. Therefore, it is effective to design a new method combining two shading matrices derived from 3D models and fisheye images to accurately calculate the shading effects of trees.

5.3. Challenges for Building Shading Database on Urban Roads

This novel design of shading matrices using both 3D models and fisheye images enables the simultaneous analysis of the shading effects of buildings and trees in urban areas. The use of 3D models provides large-scale shading databases at dense intervals along urban roads, which represent large quantities of hard data that account for the shading effects of buildings. In contrast, the use of fisheye images near roadside trees provides small quantities of highly accurate soft data. Therefore, predicting monthly and hourly shading factors from both hard and soft data enables the shading effects of buildings and trees to be calculated together. Universal kriging, a geostatistical estimation method, can be used to predict new values by considering spatial distribution characteristics. When calculating the spatial variation function used to estimate shading factors, the distribution of roadside trees should be considered. The precision of the shading factor estimates can also be improved by assigning different weights (reflection ratios) to the hard and soft data according to the distribution of roadside trees.
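As an illustration only, the following sketch uses the pykrige package (assumed to be available) to interpolate the shading factor of a single (month, hour) cell from scattered survey points; the coordinates and values are hypothetical, and the differential weighting of hard and soft data described above would require a custom scheme that is not shown.

```python
import numpy as np
from pykrige.uk import UniversalKriging

# Hypothetical survey-point coordinates (m) and their shading factors for one
# (month, hour) cell of the shading matrix
x = np.array([10.0, 55.0, 120.0, 180.0, 240.0, 300.0])
y = np.array([20.0, 80.0, 40.0, 150.0, 90.0, 200.0])
sf = np.array([1.0, 0.8, 0.3, 0.0, 0.6, 0.9])

# Universal kriging with a simple linear variogram and a regional linear drift
uk = UniversalKriging(x, y, sf, variogram_model="linear",
                      drift_terms=["regional_linear"])

# Predict shading factors at new road points between the surveyed locations
x_new = np.array([30.0, 150.0, 270.0])
y_new = np.array([50.0, 100.0, 150.0])
sf_pred, sf_var = uk.execute("points", x_new, y_new)
print(np.clip(sf_pred, 0.0, 1.0))   # keep predictions within the valid range [0, 1]
```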
If SPEVs become fully commercialized, the construction of shading databases to predict the power generation will be further extended from developed to developing countries. However, as remote surveying is not as frequently performed in developing countries as in developed countries, they often lack sufficient data for 3D modeling. In such cases, rather than creating 3D building models for an entire city to construct the shading database, a method should be devised to quickly and inexpensively acquire fisheye lens images in large scale—for example, by attaching a fisheye lens digital camera or a 3D virtual reality (VR) camera to a vehicle and capturing images while driving. This could be used to quickly provide several fisheye images for each dense road interval. Furthermore, by driving a vehicle in areas in which it is difficult to walk, precise data can be safely obtained.

6. Conclusions

This study presented a basic analysis for the construction of shading databases of urban roads. For this, we generated and compared shading matrices by forming skymaps and tracking the position of the sun using 3D models and fisheye images. Skymaps were formed using the hemispherical viewshed technique applied to 3D building models, or using fisheye images captured by a fisheye lens digital camera placed in an upward-looking direction. Sun-path diagrams were overlaid on the two sets of skymaps, the presence of shadows according to the sun’s position by hour and season was determined, and month-by-hour shading matrices were calculated. To quantitatively compare the differences between the two shading matrices, the MSE was calculated according to month, season, and year. The Pukyong National University Daeyeon Campus was selected as the study area. Cases A, B, and C were classified according to the presence of buildings and trees, and two methods were used to calculate the shading matrices at four survey points for each case. The results showed large differences between the two shading matrices at the four survey points in case A (shade only from trees). In case B (shade from trees and buildings), the monthly MSEs between the two matrices were low during hours and seasons when the most shadows were formed by buildings, but high when trees produced shadows. In case C (shade only from buildings), the two matrices were nearly identical for all survey points. Therefore, we observed that the shading effects of buildings can be accurately analyzed using either 3D models or fisheye images, whereas the shading of trees was more accurately considered when using fisheye images.
This study discussed the advantages and disadvantages of using 3D models and fisheye images in shading matrices of urban road environments. A new shading matrix prediction technique was proposed to efficiently illustrate the shading effects of buildings and trees. To construct shading databases in developing countries with insufficient survey data, we suggest attaching a 3D VR camera to a vehicle to acquire fisheye lens images. The findings of this study provide a background for the selection of optimal methods to construct shading databases in urban road environments for the development of SPEV operation technology.
The results indicate that the construction of shading databases for large urban areas will enable the estimation of the PV power generation of SPEVs. It is possible to generate a PV power generation prediction map of urban areas in units of minutes, hours, days, months, and seasons. Incorporating solar insolation and the power generation efficiency of the solar panels can further improve the prediction accuracy. Optimal driving routes and parking locations for SPEVs can also be analyzed based on the PV power generation prediction map. Future studies should aim to develop new techniques to predict the PV power generation of SPEVs and to construct shading databases of urban roads.

Author Contributions

Conceptualization: J.B. and Y.C.; Data curation: J.B. and Y.C.; Formal analysis: J.B. and Y.C.; Funding acquisition: J.B.; Investigation: J.B.; Methodology: J.B. and Y.C.; Project administration: Y.C.; Resources: Y.C.; Software: Y.C.; Supervision: Y.C.; Validation: J.B.; Visualization: J.B.; Roles/Writing—original draft: J.B.; Writing—review and editing: Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1C1C2011947).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Summary of released solar-powered electric vehicles.
| Year | Manufacturer (City, Country) | Model | Stage | PV Location | Ref. |
|------|------------------------------|-------|-------|-------------|------|
| 2014 | Ford Motor Company (Dearborn, MI, USA) | C-MAX Solar Energi | Prototype | Roof | [10] |
| 2017 | Toyota Motor Corporation (Toyota, Aichi, Japan) | Prius PHEV 1 | Prototype | Roof | [43] |
| 2019 | Toyota Motor Corporation (Toyota, Aichi, Japan) | Prius PHEV 1 demo | Prototype | Roof, hood, rear hatch door | [44] |
| 2022 | Toyota Motor Corporation (Toyota, Aichi, Japan) | bZ4X | Production | Roof | [16] |
| 2019 | Lightyear (Helmond, The Netherlands) | Lightyear One | Prototype | Roof, hood, rear hatch door | [11] |
| 2019 | Hyundai Motor Group (Seoul, Korea) | Sonata HEV 2 with Solar Roof | Production | Roof | [12] |
| 2021 | Aptera Motors (San Diego, CA, USA) | Luna | Prototype | Roof | [13] |
| 2021 | Sono Motors (Munich, Germany) | Sion | Prototype | Roof, hood, rear hatch door, side doors | [14] |
| 2021 | Mercedes-Benz (Stuttgart, Germany) | Project Maybach | Proof of concept | Hood | [47] |
| 2022 | Mercedes-Benz (Stuttgart, Germany) | Vision EQXX | Proof of concept | Roof | [15] |

1 Plug-in hybrid electric vehicle; 2 Hybrid electric vehicle.
Table A2. Example of month-by-hour (from 5 to 19 h) shading matrix (generated from Figure 5a). As the shading factor increases (becoming red), the degree to which the sky is obscured by shading obstructions increases.
| Month | 5 h | 6 h | 7 h | 8 h | 9 h | 10 h | 11 h | 12 h | 13 h | 14 h | 15 h | 16 h | 17 h | 18 h | 19 h |
|-------|-----|-----|-----|-----|-----|------|------|------|------|------|------|------|------|------|------|
| Jan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4 | 1.0 | 0.7 | 0.4 | 0.1 | 0.0 | 0.0 | 0.0 | 0.0 |
| Feb | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3 | 1.0 | 1.0 | 1.0 | 1.0 | 0.2 | 0.0 | 0.0 | 0.0 |
| Mar | 0.0 | 0.1 | 0.2 | 0.3 | 0.0 | 0.0 | 0.3 | 1.0 | 1.0 | 1.0 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 |
| Apr | 0.0 | 0.2 | 1.0 | 1.0 | 0.7 | 0.7 | 0.9 | 1.0 | 1.0 | 0.7 | 0.1 | 0.0 | 0.0 | 0.0 | 0.0 |
| May | 0.0 | 0.4 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.3 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Jun | 0.0 | 0.5 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Jul | 0.0 | 0.4 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Aug | 0.0 | 0.1 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Sep | 0.0 | 0.3 | 0.7 | 0.6 | 0.1 | 0.2 | 0.6 | 1.0 | 1.0 | 0.9 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0 |
| Oct | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8 | 1.0 | 1.0 | 1.0 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 |
| Nov | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8 | 1.0 | 0.9 | 0.7 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0 |
| Dec | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7 | 0.9 | 0.1 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |

References

  1. United Nations. Key Aspects of the Paris Agreement. Available online: https://unfccc.int/process-and-meetings/the-paris-agreement/the-paris-agreement/key-aspects-of-the-paris-agreement (accessed on 29 August 2022).
  2. United Nations. Adoption of the Paris Agreement. Available online: https://unfccc.int/files/essential_background/convention/application/pdf/english_paris_agreement.pdf (accessed on 29 August 2022).
  3. Volkswagen Group. NEW AUTO: Volkswagen Group Set to Unleash Value in Battery-Electric Autonomous Mobility World. Available online: https://www.volkswagen-newsroom.com/en/press-releases/new-auto-volkswagen-group-set-to-unleash-value-in-battery-electric-autonomous-mobility-world-7313 (accessed on 29 August 2022).
  4. Volvo Group. Climate Strategy. Available online: https://www.volvogroup.com/en/sustainability/climate-goals-strategy.html (accessed on 29 August 2022).
  5. S&P Global. Global Light Duty EV Sales to Rise to 26.8 Mil by 2030: Platts Analytics. Available online: https://www.spglobal.com/commodityinsights/en/market-insights/latest-news/energy-transition/021622-global-light-duty-ev-sales-to-rise-to-268-mil-by-2030-platts-analytics (accessed on 29 August 2022).
  6. International Energy Agency. Global EV Outlook. 2022. Available online: https://iea.blob.core.windows.net/assets/e0d2081d-487d-4818-8c59-69b638969f9e/GlobalElectricVehicleOutlook2022.pdf (accessed on 29 August 2022).
  7. The Wall Street Journal. Solar-Powered Electric Vehicles Are Almost Ready to Hit the Road. Available online: https://www.wsj.com/articles/solar-power-electric-vehicle-11635259950 (accessed on 29 August 2022).
  8. Araki, K.; Ji, L.; Kelly, G.; Yamaguchi, M. To Do List for Research and Development and International Standardization to Achieve the Goal of Running a Majority of Electric Vehicles on Solar Energy. Coatings 2018, 8, 251. [Google Scholar] [CrossRef] [Green Version]
  9. Commault, B.; Duigou, T.; Maneval, V.; Gaume, J.; Chabuel, F.; Voroshazi, E. Overview and Perspectives for Vehicle-Integrated Photovoltaics. Appl. Sci. 2021, 11, 11598. [Google Scholar] [CrossRef]
  10. The Ford Motor Company. Let The Sun In: Ford C-MAX Solar Energi Concept Goes Off The Grid, Gives Glimpse Of Clean Vehicle Future. Available online: https://media.ford.com/content/fordmedia/fna/us/en/news/2014/01/02/let-the-sun-in--ford-c-max-solar-energi-concept-goes-off-the-gri.html (accessed on 29 August 2022).
  11. Lightyear. Lightyear One. Available online: https://lightyear.one/lightyear-one (accessed on 29 August 2022).
  12. Hyundai Motor Group. Everything about the Sonata Hybrid’s Solar Roof. Available online: https://tech.hyundaimotorgroup.com/article/everything-about-the-sonata-hybrids-solar-roof (accessed on 29 August 2022).
  13. Aptera Motors. Luna. Available online: https://aptera.us/ (accessed on 29 August 2022).
  14. Sono Motors. Sion. Available online: https://sonomotors.com/en/sion/ (accessed on 29 August 2022).
  15. Mercedes-Benz. VISION EQXX: The New Benchmark of Efficiency. Available online: https://www.mercedes-benz.com/en/vehicles/passenger-cars/concept-cars/vision-eqxx-the-new-benchmark-of-effiency/ (accessed on 29 August 2022).
  16. Toyota Motor Corporation. The All-New 2023 bZ4X. Available online: https://global.toyota/en/newsroom/toyota/36254760.html (accessed on 29 August 2022).
  17. Shepero, M.; Munkhammar, J.; Widén, J.; Bishop, J.D.K.; Boström, T. Modeling of Photovoltaic Power Generation and Electric Vehicles Charging on City-Scale: A Review. Renew. Sustain. Energy Rev. 2018, 89, 61–71. [Google Scholar]
  18. Sierra Rodriguez, A.; de Santana, T.; MacGill, I.; Ekins-Daukes, N.J.; Reinders, A. A Feasibility Study of Solar PV-Powered Electric Cars Using an Interdisciplinary Modeling Approach for the Electricity Balance, CO2 Emissions, and Economic Aspects: The Cases of The Netherlands, Norway, Brazil, and Australia. Prog. Photovolt. Res. Appl. 2020, 28, 517–532. [Google Scholar] [CrossRef] [Green Version]
  19. Kobashi, T.; Yoshida, T.; Yamagata, Y.; Naito, K.; Pfenninger, S.; Say, K.; Takeda, Y.; Ahl, A.; Yarime, M.; Hara, K. On the Potential of “Photovoltaics + Electric Vehicles” for Deep Decarbonization of Kyoto’s Power Systems: Techno-Economic-Social Considerations. Appl. Energy 2020, 275, 517–532. [Google Scholar] [CrossRef]
  20. Stauch, A. Does Solar Power Add Value to Electric Vehicles? An Investigation of Car-Buyers’ Willingness to Buy Product-Bundles in Germany. Energy Res. Soc. Sci. 2021, 75, 102006. [Google Scholar] [CrossRef]
  21. Thiel, C.; Gracia Amillo, A.; Tansini, A.; Tsakalidis, A.; Fontaras, G.; Dunlop, E.; Taylor, N.; Jäger-Waldau, A.; Araki, K.; Nishioka, K.; et al. Impact of Climatic Conditions on Prospects for Integrated Photovoltaics in Electric Vehicles. Renew. Sustain. Energy Rev. 2022, 158, 112109. [Google Scholar] [CrossRef]
  22. Sato, D.; Masuda, T.; Tomizawa, R.; Yamada, N. Theoretical Concentration Limit and Maximum Annual Optical Efficiency of Static/Low-Concentration CPV for Horizontal Integration to Vehicle Bodies. Opt. Express 2022, 30, 846–863. [Google Scholar] [CrossRef]
  23. Araki, K.; Ota, Y.; Nishioka, K.; Tobita, H.; Ji, L.; Kelly, G.; Yamaguchi, M. Toward the Standardization of the Car-Roof PV-The Challenge to the 3-D Sunshine Modeling and Rating of the 3-D Continuously Curved PV Panel. In Proceedings of the 2018 IEEE 7th World Conference on Photovoltaic Energy Conversion (WCPEC) (A Joint Conference of 45th IEEE PVSC, 28th PVSEC & 34th EU PVSEC), Waikoloa, HI, USA, 10–15 June 2018. [Google Scholar]
  24. Araki, K.; Lee, K.H.; Masuda, T.; Hayakawa, Y.; Yamada, N.; Ota, Y.; Yamaguchi, M. Rough and Straightforward Estimation of the Mismatching Loss by Partial Shading of the PV Modules Installed on an Urban Area or Car-Roof. In Proceedings of the 2019 IEEE 46th Photovoltaic Specialists Conference (PVSC), Chicago, IL, USA, 16–21 June 2019. [Google Scholar]
  25. Araki, K.; Ota, Y.; Yamaguchi, M. Measurement and Modeling of 3D Solar Irradiance for Vehicle-Integrated Photovoltaic. Appl. Sci. 2020, 10, 872. [Google Scholar] [CrossRef] [Green Version]
26. Lodi, C.; Seitsonen, A.; Paffumi, E.; de Gennaro, M.; Huld, T.; Malfettani, S. Reducing CO2 Emissions of Conventional Fuel Cars by Vehicle Photovoltaic Roofs. Transp. Res. Part D Transp. Environ. 2018, 59, 313–324. [Google Scholar] [CrossRef]
  27. Ota, Y.; Masuda, T.; Araki, K.; Yamaguchi, M. A Mobile Multipyranometer Array for the Assessment of Solar Irradiance Incident on a Photovoltaic-Powered Vehicle. Solar Energy 2019, 184, 84–90. [Google Scholar] [CrossRef]
  28. Ota, Y.; Araki, K.; Nagaoka, A.; Nishioka, K. Evaluating the Output of a Car-Mounted Photovoltaic Module under Driving Conditions. IEEE J. Photovolt. 2021, 11, 1299–1304. [Google Scholar] [CrossRef]
  29. Ota, Y.; Araki, K.; Nagaoka, A.; Nishioka, K. Curve Correction of Vehicle-Integrated Photovoltaics Using Statistics on Commercial Car Bodies. Prog. Photovolt. Res. Appl. 2022, 30, 152–163. [Google Scholar] [CrossRef]
  30. Wetzel, G.; Salomon, L.; Krugener, J.; Peibst, R. Specifications for Maximum Power Point Tracking in Vehicle-Integrated Photovoltaics Based on High-Resolution Transient Irradiance Measurements. In Proceedings of the Conference Record of the IEEE Photovoltaic Specialists Conference, Calgary, AB, Canada, 15 June–21 August 2020; pp. 1124–1126. [Google Scholar]
  31. Wetzel, G.; Salomon, L.; Krügener, J.; Bredemeier, D.; Peibst, R. High Time Resolution Measurement of Solar Irradiance onto Driving Car Body for Vehicle Integrated Photovoltaics. Prog. Photovolt. Res. Appl. 2022, 30, 543–551. [Google Scholar] [CrossRef]
  32. Ng, C.W.X.; Zhang, J.; Tay, S.E.R. A Tropical Case Study Quantifying Solar Irradiance Collected on a Car Roof for Vehicle Integrated Photovoltaics towards Low-Carbon Cities. In Proceedings of the Conference Record of the IEEE Photovoltaic Specialists Conference, Calgary, AB, Canada, 15 June–21 August 2020; pp. 1124–1126. [Google Scholar]
  33. Kutter, C.; Alanis, L.E.; Neuhaus, D.H.; Heinrich, M. Yield potential of vehicle integrated photovoltaics on commercial trucks and vans. In Proceedings of the 38th European Photovoltaic Solar Energy Conference and Exhibition (EU PVSEC 2021), Online, 6–10 September 2021. [Google Scholar]
34. Oh, M.; Kim, S.M.; Park, H.D. Estimation of Photovoltaic Potential of Solar Bus in an Urban Area: Case Study in Gwanak, Seoul, Korea. Renew. Energy 2020, 160, 1335–1348. [Google Scholar] [CrossRef]
35. Kim, H.; Ku, J.; Kim, S.M.; Park, H.D. A New GIS-Based Algorithm to Estimate Photovoltaic Potential of Solar Train: Case Study in Gyeongbu Line, Korea. Renew. Energy 2022, 190, 713–729. [Google Scholar] [CrossRef]
  36. Lv, M.; Guan, N.; Ma, Y.; Ji, D.; Knippel, E.; Liu, X.; Yi, W. Speed Planning for Solar-Powered Electric Vehicles. In Proceedings of the 7th International Conference on Future Energy Systems, e-Energy 2016, Waterloo, ON, Canada, 21–24 June 2016. [Google Scholar]
37. Liu, Z.; Yang, A.; Gao, M.; Jiang, H.; Kang, Y.; Zhang, F.; Fei, T. Towards Feasibility of Photovoltaic Road for Urban Traffic-Solar Energy Estimation Using Street View Image. J. Clean. Prod. 2019, 228, 303–318. [Google Scholar] [CrossRef]
  38. Hasicic, M.; Bilic, D.; Siljak, H. Criteria for Solar Car Optimized Route Estimation. Microprocess. Microsyst. 2017, 51, 289–296. [Google Scholar] [CrossRef] [Green Version]
39. Jiang, L.; Hua, Y.; Ma, C.; Liu, X. SunChase: Energy-Efficient Route Planning for Solar-Powered EVs. In Proceedings of the 2017 IEEE 37th International Conference on Distributed Computing Systems, Atlanta, GA, USA, 5–8 June 2017. [Google Scholar]
  40. Schuss, C.; Fabritius, T.; Eichberger, B.; Rahkonen, T. Energy-efficient Routing of Electric Vehicles with Integrated Photovoltaic Installations. In Proceedings of the 2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Dubrovnik, Croatia, 25–28 May 2020. [Google Scholar]
  41. Zhou, P.; Wang, C.; Yang, Y. Design and Optimization of Solar-Powered Shared Electric Autonomous Vehicle System for Smart Cities. IEEE Trans. Mob. Comput. 2021, 20, 1. [Google Scholar] [CrossRef]
  42. Choi, Y.; Kang, B.; Kang, M.; Kim, M.; Lee, S.; Han, S.; Ha, W.; Lee, S.W.; Ahn, S. Analysis of Optimal Location for Campus Solar-Powered Electric Vehicle Parking Lots Using a Fisheye Lens Camera. J. Korean Soc. Miner. Energy Resour. Eng. 2021, 58, 307–318. [Google Scholar] [CrossRef]
  43. Panasonic Corporation. Panasonic’s Photovoltaic Module HIT™ adopted for Toyota Motor’s New Prius PHV. Available online: https://news.panasonic.com/global/press/data/2017/02/en170228-3/en170228-3.html (accessed on 29 August 2022).
  44. Toyota Motor Corporation. NEDO, Sharp, and Toyota to Begin Public Road Trials of Electrified Vehicles Equipped with High-efficiency Solar Batteries. Available online: https://global.toyota/en/newsroom/corporate/28787347.html (accessed on 29 August 2022).
  45. Solar Team Eindhoven. Stella Vita. Available online: https://vita.solarteameindhoven.nl/ (accessed on 29 August 2022).
  46. Sono Motors. Sono Motors Technology Used for Munich’s First Solar Bus—MVG to Put Solar Bus Trailer Into Operation. Available online: https://sonomotors.com/en/press/press-releases/sono-motors-technology-used-for-munichs-first-solar-bus/ (accessed on 29 August 2022).
  47. Mercedes-Benz. Project MAYBACH. Available online: https://www.mercedes-benz.com/en/vehicles/passenger-cars/mercedes-maybach/project-maybach/ (accessed on 29 August 2022).
  48. Fu, P.; Rich, P.M. A Geometric Solar Radiation Model and Its Applications in Agriculture and Forestry. Comput. Electron. Agric. 2002, 37, 25–35. [Google Scholar] [CrossRef]
49. Lee, J.; Stucky, D. On Applying Viewshed Analysis for Determining Least-Cost Paths on Digital Elevation Models. Int. J. Geogr. Inf. Sci. 1998, 12, 891–905. [Google Scholar] [CrossRef]
  50. Solmetric. SunEye 210 Shade Tool. Available online: https://www.solmetric.com/buy210.html (accessed on 29 August 2022).
Figure 1. Procedures of comparing shading matrices generated from 3D models and fisheye images.
Figure 2. Diagrams of hemispherical viewshed analysis. (a) Perspective view of 3D models showing the horizon angle between the survey point and shading obstructions along the LOS at each azimuth angle. (b) Section view of the terrain showing the horizon angle between the survey point and the terrain in the east and west azimuth directions.
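For readers implementing the hemispherical viewshed step illustrated in Figure 2, the sketch below is a minimal illustration, not the authors' code; the function name, the sampled height profile, and the 5 m spacing are assumptions. It shows how the maximum horizon angle toward shading obstructions can be computed along a single line of sight (one azimuth direction).

```python
import math

def max_horizon_angle(surface_heights, step_m, observer_height):
    """Maximum horizon (elevation) angle toward obstructions along one LOS.

    surface_heights : heights (m) of terrain/obstructions sampled at regular
                      intervals away from the survey point along one azimuth.
    step_m          : horizontal spacing (m) between successive samples.
    observer_height : height (m) of the survey point (e.g., car-roof level).
    """
    best = 0.0  # 0 deg corresponds to a flat, unobstructed horizon
    for i, h in enumerate(surface_heights, start=1):
        distance = i * step_m
        angle = math.degrees(math.atan2(h - observer_height, distance))
        best = max(best, angle)
    return best

# Hypothetical profile: a ~20 m building face 15 m away dominates this LOS.
profile = [1.0, 2.0, 20.0, 20.0, 5.0]  # sampled heights every 5 m
print(round(max_horizon_angle(profile, 5.0, 1.5), 1))  # ~51.0 deg
```

Repeating this for every azimuth direction yields the obstruction outline (gray areas) of the skymaps in Figure 3.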
Figure 3. Skymaps generated using (a) hemispherical viewshed technique applied to a 3D model and (b) fisheye lens camera. Gray areas depict shading obstructions. Numbers represent the maximum horizon angle between the survey point and the shading obstruction at each azimuth angle.
Figure 4. Solar position defined by the solar azimuth angle (α), solar zenith angle (θ), and altitude (h).
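A convention worth keeping in mind when reading Figure 4: the solar altitude and the solar zenith angle are complementary, so either quantity can be used to place the sun on the skymaps,

\[
h = 90^{\circ} - \theta .
\]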
Figure 5. Sun-path diagrams overlaid on skymaps generated using (a) 3D model and (b) fisheye image.
Figure 6. Aerial view of the study area. Circular (red), square (yellow), and hexagonal (green) marks represent the 12 survey points surrounded by (A) only roadside trees, (B) roadside trees and buildings, and (C) only buildings.
Figure 7. Photographs of scenery around the 12 survey points captured at approximately 17:00. Case A: survey points surrounded by only roadside trees (a) A1; (b) A2; (c) A3; (d) A4; Case B: survey points surrounded by buildings and trees (e) B1; (f) B2; (g) B3; (h) B4; Case C: survey points surrounded by only buildings (i) C1; (j) C2; (k) C3; (l) C4.
Figure 8. (a) Aerial view and (b) perspective view of 3D building models of the study area.
Figure 9. Skymaps generated from 3D building models overlaid with the sun-path diagram. Gray areas represent buildings around the survey points. Case A: survey points surrounded by only roadside trees (a) A1; (b) A2; (c) A3; (d) A4; Case B: survey points surrounded by buildings and trees (e) B1; (f) B2; (g) B3; (h) B4; Case C: survey points surrounded by only buildings (i) C1; (j) C2; (k) C3; (l) C4.
Figure 10. Fisheye images overlaid with the sun-path diagram. Yellow and green areas represent open sky and shading obstructions (i.e., roadside trees and buildings), respectively. Case A: survey points surrounded by only roadside trees (a) A1; (b) A2; (c) A3; (d) A4; Case B: survey points surrounded by buildings and trees (e) B1; (f) B2; (g) B3; (h) B4; Case C: survey points surrounded by only buildings (i) C1; (j) C2; (k) C3; (l) C4.
Figure 11. Shading matrices obtained at survey points A3 and A4 using 3D models (a,b) and fisheye images (c,d), respectively. Squared error matrices between two shading matrices at survey points (e) A3 and (f) A4.
Figure 12. Mean shading factors and mean squared errors for each month at survey points (a) A1, (b) A2, (c) A3, and (d) A4.
Figure 13. Shading matrices obtained at survey points B1 and B4 using 3D models (a,b) and fisheye images (c,d), respectively. Squared error matrices between two shading matrices at survey points (e) B1 and (f) B4.
Figure 14. Mean shading factors for each month at survey points (a) B1, (b) B2, (c) B3, and (d) B4.
Figure 15. Shading matrices obtained at survey points C2 and C3 using 3D models (a,b) and fisheye images (c,d), respectively. Squared error matrices between two shading matrices at survey points (e) C2 and (f) C3.
Figure 16. Mean shading factors for each month at survey points (a) C1, (b) C2, (c) C3, and (d) C4.
Figure 17. 3D point clouds and 3D tree models at survey points (a) B1 and (b) B4.
Figure 18. Mean shading factors based on 3D models with buildings and trees or fisheye images at survey points (a) B1 and (b) B4.
Table 1. Summary of previous studies on software technologies for SPEV operation.
Sub-Field | References | Methods | Considerations (S.I. / S.E. / R.G. / T.T. / T.F. / E.C. / P.G.)
Speed planning | Lv et al. [36] | Dynamic programming |
Solar PV energy mapping | Liu et al. [37] | GIS-based map overlay |
Route planning | Hasicic et al. [38] | Dijkstra’s algorithm |
Route planning | Jiang et al. [39] | Multi-label correcting algorithm |
Route planning | Schuss et al. [40] | Multi-criteria decision analysis |
Route planning | Zhou et al. [41] | Traveling salesman problem (TSP) |
Optimal parking lot analysis | Choi et al. [42] | Fisheye image |
S.I.: Solar Insolation; S.E.: Shading Effect; R.G.: Road Gradient; T.T.: Travel Time; T.F.: Traffic Flow; E.C.: Energy Consumption; P.G.: PV Generation.
Table 2. Description of factors used for shading matrix calculation.
Type of Factors | Formula | Complexity
24 hourly shading factors for each day of the year | $SF(day, hour) = \dfrac{\sum_{DTS=1}^{N_{DTS}} SI(day, hour, DTS)}{N_{DTS}}$ | 365 days × 24 h
Shading index for all discrete time sectors in the sun-path diagram | $SI(day, hour, DTS) = 0$ if the discrete time sector is shaded, before sunrise, or after sunset; $SI(day, hour, DTS) = 1$ otherwise | 365 days × 24 h × $N_{DTS}$
Number of discrete time sectors for an hour | $N_{DTS} = \dfrac{1\ \text{hour}}{I_{DTS}}$ | -
24 hourly shading factors for each month of the year | $SF(month, hour) = \dfrac{\sum_{day=1}^{N_{days}} SF(day, hour)}{N_{days}}$ | 12 months × 24 h
Month-by-hour shading matrix | $M_{SF}(month, hour) = \begin{bmatrix} SF(Jan, 0) & \cdots & SF(Jan, 23) \\ \vdots & \ddots & \vdots \\ SF(Dec, 0) & \cdots & SF(Dec, 23) \end{bmatrix}$ | -
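The calculation chain in Table 2 maps directly onto array operations. The following sketch is a minimal illustration under our own naming (the function, the 10-minute sector interval, and the random placeholder SI values are assumptions, not the authors' implementation); it aggregates per-sector shading indices into the 12 × 24 month-by-hour shading matrix.

```python
import numpy as np

DAYS_IN_MONTH = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def shading_matrix(shading_index):
    """Aggregate a month-by-hour shading matrix M_SF (12 x 24), per Table 2.

    shading_index : array of shape (365, 24, N_DTS) holding the SI value of
                    every discrete time sector of every hour of every day.
    """
    # SF(day, hour): average SI over the N_DTS sectors of each hour
    sf_day_hour = shading_index.mean(axis=2)              # shape (365, 24)

    # SF(month, hour): average the daily values over the days of each month
    m_sf = np.zeros((12, 24))
    start = 0
    for month, n_days in enumerate(DAYS_IN_MONTH):
        m_sf[month] = sf_day_hour[start:start + n_days].mean(axis=0)
        start += n_days
    return m_sf

# Hypothetical usage with I_DTS = 10 min, i.e., N_DTS = 6 sectors per hour:
rng = np.random.default_rng(0)
si = rng.integers(0, 2, size=(365, 24, 6))                # placeholder SI values
print(shading_matrix(si).shape)                           # (12, 24)
```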
Table 3. Equations for calculation of factors used to compare the shading matrices.
Type of Factors | Formula | Complexity
Mean shading factors (MSF) for each month’s possible duration of sunshine | $MSF(month) = \dfrac{\sum_{hour=sunrise}^{sunset} SF(month, hour)}{N_{P.D.S}}$ | 12 months
Mean shading factors (MSF) over one year | $MSF(annual) = \dfrac{\sum_{month=Jan}^{Dec} MSF(month)}{12}$ | -
Mean shading factors (MSF) during May to October | $MSF(May\text{–}Oct) = \dfrac{MSF(May) + \cdots + MSF(Oct)}{6}$ | -
Mean shading factors (MSF) during November to April | $MSF(Nov\text{–}Apr) = \dfrac{MSF(Nov) + \cdots + MSF(Apr)}{6}$ | -
Squared error matrix between the two shading matrices generated by the two methods | $MSE(month, hour) = (M_{SF1} - M_{SF2})^2 = \begin{bmatrix} (SF_1(Jan, 0) - SF_2(Jan, 0))^2 & \cdots & (SF_1(Jan, 23) - SF_2(Jan, 23))^2 \\ \vdots & \ddots & \vdots \\ (SF_1(Dec, 0) - SF_2(Dec, 0))^2 & \cdots & (SF_1(Dec, 23) - SF_2(Dec, 23))^2 \end{bmatrix}$ | -
Mean squared error (MSE) between two shading matrices for each month (%) | $MSE(month) = \dfrac{\sum_{hour=sunrise}^{sunset} (SF_1(month, hour) - SF_2(month, hour))^2}{N_{P.D.S}} \times 100$ | 12 months
Mean squared error (MSE) between two shading matrices over one year (%) | $MSE(annual) = \dfrac{\sum_{month=Jan}^{Dec} MSE(month)}{12}$ | -
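Likewise, the comparison factors in Table 3 reduce to a few lines of code. The sketch below is again an illustrative outline with assumed names, not the published implementation; it returns the monthly and annual MSE, in percent, between two month-by-hour shading matrices, using the sunrise/sunset hours of Table 4 to bound each month's possible duration of sunshine.

```python
import numpy as np

def monthly_and_annual_mse(m_sf_1, m_sf_2, sunrise, sunset):
    """MSE between two month-by-hour shading matrices, following Table 3.

    m_sf_1, m_sf_2 : arrays of shape (12, 24) with shading factors in [0, 1].
    sunrise, sunset: length-12 sequences of sunrise/sunset hours per month.
    """
    monthly = np.zeros(12)
    for m in range(12):
        hours = range(sunrise[m], sunset[m] + 1)        # possible duration of sunshine
        sq_err = [(m_sf_1[m, h] - m_sf_2[m, h]) ** 2 for h in hours]
        monthly[m] = 100.0 * sum(sq_err) / len(sq_err)  # MSE(month) in %
    return monthly, monthly.mean()                      # (monthly MSEs, annual MSE)

# Hypothetical usage with the sunrise/sunset hours listed in Table 4:
sunrise = [8, 8, 7, 7, 6, 6, 6, 6, 7, 7, 8, 8]
sunset = [16, 17, 17, 17, 18, 18, 18, 18, 17, 16, 16, 16]
rng = np.random.default_rng(1)
mse_by_month, mse_annual = monthly_and_annual_mse(
    rng.random((12, 24)), rng.random((12, 24)), sunrise, sunset)
print(mse_by_month.round(1), round(mse_annual, 1))
```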
Table 4. Sunrise time, sunset time, and possible duration of sunshine during each month in the study area.
Month | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
Sunrise time (h) | 8 | 8 | 7 | 7 | 6 | 6 | 6 | 6 | 7 | 7 | 8 | 8
Sunset time (h) | 16 | 17 | 17 | 17 | 18 | 18 | 18 | 18 | 17 | 16 | 16 | 16
Possible duration of sunshine (h) | 9 | 10 | 11 | 11 | 13 | 13 | 13 | 13 | 11 | 10 | 9 | 9
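The last row of Table 4 appears to count the sunrise and sunset hours inclusively; under that reading (our inference, not stated explicitly in the table), the possible duration of sunshine used as $N_{P.D.S}$ in Table 3 is

\[
N_{P.D.S} = t_{sunset} - t_{sunrise} + 1, \qquad \text{e.g., } 16 - 8 + 1 = 9\ \text{h (January)}, \quad 18 - 6 + 1 = 13\ \text{h (June)}.
\]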
Table 5. Mean shading factors (MSF) and mean squared errors (MSE) over one year, in May–October, and in November–April at the 12 survey points.
Point ID | Annual MSF (%), 3D Model | Annual MSF (%), Fisheye | Annual MSE (%) | May–Oct MSF (%), 3D Model | May–Oct MSF (%), Fisheye | May–Oct MSE (%) | Nov–Apr MSF (%), 3D Model | Nov–Apr MSF (%), Fisheye | Nov–Apr MSE (%)
A1 | 16.9 | 56.9 | 29.4 | 9.2 | 38.3 | 19.2 | 24.6 | 75.5 | 39.6
A2 | 11.4 | 32.1 | 16.5 | 8.1 | 34.0 | 21.0 | 14.7 | 30.1 | 12.0
A3 | 14.8 | 48.7 | 32.9 | 6.7 | 44.0 | 30.7 | 23.0 | 53.3 | 35.0
A4 | 30.1 | 54.1 | 19.1 | 11.5 | 40.3 | 17.9 | 48.6 | 67.9 | 20.4
Avg. (A) | 18.3 | 47.9 | 24.5 | 8.9 | 39.1 | 22.2 | 27.7 | 56.7 | 26.8
B1 | 41.5 | 58.3 | 12.4 | 23.3 | 33.8 | 7.3 | 59.6 | 82.7 | 17.5
B2 | 19.2 | 51.0 | 25.5 | 11.3 | 45.1 | 22.5 | 27.0 | 56.8 | 28.4
B3 | 24.6 | 43.3 | 24.2 | 19.9 | 28.6 | 17.6 | 29.4 | 58.0 | 30.8
B4 | 17.3 | 50.3 | 33.5 | 10.3 | 40.7 | 30.5 | 24.3 | 60.0 | 36.4
Avg. (B) | 25.6 | 50.7 | 23.9 | 16.2 | 37.1 | 19.5 | 35.1 | 64.4 | 28.3
C1 | 81.1 | 74.5 | 2.6 | 67.8 | 58.4 | 3.9 | 94.3 | 90.5 | 1.3
C2 | 78.2 | 74.0 | 3.9 | 73.4 | 69.7 | 5.2 | 83.0 | 78.4 | 2.7
C3 | 72.9 | 71.2 | 0.9 | 75.6 | 75.5 | 0.4 | 70.2 | 66.8 | 1.5
C4 | 75.4 | 76.7 | 1.0 | 54.9 | 57.0 | 1.8 | 95.9 | 96.4 | 0.1
Avg. (C) | 76.9 | 74.1 | 2.1 | 67.9 | 65.2 | 2.8 | 85.9 | 83.0 | 1.4
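The case-level averages in Table 5 are plain arithmetic means over the four survey points of each case; for example, the annual MSE for case A reported in the abstract follows as

\[
\overline{MSE}_{A,\ annual} = \frac{29.4 + 16.5 + 32.9 + 19.1}{4} \approx 24.5\%.
\]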
Table 6. Mean squared errors (MSE) between the shading matrices of the two methods over one year, during May–October, and during November–April, estimated using 3D models with and without tree models at survey points B1 and B4.
Point No. | Contents of the 3D Model | Annual MSE (%) | May–Oct MSE (%) | Nov–Apr MSE (%)
B1 | Only Buildings | 12 | 7 | 18
B1 | Buildings + Trees | 2 | 3 | 2
B1 | % Decrease | 80.7 | 64.6 | 87.4
B4 | Only Buildings | 33 | 31 | 36
B4 | Buildings + Trees | 6 | 8 | 5
B4 | % Decrease | 81.6 | 74.5 | 87.6
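For reference, the percentage-decrease rows follow the usual relative-change formula, presumably evaluated on the unrounded MSE values (the rounded integers shown above would give slightly different figures):

\[
\%\ \text{Decrease} = \frac{MSE_{\text{only buildings}} - MSE_{\text{buildings + trees}}}{MSE_{\text{only buildings}}} \times 100 .
\]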
Table 7. Comparison of shading effect calculation methods of this study and previous studies.
References | Methods | Considerations (Temporal Variation / Spatial Variation / Buildings / Trees)
Araki et al. [23,24,25] | Calculating the shading fraction considering randomly extracted buildings’ heights |
Lodi et al. [26] | Empirical assumption based on the statistical analysis |
Ota et al. [27,28,29] | Calculating the effective shading angles using fisheye images |
Oh et al. [34] | Calculating the hemispherical shading maps using the DSM |
Kim et al. [35] | Calculating the binary shading factor using the DSM |
Ours | Generating shading matrices using 3D models and fisheye images |
● indicates that consideration has been taken into account in the shading effect calculation.