Article

Design and Application Research of a UAV-Based Road Illuminance Measurement System

1 Research Institute of Photonics, Dalian Polytechnic University, Dalian 116039, China
2 Guizhou Zhifu Optical Valley Investment Management Co., Ltd., Bijie 551700, China
3 CQC Standard (Shanghai) Testing Technology Co., Ltd., Shanghai 201114, China
* Authors to whom correspondence should be addressed.
Current address: Lianyungang Center, Taihu Laboratory of Deep-Sea Technological Science, Lianyungang 222000, China.
Automation 2024, 5(3), 407-431; https://doi.org/10.3390/automation5030024
Submission received: 11 June 2024 / Revised: 19 July 2024 / Accepted: 16 August 2024 / Published: 22 August 2024

Abstract

This paper presents a UAV-based road illumination measurement system and evaluates its performance through experiments. The system employs a HUBSAN Zino 2+ UAV, STM32F103RCT6 microcontroller, BH1750 illuminance sensor, and GPS and integrates flight, processing, measurement, cloud platform, obstacle avoidance, communication, and power supply units via the OneNET cloud platform. Both hardware and software designs were implemented, using the Z-score algorithm to handle outliers in illumination data. The system showed a single-point measurement error rate of 1.14% and a MAPE of 5.08% for multi-point measurements. In experiments, the system’s horizontal and vertical illuminance RMSE were 1.92 lx and 1.75 lx, respectively. The real-time visualization interface improved operational efficiency, cutting labor costs by half and time costs by nearly four-fifths. UAV control and monitoring from the roadside ensured safety during measurements. The system’s efficiency and wide measurement range enabled extended experiments, collecting illuminance data across multiple horizontal and vertical planes. This resulted in the creation of both horizontal and innovative vertical-plane illuminance distribution maps. These findings provide valuable data for evaluating road lighting quality, enhancing road traffic safety, and improving road illumination design.

1. Introduction

With the continuous development of road traffic and the increasing volume of traffic, nighttime road lighting has become increasingly important [1]. Adequate road lighting enhances drivers’ perception of road conditions and helps reduce traffic accidents [1,2]. China has strict standards for road lighting, as outlined in the “CJJ 45-2015 Road Lighting Design Standards”, which classify motor vehicle road lighting into expressways, main roads, secondary roads, and branch roads [3]. For asphalt surfaces, the average illuminance (Eh.av) for main roads is 20–30 lx, with a minimum uniformity (UE) of 0.4; secondary roads have an average illuminance of 15–20 lx, with a uniformity of 0.4; and branch roads have an average illuminance of 8–10 lx, with a uniformity of 0.3 (lx is the symbol for lux, the SI unit of illuminance, equal to one lumen per square metre). For concrete surfaces, the average illuminance values are reduced by 30% [3]. The “GB/T 5700-2023 Lighting Measurement Methods” specifies the methods and contents for measuring outdoor lighting. Traditional road lighting detection [4], which involves manually placing points and recording data at each point, is time-consuming, has a low level of automation, and poses safety hazards [5]. As road traffic expands and traffic networks become increasingly complex, traditional road lighting detection methods face numerous challenges [5,6].
In recent years, UAV technology has been rapidly developing and has been widely used in various fields. Fatma Outay et al. provided an overview of recent advances in UAVs for road safety, traffic monitoring, and highway infrastructure management, emphasizing the advantages of UAVs in improving the efficiency and accuracy of detection [7]. In the field of traffic monitoring, Jie Wang et al. proposed a vehicle detection and tracking method based on UAVs and a deep learning framework, achieving 92.1% single-target tracking accuracy and 81.3% multi-target tracking accuracy by capturing traffic flow videos through a camera [8]. Additionally, Mahmud Hossain et al. developed a UAV-based traffic monitoring system for smart cities, capable of real-time traffic condition monitoring and providing effective data support [9]. In the field of target detection and obstacle avoidance, Gulay Unal utilized Kalman filter and background subtraction algorithms to achieve accurate tracking of multiple targets by UAVs [10]. Emre Kiyak realized autonomous obstacle avoidance control of quadcopters by integrating terrestrial robot obstacle avoidance algorithms and adopting fuzzy and PID control techniques, verifying the ability of UAVs to autonomously avoid obstacles in dynamic and complex environments [11]. For highway infrastructure management, Sainab Feroz et al. comprehensively reviewed the application of UAVs combined with remote sensing technology in bridge condition monitoring, exploring the future prospects of various non-destructive testing methods [12]. In Wenzhou, Zhejiang Province, 262 intelligent streetlights equipped with UAV inspection functions were installed on the new Electric City Avenue, which opened to traffic in 2021. These UAVs can undertake most of the timed inspections and emergency inspection tasks during later operation and maintenance work [11,12]. In the field of nighttime lighting environment assessment, Xi Li et al. utilized UAVs to monitor urban nighttime lights and found that the measurement results from UAVs and ground equipment were highly consistent [13]. Luciano Massetti et al. explored the application of UAVs in urban light pollution monitoring, demonstrating a significant correlation between UAV data and traditional ground-based measurements, thereby proving the reliability and efficiency of UAVs in light pollution monitoring [14]. M. T. Vaaja et al. proposed a methodology to integrate UAV imagery with 3D measurement techniques and luminance imaging, verifying the feasibility of 3D luminance point clouds for assessing nighttime roadway lighting conditions [15]. These applications demonstrate the great potential of UAVs in various complex environments, providing technical support and reference for road illumination measurements [7,8,9,10,11,12,13,14,15].
Current research progress in illuminance measurement sensor technology is primarily reflected in the development of automated measurement systems, which utilize motorized vehicles, smart carts, or drones equipped with illuminance meters for mobile inspection [16]. Piotr Jaskowski et al. proposed a mobile system for measuring roadway illuminance, overcoming the limitations of traditional handheld meters and providing a more efficient and accurate measurement method [17]. Additionally, Rémy Guyonneau et al. developed a robotic system named Romulux, which integrates 3D LiDAR and an illuminance meter. This system uses the Robot Operating System (ROS) and Simultaneous Localization and Mapping (SLAM) algorithms to generate indoor illuminance maps and compare them with standards to achieve energy savings [18]. Cheng-Hsien Chen et al. designed a specially equipped vehicle for rapid and efficient measurement of road lighting conditions. The effectiveness of this technique was experimentally validated, showing a high correlation with traditional methods [19]. UAVs are capable of moving flexibly over roads and covering large measurement areas quickly, significantly enhancing measurement efficiency. Przemyslaw Tabaka conducted pilot measurements to quantify and assess the impact of lighting on light pollution by mounting an illuminance meter on a UAV [20]. Shengwei Jia et al. proposed a UAV-based illuminance measurement system employing the LOF algorithm to improve stability and accuracy, which was verified through stadium experiments [21]. Compared to other automated measurement systems, such as ground robots, UAVs offer clear advantages in terms of flexibility and speed [22]. While ground robots can also perform automated measurements, they move slowly and are more restricted by terrain, making them less suitable for complex road environments.
Although significant research has been conducted on the application of UAVs in traffic monitoring, infrastructure management, and lighting environment assessment, their application in road illuminance measurement remains in its infancy [23]. Current research on UAVs in illuminance measurement primarily involves carrying illuminance meters or designing simple measurement devices [24], with few systems specifically tailored for UAVs [25,26]. This paper proposes a high-precision UAV-based illuminance measurement system capable of providing comprehensive and accurate illuminance data in complex road environments through multi-angle measurements and position information acquisition. The system employs the Z-score algorithm to process measurement data, enhancing stability and accuracy, and an integrated obstacle avoidance function ensures system safety. Compared with existing mobile measurement systems, the UAV-based system designed in this study offers significant advantages in flexibility and measurement efficiency, enabling rapid coverage of large areas and providing real-time data analysis and visualization interfaces. Experimental validation in real road scenarios demonstrates the system’s effectiveness in improving field measurement operability, reducing labor and time costs, and ensuring the safety of surveyors. Section 2 details the system architecture and the design and implementation of the illuminance measurement system. Section 3 presents the illuminance acquisition experiments and analyzes the results. Section 4 concludes the paper.
In summary, this research not only fills gaps in existing studies but also offers valuable references for future road lighting management.

2. Materials and Methods

2.1. System Architecture

The UAV-based illuminance collection system is comprised of seven main components: the flight unit, processing unit, measurement unit, obstacle avoidance unit, communication unit, power supply unit, and cloud platform unit [27].
This system conducts road illuminance data collection at night, requiring the UAV to maintain stability during nighttime flights and to achieve fixed-point hovering for measurements. Based on these requirements, the HUBSAN Zino 2+ UAV was ultimately selected as the flight unit for this system. In terms of positioning, the HUBSAN Zino 2+ UAV features both optical flow positioning and dual-mode satellite positioning. At night, when ground brightness is low, optical flow positioning becomes ineffective, making UAVs that primarily rely on optical flow positioning unsuitable for this system’s measurement tasks. Under low-light conditions at night [28], the HUBSAN Zino 2+ employs dual-mode satellite positioning, which enables precise hovering both indoors and outdoors, meeting the requirements of this study [29].
The minimum system of the microcontroller determines whether the system can function properly. This system uses the STM32F103RCT6 microcontroller (STMicroelectronics, Geneva, Switzerland) in an LQFP64 package, supported by clock and reset circuits to form the processing unit [30]. The STM32F103RCT6 is an embedded microcontroller with a 32-bit Cortex-M3 core running at 72 MHz, 256 KB of FLASH program memory, and 48 KB of RAM [31]. It is known for its high performance and low power consumption. Additionally, it offers a wealth of peripherals, including general-purpose timers, multiple serial communication interfaces, and analog-to-digital converters, fully meeting the data acquisition and control requirements of this system [32]. The STM32F103RCT6 forms the core of the processing unit, as illustrated in Figure 1.
For illuminance data collection, a high-accuracy, fast-response, and stable BH1750 illuminance sensor was chosen. The BH1750 is easy to use, requiring only a simple I2C connection and configuration without elaborate calibration. Its high resolution, wide dynamic range, low power consumption, and quick response time make it a widely used high-performance sensor in light measurement applications [27]. To enable switching between horizontal and vertical illuminance measurements, an MG90S servo assists in data collection [33]. This compact and lightweight servo is well suited to weight-sensitive UAV applications and offers high precision, strong torque, fast response, and cost-effectiveness. For accurate scenario mapping in road scenes, illuminance data alone are insufficient; location coordinates at the time of collection are crucial, especially for outdoor road illuminance measurement, hence the inclusion of the NEO-7M GPS module [31]. The NEO-7M, equipped with a rechargeable backup battery, retains configuration parameters after power loss and offers excellent tracking sensitivity, ensuring precise positioning even where typical GPS modules fail. The BH1750 illuminance sensor, MG90S servo, and NEO-7M GPS module together constitute the measurement unit of this system.
When collecting data in road scenarios, the system may encounter vehicles and pedestrians. To intelligently avoid obstacles, two HC-SR04 ultrasonic modules are employed, placed at the front and back of the system, forming the obstacle avoidance unit capable of detecting obstacles both ahead and behind to facilitate timely avoidance [32]. Ultrasonic obstacle avoidance offers good real-time performance, rapid response, precise distance measurement, low power consumption, and strong adaptability [34], as it is insensitive to changes in environmental light and color.
All collected illuminance data, location information, and obstacle data are transmitted to the cloud platform via the WiFi wireless transmission technology using the ESP-01S ESP8266 module as the communication unit [35]. The ESP8266, an integrated Wi-Fi module, supports the 802.11 b/g/n Wi-Fi standard, enabling reliable communication in various network environments and is widely used in IoT, embedded systems, and wireless communication sectors due to its ease of configuration and control with AT commands [36].
The foundation for stable operation of all these units is the power supply unit, which employs the TP4056 charging module and a 3.7 V 2000 mAh lithium battery. The TP4056 provides simple peripheral circuitry, good protection performance, high charging precision, and adjustable output current, enabling in-system battery charging [37]. The lithium battery is characterized by high energy density, lightweight design, long life, low self-discharge rate, stable voltage output, fast charging, absence of memory effect, and high discharge current, making it an ideal energy choice for the system.
The cloud platform unit utilizes China Mobile’s OneNET cloud platform, comprising cloud servers and a visualization interface. As part of China Mobile’s IoT platform, OneNET provides cloud data storage services, supporting long-term preservation and querying of device data. It offers a wealth of development tools and APIs, supports various communication protocols, and provides developers with convenience [38]. The platform’s user-friendly visualization interface aids in monitoring and managing IoT devices.
The structure of these seven units is illustrated in Figure 2.
The flight unit consists of the HUBSAN Zino 2+ UAV, which carries all other units and serves as the dynamic support of the system, facilitating functions such as flight and fixed-point hovering. The processing unit is centered around the STM32F103RCT6 microcontroller, forming the core system that controls all sensors and acts as the central brain. The measurement unit is composed of the BH1750 illuminance sensor, MG90S servo, and GPS module. The illuminance sensor measures road illuminance, the servo switches between horizontal and vertical illuminance measurements, and the GPS module provides real-time positioning of the UAV and measurement points. The illuminance sensor and GPS module communicate with the processing unit using the I2C and UART serial communication protocols [39,40], respectively, while the servo requires only a single signal line and is driven by PWM control [41].
The obstacle avoidance unit consists of two ultrasonic modules that collect real-time distances to obstacles such as vehicles and pedestrians for avoidance, communicating with the processing unit via square wave communication signals. The communication unit is made up of the ESP8266 WiFi module, which connects to the internet via WiFi and sends data from the MCU to the cloud server, using UART serial communication protocol to connect with the processing unit (MCU) [42].
The power supply unit comprises a TP4056 charging module and a lithium battery, providing stable, continuous, and safe power to the entire system. The cloud platform unit consists of cloud servers and a visualization interface. The cloud server communicates with the ESP8266 using the MQTT protocol, enabling data upload and command issuance [38]. Upon receiving data packets, the cloud server performs real-time analysis and storage of the data, and the collected data is presented through the visualization interface, allowing users to monitor and control devices in real-time.
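As a concrete illustration of the data upload described above, the snippet below packages one processed measurement as the kind of JSON payload an MQTT client might publish to the cloud platform. The datastream names and the nested data-point layout here are illustrative assumptions for this sketch, not OneNET's documented format or the exact payload used by the authors' firmware.

```python
import json

def make_payload(illuminance_lx, lat, lon):
    """Package one processed measurement for upload.

    The datastream ids ("illuminance", "location") and the data-point
    layout are hypothetical placeholders chosen for this example.
    """
    return json.dumps({
        "datastreams": [
            {"id": "illuminance", "datapoints": [{"value": illuminance_lx}]},
            {"id": "location", "datapoints": [{"value": {"lat": lat, "lon": lon}}]},
        ]
    })

# One measurement point: 87 lx at an example coordinate near Dalian.
payload = make_payload(87, 38.914, 121.615)
```

A payload like this would be published on the device's data topic over MQTT, with the cloud server parsing it for storage and visualization.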
The structure is illustrated in Figure 3, which also shows inter-unit communications.

2.2. System Software Design

The software design of this system is divided into two main areas: cloud platform development and system program design. Initially, the cloud platform development involves several steps: creating devices and data streams, obtaining the device’s API key, configuring the ESP8266, creating a visualization interface in the OneNET console, configuring visualization elements to bind to the device data streams, and monitoring via the cloud platform [43]. The specific process is illustrated in Figure 4.
For the system program design, the Keil μVision5 development tool is utilized, and it is divided into three parts: communication programming, illuminance collection programming, and obstacle avoidance programming.
The communication programming segment is a critical component of the design, enabling the uploading of collected data and the issuance of control commands. This part includes WiFi connection and cloud platform integration. The specific programming process flow is depicted in Figure 5.
Illuminance collection programming is the core of the system’s software design, involving illuminance data collection, switching between horizontal and vertical illuminance, control of measurement positioning, and sending location information. The detailed programming process flow is shown in Figure 6.
Obstacle avoidance programming ensures the safety of the entire system, implementing functions such as obstacle measurement and sending alert messages. The programming layout is shown in Figure 7.

2.3. System Outlier Handling

In the process of collecting road illuminance data with drones, anomalies may occur due to sensor errors, equipment malfunctions, environmental changes, and flight vibrations. These anomalies can impact subsequent analysis and have negative consequences. To address the situation where data points for individual measurement points are limited, we employ the lightweight Z-Score algorithm [44]. This algorithm measures the deviation of data points from the mean of the dataset, helping us identify potential anomalies.
Z-score is a standardization method used to measure the position of data points within a dataset. It calculates the relative position of a data point by dividing the difference between the original data point and the mean of the dataset by the standard deviation. Here is a detailed description of the Z-score algorithm [45].
Step 1: Calculate Mean and Standard Deviation
For a given illuminance dataset, calculate the mean and standard deviation of the data:
$$\mathrm{Mean} = \frac{1}{n}\sum_{i=1}^{n} x_i$$
$$\mathrm{Standard\ Deviation} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \mathrm{Mean}\right)^2}$$
where $n$ is the number of data points and $x_i$ is the value of the $i$-th data point.
Step 2: Calculate Z-Score
For each illuminance data point x i , calculate its Z-score using the following formula:
$$Z_i = \frac{x_i - \mathrm{Mean}}{\mathrm{Standard\ Deviation}}$$
Z-score represents the deviation of a data point from the mean in terms of standard deviations. If the Z-score is positive, it indicates that the data point is above the mean; if negative, it indicates that the data point is below the mean.
Step 3: Set Threshold
The threshold for Z-score is typically chosen between 2 and 3. This threshold is used to determine whether data points deviate from the normal range, indicating potential outliers. For the dynamic measurement application of this system, a threshold of 2 is chosen [44,45].
Step 4: Identify Outliers
Inspect the calculated Z-scores and identify values that exceed the set threshold. These data points are considered outliers.
Step 5: Handle Outliers
Based on application requirements, there are three common ways to handle outliers: removing them, replacing with the mean or median, and interpolation. This system adopts the approach of removing outliers without considering their impact on subsequent analysis.
The outlier handling process designed according to the five steps mentioned above is illustrated in Figure 8.
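The five steps above can be sketched in a few lines of Python. The sample readings and the helper name are illustrative only; the system itself runs this logic in firmware on the STM32.

```python
import statistics

def zscore_filter(samples, threshold=2.0):
    """Steps 1-5: drop samples whose |Z-score| exceeds the threshold."""
    mean = statistics.fmean(samples)
    # Population standard deviation (divide by n), matching the formula above.
    sd = statistics.pstdev(samples)
    if sd == 0:  # all samples identical: nothing can be an outlier
        return list(samples)
    return [x for x in samples if abs((x - mean) / sd) <= threshold]

# Hypothetical hover readings (lx) with one spike from UAV shake:
readings = [86, 88, 87, 85, 86, 84, 88, 87, 115, 86]
clean = zscore_filter(readings)   # the 115 lx spike is removed
```

With the threshold of 2 used by this system, the 115 lx spike exceeds two standard deviations from the mean and is discarded; the remaining samples are then averaged into the single value reported for the measurement point.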
During the measurement of road illuminance data, the cloud platform takes approximately 1 s to issue a command, and data upload takes about 1 s as well. With the BH1750 illuminance sensor set to continuous measurement mode at its highest sampling rate, the system can measure close to 10 data points per second. Outlier handling and analysis of these 10 data points yield a precise measurement for each measurement point. After outlier processing, a single illuminance value is obtained and sent back to the cloud platform for visualization. Running the outlier handling algorithm on the STM32 microcontroller reduces the frequency of data transmission and conserves resources.

2.4. System Integration

Based on the system design methods described above, system integration involves both the physical assembly and cloud platform development. Given the UAV’s limited load-bearing capacity, the physical system must be compact and lightweight. Simply connecting the units using breadboards or prototyping boards does not meet the safety, stability, and lightness requirements of this system. Therefore, circuit schematics and PCB layouts are designed using Altium Designer software (version 2021). Components are soldered onto the circuit board one by one, and then the units are integrated to form the physical hardware of the system, as detailed in Figure 9, which shows the placement of various hardware modules and components within the system.
The visualization interface developed on the cloud platform is shown in Figure 10. It features a dashboard and graphs that can display current illuminance data, which can be refreshed every second. An information alert box displays whether the UAV’s position is safe in real time; two buttons control the start and end of illuminance data collection, as well as switch between horizontal and vertical illuminance measurements; a map displays the real-time location of the UAV illuminance measurement system; and the platform also displays latitude and longitude data in real time.
The illuminance measurement range of the system is 0–65,535 lx. The UAV has a single flight endurance of approximately 34 min, with a data transmission distance of up to 100 m. The system achieves an illuminance acquisition time as fast as 120 milliseconds and a GPS position refresh rate of 1 s. Illuminance and GPS data are uploaded to the cloud platform every second. The servo’s data response time is 0.5 s, and the system has an obstacle avoidance distance of approximately 5 m.

2.5. System Test

After the system integration is completed, accuracy and stability tests are conducted on the system, divided into two parts: single-point measurement to verify the system’s outlier handling performance, and multi-point measurement to verify the accuracy and stability of the system’s multi-point measurement capabilities [46].
For the first test, single-point measurement involves slight adjustments to the outlier handling program, allowing the UAV to hover at a measurement point for 60 s to collect illuminance data, transmitting unprocessed illuminance data every second, and sending real-time GPS location information. On the cloud platform, 60 illuminance data points can be refreshed, and the system records the current location information, processes outliers in the returned data, and calculates the illuminance value. To avoid the influence of factors such as daylight, the experiment is conducted after 9 PM under a streetlight, as shown in Figure 11.
During the fixed-point hovering measurement, outliers may occur due to UAV shaking and environmental changes. We use the Z-score algorithm to process the data, thereby enhancing measurement stability. First, the mean and standard deviation of the collected illuminance data are calculated as mean = 85.1 and standard deviation = 10.0. The Z-scores of data points 3, 4, 7, 8, 10, and 11 have absolute values greater than the system threshold of 2, so these six points are identified as outliers, marked in red in the figure, while normal values are shown in blue. After excluding the outliers, the average illuminance at the current measurement point is approximately 87 lx, while the SPIC-200 light meter measures 88 lx at the same point, giving a measurement error rate of 1.14% for this system at this point.
For the second test, multi-point measurement involves the UAV flying to multiple road measurement points to collect illuminance data and transmitting real-time GPS location information for each measurement point. Simultaneously, a SPIC-200 light meter is manually used to measure standard illuminance data at these points. The comparison of the two sets of data verifies the accuracy and stability of the system in multi-point measurement conditions. This experiment is also conducted after 9 PM, with the comparison of the two sets of illuminance data shown in Figure 12.
The mean absolute percentage error (MAPE) is chosen as the metric to evaluate the system’s measurement accuracy, calculated as follows [47]:
$$\mathrm{MAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{\hat{y}_i - y_i}{y_i}\right|$$
where $n$ is the number of measurement points, $y_i$ is the standard illuminance value measured by the SPIC-200 illuminance meter, and $\hat{y}_i$ is the illuminance value obtained by the UAV illuminance measurement system. By calculation, the MAPE of this dataset is 5.08%.
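For reference, the MAPE metric can be computed as a short helper; the paired readings below are hypothetical examples, not the experiment's data.

```python
def mape(measured, reference):
    """Mean absolute percentage error (%) of measured vs. reference readings."""
    return 100.0 * sum(abs(m - r) / r
                       for m, r in zip(measured, reference)) / len(reference)

# Hypothetical paired readings (lx): UAV system vs. SPIC-200 meter.
uav_lx  = [87.0, 34.0, 12.5, 60.0]
spic_lx = [88.0, 33.0, 12.0, 62.0]
error_pct = mape(uav_lx, spic_lx)   # a few percent for readings this close
```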
Thus, the accuracy and stability tests for this system are completed. In single-point measurements, the system’s measurement error rate is only 1.14%, with the Z-score algorithm showing stable outlier handling performance. In multi-point measurements, the system’s MAPE is 5.08%. Image analysis also shows a small discrepancy between the two. In conclusion, the UAV-based illuminance measurement system designed in this study is suitable for road illuminance measurements, with high stability and accuracy.

3. Results

3.1. Illuminance Collection Human–Machine Comparison Experiment

After completing system integration and testing, an experiment was conducted to collect road illuminance data on a selected road in Dalian. This is a single-lane road with single-sided lighting, a lane width of 4.5 m, and streetlights at a height of 3 m. The road runs east–west, with a distance of 31 m between the first and second streetlights and 33 m between the second and third. The lighting layout is illustrated in Figure 13. In this experiment, we used both the UAV-based road illuminance measurement system designed in this study and traditional manual road illuminance measurement methods. To avoid the influence of daylight and other factors, the illuminance data collection was conducted after 9 PM.
Based on the GB/T 5700-2023 “Illumination Measurement Methods” standard, and taking into account the actual conditions of the road, the area between the first and second lamp posts was selected for measurement. According to the center point method, the measurement area was divided into 30 rectangles, each 3.1 m long and 1.5 m wide, with the center of each rectangle serving as a measurement point, marked with a yellow marker to ensure full coverage between the two lamps. The layout of the measurement points is illustrated in Figure 14.
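Under the dimensions just described (31 m lamp spacing, 4.5 m lane width, a 10 × 3 grid of 3.1 m × 1.5 m rectangles), the center point method's measurement coordinates can be generated with a quick sketch. The (across-road, along-road) coordinate convention and the function name are our own choices for illustration.

```python
def center_points(length=31.0, width=4.5, rows=10, cols=3):
    """Centers of the rows x cols rectangles between the two lamp posts."""
    dy, dx = length / rows, width / cols      # 3.1 m x 1.5 m rectangles
    return [((j + 0.5) * dx, (i + 0.5) * dy)  # (across-road, along-road) in m
            for i in range(rows) for j in range(cols)]

points = center_points()   # 30 measurement points between the two lamps
```

The first point lies 0.75 m into the lane and 1.55 m from the first lamp post, and the grid covers the full area between the two lamps.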
Firstly, manual measurements were taken along the road by four individuals: two measuring and recording data, and two ensuring on-site safety and monitoring passing vehicles to assist in observing any deviations in measurement. Horizontal illuminance at ground level and vertical illuminance at 1.5 m were measured. For ground-level horizontal illuminance, the measurement device was placed at each point, taking about 30 s from setup to data recording; for 1.5 m vertical illuminance, a tape measure was used to mark the 1.5 m height, and the lux meter was held at each point for another 30 s. These two steps were repeated at every measurement point, and the entire manual recording process took 40 min, yielding 30 sets of horizontal and vertical illuminance data. This manual measurement process was physically demanding for the personnel involved.
Subsequently, after the manual measurements, the UAV-based illuminance measurement system was used to collect data from the mentioned 30 measurement points. This collection was completed by two people, one controlling the UAV and adjusting its flight and measurement positions, and the other monitoring the visualization terminal to control the start and end of the illuminance collection and switch between horizontal and vertical illuminance measurements.
During the collection process, the UAV continuously transmitted and recorded current flight positions, altitudes, and image data. The UAV paused for about 5 s at each measurement point, then used the Z-score algorithm to remove outliers, averaging the remaining valid data as the illuminance value for that point. These processed illuminance data along with GPS location data were sent back to the cloud platform and displayed on the visualization terminal.
The UAV-based measurement method was similarly applied to the 30 measurement points for both ground-level horizontal and 1.5 m vertical illuminance, completing in about 8 min. Based on the measured illuminance data and GPS location information, horizontal and vertical illuminance distribution diagrams were drawn, as shown in Figure 15 and Figure 16 [48].
The road illuminance distribution diagram uses road width as the x-axis, the distance between two lamps as the y-axis, and illuminance values as the z-axis, depicting the distribution of road illuminance. The horizontal illuminance distribution diagram reflects the overall illumination on the ground, showing higher illuminance values near the lamps and significantly lower values between them, due to the limited radiation range of the lamps. The vertical illuminance distribution diagram highlights the vertical distribution of lighting, similar to the horizontal distribution, with higher illuminance near street lamps and lower between them [48]. These distribution diagrams provide a more intuitive understanding of spatial changes in road illuminance, offering robust data support for in-depth analysis of road lighting.
Comparing manually measured illuminance data with UAV-collected data, the root mean square error (RMSE) for horizontal illuminance is 1.92 lx and for vertical illuminance 1.75 lx, indicating small errors and high stability and accuracy of the system [49].
$RMSE = \sqrt{\dfrac{1}{n}\sum_{i=1}^{n}\left(x_{\mathrm{manual},i} - x_{\mathrm{SPIC},i}\right)^{2}}$
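The formula above can be evaluated directly over the paired manual and UAV readings; a short sketch (the sample values in the test are illustrative, not the paper's measurements):

```python
import math

def rmse(manual, system):
    """Root mean square error between paired manual and UAV readings (lx)."""
    if len(manual) != len(system):
        raise ValueError("readings must be paired point by point")
    n = len(manual)
    return math.sqrt(sum((m - s) ** 2 for m, s in zip(manual, system)) / n)
```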
Notably, the system's real-time visualization interface enhances data-processing efficiency: subsequent detailed analyses require only retrieving data from the server, avoiding the tedious process of manual data entry. This feature plays a significant role in improving work efficiency and reducing human error.
In terms of testing efficiency, traditional manual measurements requiring four people and 40 min can be completed by this system with two people in 8 min, saving half the manpower and about 80% of the time, significantly enhancing on-site measurement efficiency.
In terms of safety, traditional manual measurements require personnel to be on the road from setup to completion, whereas with this system, measurement personnel simply control and monitor the UAV from the roadside, greatly enhancing the safety of road measurements and avoiding errors due to obstruction of light during manual measurement.

3.2. UAV Extended Measurement Experiment

Based on the human–machine comparison experiments discussed above, the UAV-based illuminance measurement system significantly enhances testing efficiency compared with traditional manual methods. Traditionally, illuminance testing has been confined to the road surface, merely evaluating the distribution of illuminance across it; such analysis is limited because it does not comprehensively study the spatial illuminance, or light flow, of the road. The equipment used in this study enables measurements at heights difficult for manual methods to reach, such as 2 m, 3 m, and 5 m. Given these limitations and the system's advantages, this study uses the UAV for extended measurement experiments, conducting multi-layered, multi-dimensional research on road illuminance flow at various heights and in various directions.
To better study the illuminance light flow on the road, we measured the area covered by three street lamps rather than just the span between two. Specifically, we expanded the measurement area by adding three measurement points beyond each of the two edge street lamps, extending it from the original 31 m to 81.6 m. The system's data transmission range is 100 m; with the surveyor positioned at the center of the test area, the effective coverage spans up to 200 m, so the 81.6 m measurement area is well within the system's capability. The UAV can conduct measurements at different heights and in various directions within this extended area. Figure 17 illustrates the schematic diagram of the road illuminance measurement using UAVs.
The diagram indicates the drone's flight direction and path during measurement, with a measurement point every 3.1 m along the path, totaling 26 points per line and 78 points across the entire area. From a horizontal perspective, the drone can measure illuminance at various heights, determined by the street-lamp height. Vertically, the drone can measure the road's longitudinal vertical planes, with the specific vertical divisions based on the road's lane distribution and lighting conditions. For this road, vertical planes 1–3 are set at 0.75 m, 2.25 m, and 3.75 m from the eastern edge.
Study of Horizontal Illuminance Distribution:
Initially, the road's horizontal illuminance was studied. With the street lamps 3 m high, the drone measured at ground level, 1 m, 1.5 m, and 2 m. The illuminance data and GPS information collected by the drone were used to plot the illuminance distribution curves at the different heights, as shown in Figure 18a–d.
Figure 18a represents the illuminance distribution at ground level, with a maximum illuminance (Emax) of 138 lx. The average illuminance (Eav), calculated using the zero-replacement method, is 23.46 lx, and the illuminance uniformity (U) is 0.0213. This figure exhibits the most uniform illuminance distribution among all figures [50]. Although the maximum illuminance is lower compared to the other figures, the peaks and troughs of illuminance across the road surface are relatively smooth.
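The reported statistics can be reproduced from a grid of readings. As a hedged sketch: the product Eav × U is ≈ 0.5 lx at every height reported here, which is consistent with the uniformity being defined as U = Emin/Eav with a 0.5 lx minimum, so the code below assumes that definition and interprets the zero-replacement method as substituting a small floor value for zero readings before averaging; both are interpretations, since the paper does not define them explicitly.

```python
def illuminance_stats(readings, zero_floor=0.5):
    """Summarise a grid of illuminance readings (lx).

    Zero readings are replaced with `zero_floor` before averaging --
    one plausible reading of the "zero-replacement method" (assumption).
    U is taken as the overall uniformity ratio Emin/Eav (assumed
    definition, consistent with the reported Eav * U products).
    """
    replaced = [r if r > 0 else zero_floor for r in readings]
    e_max = max(replaced)
    e_min = min(replaced)
    e_av = sum(replaced) / len(replaced)
    return {"Emax": e_max, "Eav": e_av, "U": e_min / e_av}
```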
Figure 18b represents the illuminance distribution at a height of 1 m. The maximum illuminance (Emax) is 214 lx, the average illuminance (Eav) is 25.8 lx, and the illuminance uniformity (U) is 0.0194. It can be observed that the illuminance becomes more concentrated as height increases, leading to a rise in the maximum illuminance and a slight decrease in uniformity.
Figure 18c further illustrates the illuminance distribution at a height of 1.5 m, with a maximum illuminance (Emax) of 252 lx, an average illuminance (Eav) of 29.37 lx, and an illuminance uniformity (U) of 0.017. At this height, illuminance fluctuations become more pronounced, especially near the light source, with noticeable sharp peaks and broader troughs in the illuminance distribution, indicating the emergence of hotspots and shadowed areas.
Figure 18d shows the illuminance distribution at a height of 2 m, with a maximum illuminance (Emax) of 384 lx, an average illuminance (Eav) of 32.14 lx, and an illuminance uniformity (U) of 0.0156. Significant fluctuations in illuminance can be observed at this height, suggesting that the light source may be overly concentrated or unevenly distributed, resulting in sharp peaks and extensive troughs.
The analysis indicates that as the measurement height increases, illuminance uniformity decreases, with peaks becoming steeper and troughs widening. While lower levels exhibit better uniformity, they also show lower maximum illuminance values. This suggests that improvements in uniformity may come at the cost of maximum illuminance. In certain areas, this could mean that the lighting intensity may fall below safety standards or design requirements. In road lighting design, it is crucial to avoid extreme variations in illuminance, particularly in areas with frequent vehicle and pedestrian traffic. Excessive illuminance can cause glare, while insufficient illuminance can impact the visual comfort and safety of pedestrians and drivers.
Study of Vertical Illuminance Distribution:
This study next explored the vertical illuminance of the road in question. The preceding experiments completed comprehensive measurements of horizontal illuminance; building on these data, we further analyzed and evaluated the road's illuminance from a vertical perspective. Since illuminance at higher spatial positions is challenging to measure, comparable vertical illuminance distribution charts are rare in the existing literature.
We divided the road's measurement area into three vertical planes, as shown in Figure 17. The horizontal illuminance distributions across these vertical planes are illustrated in Figure 19a–c. These charts make it easy to identify the coverage range of each light source, which is highly useful for assessing street-lamp arrangements and their impact on surrounding-area lighting.
Figure 19a: vertical plane 1. As this plane is closest to the streetlight, the highest illuminance value is substantial, with Emax reaching 384 lx. The average illuminance (Eav) is 51.92 lx, and the illuminance uniformity (U) is 0.0096. This indicates very strong lighting directly beneath or near the light source, as expected. However, the concentration of high illuminance in a narrow area can result in an uneven light distribution, creating a sharp gradient from high to low illuminance. This rapid transition may disrupt the vision of drivers and pedestrians.
Figure 19b: vertical plane 2. The highest illuminance value (Emax) is reduced to 150 lx, with an average illuminance (Eav) of 26.82 lx and an illuminance uniformity (U) of 0.0186. This indicates that the illumination weakens in areas farther from the streetlights. This reduction in concentration improves the uniformity of the illuminance distribution, although significant areas of high illuminance remain, suggesting relatively large spacing between streetlights.
Figure 19c: vertical plane 3. The furthest vertical plane shows a further decrease in the maximum illuminance value (Emax) to 50 lx, with an average illuminance (Eav) of 8.56 lx and an illuminance uniformity (U) of 0.0584. This suggests that the intensity of illumination significantly diminishes with increased distance from the streetlights, potentially leading to under-illumination. Practically, this may indicate the need for additional streetlights or adjustments to the existing layout to ensure adequate lighting.
From these three figures, it is evident that the road’s illumination is concentrated in specific areas, leading to uneven distribution. As the vertical plane moves further from the streetlights, the illuminance becomes more uniform, but the overall illumination levels decrease.
To comprehensively evaluate the spatial illuminance distribution of the road, analyzing only the horizontal illuminance is insufficient. Therefore, we used this system to measure the vertical illuminance at heights of 1 m, 1.5 m, and 2 m in four directions: south, east, north, and west. During the measurement process, the vertical illuminance on the north side was found to be mostly zero because the measurement instrument faced away from the light source. Consequently, the vertical illuminance distribution on the north side is not analyzed.
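The near-zero north-side readings follow from basic photometry: vertical illuminance from a source scales with the cosine of the angle between the sensor's facing direction and the direction to the source, dropping to zero when the sensor faces away. A point-source sketch, illustrative only (this is not the paper's model, and real luminaires are extended sources):

```python
import math

def vertical_illuminance(intensity_cd, lamp_pos, sensor_pos, sensor_normal):
    """Vertical illuminance (lx) at a sensor from an ideal point source.

    E = I * cos(theta) / d^2, clamped to 0 when the sensor faces away --
    which is why the north-facing measurements read approximately zero.
    `sensor_normal` is assumed to be a unit vector.
    """
    d_vec = [l - s for l, s in zip(lamp_pos, sensor_pos)]
    d = math.sqrt(sum(c * c for c in d_vec))
    # cos(theta) between the sensor normal and the direction to the lamp
    cos_t = sum(n * c for n, c in zip(sensor_normal, d_vec)) / d
    return max(0.0, intensity_cd * cos_t / d ** 2)
```

With the same lamp, a sensor facing the source receives full illuminance while one facing directly away reads zero, mirroring the north-side observation.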
The spatial illuminance distribution in each direction is analyzed below. For the illuminance data at the same height in each direction, the UAV measured point by point following the flight path shown in Figure 17. After completing all the data measurements at one altitude, the UAV proceeded to the next altitude. Figure 20a–c shows the vertical illuminance distribution in the three vertical planes on the south side.
The illuminance distribution on the south side of the vertical planes closely resembles the top-facing distribution on the same planes, with higher illuminance values observed closer to the fixtures. The maximum illuminance (Emax) values for vertical planes 1, 2, and 3 are 213 lx, 174 lx, and 69 lx, respectively. As the distance from the light source increases, the illuminance intensity decreases significantly, but the distribution becomes more uniform.
Additionally, it is observed that the maximum illuminance for vertical planes 1 and 2 occurs at the higher part of the vertical plane, while for vertical plane 3, it is located at the lower part. This distribution pattern is attributed to the angle of the light fixtures. The luminous flux emitted by the light source reaches vertical plane 3 with less flux above and more below, resulting in this specific distribution.
The study continues with an analysis of the spatial vertical illuminance distribution on the east side of the vertical plane. Using the illuminance data measured by the system and the corresponding GPS position information, the vertical illuminance distributions for the three vertical planes on the east side are presented in Figure 21a–c.
Vertical plane 1 exhibits a maximum illuminance (Emax) of 358 lx, an average illuminance (Eav) of 24.69 lx, and an illuminance uniformity (U) of 0.0203. Vertical plane 2 shows a maximum illuminance (Emax) of 157 lx, an average illuminance (Eav) of 14.78 lx, and an illuminance uniformity (U) of 0.0338. Vertical plane 3 records a maximum illuminance (Emax) of 36 lx, an average illuminance (Eav) of 6.62 lx, and an illuminance uniformity (U) of 0.0756.
The illuminance distribution graphs on the east side of the vertical plane exhibit some differences compared to those on the upper and south sides. The uniformity of illumination is somewhat reduced relative to the other two directions. However, it also shows that as the distance increases, the illumination intensity decreases and the distribution becomes more uniform. Additionally, it is observed that the high illuminance concentration area appears to shift to the west (shown as a shift to the right in the figures). This shift occurs because the measurement unit faces east, resulting in increasing illuminance values on the front side close to the light source, while the values drop to nearly zero on the back side. This contributes to the decrease in illuminance uniformity.
Furthermore, the maximum illuminance value on the east side is 358 lx, which is higher than the values on the south side. This discrepancy is attributed to the installation angle of the central luminaire being directed westward, as visually confirmed when viewing from west to east.
Following the analysis of the illuminance distribution on the east side, we now explore the vertical illuminance distribution on the west side of the vertical plane. Figure 22a–c illustrates the vertical illuminance distribution on the west side for the three vertical planes.
Vertical plane 1 exhibits a maximum illuminance (Emax) of 257 lx, an average illuminance (Eav) of 24.86 lx, and an illuminance uniformity (U) of 0.0201. Vertical plane 2 has a maximum illuminance (Emax) of 108 lx, an average illuminance (Eav) of 13.56 lx, and an illuminance uniformity (U) of 0.0369. Vertical plane 3 shows a maximum illuminance (Emax) of 53 lx, an average illuminance (Eav) of 5.72 lx, and an illuminance uniformity (U) of 0.0874.
The characteristics of the vertical illuminance distribution on the west side are opposite to those on the east side, with the high illuminance concentration area offset to the east (to the left in the figure). This offset is due to the measurement unit facing west, causing the measured values to drop to nearly zero when the back of the unit is towards the lamps. Consequently, the uniformity of illuminance on the west side is slightly worse than on the east side.
A comprehensive analysis of the three diagrams reveals that the highest illuminance value on the west side is 257 lx, which is lower than the highest value on the east side. This further confirms that the placement angle of the lamps is directed towards the west.
Based on the analysis of the illuminance data on the east and west sides of the vertical plane, it can be concluded that the longitudinal illuminance distribution of the road exhibits a characteristic shift in opposite directions. The uniformity of illuminance deteriorates as the distance between streetlights increases.
Furthermore, combining the above data reveals that the illuminance at the middle of the street is significantly stronger than at both ends. This discrepancy may be due to the aging of streetlights over time or issues during installation, such as wiring problems, leading to uneven power distribution. These findings underscore the importance of proper installation and maintenance of lighting equipment. While the initial design of road lighting might meet the required standards, factors like installation and aging can affect the uniformity and overall illuminance levels.
To achieve optimal lighting effects, it is recommended to conduct on-site testing to verify the design outcomes and adjust the lighting design accordingly. Additionally, energy-saving and sustainability indicators should be considered to ensure the lighting system is both efficient and economical. This completes the analysis of the spatial illuminance distribution of the road.

3.3. Discussion

In terms of system integration, we constructed a UAV road illuminance measurement system composed of seven units using UAVs, microcontrollers, illuminance sensors, GPS, WiFi, and cloud platforms. The system’s hardware and software were meticulously designed and fabricated to ensure stable performance. Compared with existing systems, our system demonstrates significant innovation and advantages in the following aspects:
  • Multi-dimensional measurement and data visualization. The system can measure illuminance data in multiple dimensions of the road and innovatively map the distribution of illuminance in both horizontal and vertical planes at different heights. This multi-dimensional analysis significantly enhances the comprehensiveness and accuracy of road lighting condition assessments.
  • Efficient data processing. The adoption of the Z-score algorithm eliminates abnormal values, enhancing the system’s anti-interference capability and ensuring the accuracy of measurement data. The system’s visualization interface allows real-time observation and processing of data, improving data processing efficiency and on-site measurement operability, and facilitating detailed analysis and processing at later stages.
  • Ease of operation and safety. Surveyors only need to control and monitor the UAV from the roadside, eliminating the need to operate on the road and significantly improving the safety of road measurements.
In the process of road illumination acquisition, comparative experiments between our system and traditional manual measurement methods showed high consistency in the measurement results. A traditional manual measurement that requires four people to cooperate for 40 min can be completed by our system in only 8 min with two people, saving 50% of the manpower and 80% of the time, thereby greatly improving on-site measurement efficiency.
However, we also note that the system has certain limitations. For instance, the current system is equipped with only one illuminance measurement sensor, resulting in the UAV being able to measure illuminance in only one direction per flight, which affects efficiency to a certain extent. This limitation needs to be addressed in future improvements.
Based on the results of this study, future research can further explore the distribution of horizontal and vertical illuminance on different roads and conduct more in-depth analyses of light flow on multiple roads. Additionally, the introduction of more advanced sensor technologies can be considered to improve the system’s accuracy and efficiency.

4. Conclusions

In response to the rapid development of road traffic lighting and the limitations of traditional road illuminance detection methods, this study designed an unmanned aerial vehicle (UAV) illuminance acquisition system and verified its application in road scenarios. Experimental results demonstrate that this system offers high accuracy and stability, enhancing the operability and efficiency of on-site measurements, reducing costs, and ensuring personnel safety.
In single-point measurements, the Z-score algorithm exhibits excellent outlier processing performance, resulting in a measurement error rate of only 1.14%. The system's mean absolute percentage error (MAPE) in multi-point measurements is 5.08%. When compared with manual measurements, the root mean square error (RMSE) of the system for horizontal and vertical illuminance is 1.92 lx and 1.75 lx, respectively. The system's real-time data processing visualization interface significantly improves on-site measurement operations, saving 50% of labor costs and nearly 80% of time costs, thus offering a more convenient solution for practical applications. Operators can control the UAV and conduct measurements from the roadside, enhancing the safety of road measurements. Additionally, the illuminance data and GPS position data obtained from UAV measurements can be used to generate detailed illuminance distribution maps of the road.
Leveraging these advantages, this study utilizes the UAV illuminance system to measure road illuminance data in multiple dimensions. The system measures the illuminance of horizontal planes at different heights, producing horizontal illuminance distribution maps at various levels. It also divides the measured road into vertical planes, measuring the illuminance in different directions and heights, and innovatively drawing vertical illuminance distribution maps. This comprehensive approach allows for a deeper analysis of spatial changes in road illuminance, providing efficient data references for evaluating road lighting quality, improving road traffic safety, and informing road lighting design.
In summary, the UAV illuminance measurement system introduced in this work has broad application prospects in road scenarios. Further research and improvements can enhance the system’s performance and provide more in-depth analyses of the light flow in different road environments.

Author Contributions

Conceptualization, N.Z. and X.H.; methodology, S.X.; software, S.X.; formal analysis, S.X. and K.L. (Kexian Li); investigation, Q.H. and M.C.; data curation, S.X. and K.L. (Kai Liu); writing—original draft preparation, N.Z.; writing—review and editing, N.Z. and S.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Bijie City “Revealing Leadership” project, titled “Intelligent Lighting Application for Mountainous Road Public Environment” (Bikehe Major Special Project [2022] No.4), applied for by Guizhou Zhifu Guanggu Investment Management Co., Ltd. through the Light Innovation Service Center. The APC was funded by the Guizhou Zhifu Optoelectronics Technology Enterprise Incubator Project, project number QYFHQ[2021]003, under the Guizhou Science and Technology Platform Talent Program.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement

No new data were created or analyzed in this study. Data are contained within the article.

Acknowledgments

The authors would like to express their sincere gratitude to Guizhou Zhifu Guanggu Investment Management Co., Ltd. Light Innovation Service Center for their support. We also extend our thanks to the Lianyungang Center, Taihu Laboratory of Deep-sea Technological Science, for their valuable assistance.

Conflicts of Interest

The authors declare that this study received funding from Bijie Qixingguan Industrial Development Co., Ltd. and Guizhou Zhifu Guanggu Investment Management Co., Ltd. The funder was not involved in the study design, collection, analysis, interpretation of data, the writing of this article, or the decision to submit it for publication.

References

  1. Jackett, M.; Frith, W. Quantifying the impact of road lighting on road safety—A New Zealand Study. IATSS Res. 2013, 36, 139–145. [Google Scholar] [CrossRef]
  2. Tetri, E.; Bozorg Chenani, S.; Räsänen, R.S.; Baumgartner, H.; Vaaja, M.; Sierla, S.; Kosonen, I. Tutorial: Road lighting for efficient and safe traffic environments. Leukos 2017, 13, 223–241. [Google Scholar] [CrossRef]
  3. CJJ 45-2005; Standard for Lighting Design of Urban Road. China Building Industry Press: Beijing, China, 2015.
  4. GB/T 5700-2023; Measurement Methods for Lighting. Standards Press of China: Beijing, China, 2023.
  5. Sheela, K.S.; Padmadevi, S. Survey on street lighting system based on vehicle movements. Int. J. Innov. Res. Sci. Eng. Technol. 2014, 3, 9220–9225. [Google Scholar]
  6. Ye, R.; Ye, D.; Ma, C. Research on field detection method of road lighting. J. Light. Eng. 2017, 28, 107–112. [Google Scholar]
  7. Outay, F.; Mengash, H.A.; Adnan, M. Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges. Transp. Res. Part. A Policy Pract. 2020, 141, 116–129. [Google Scholar] [CrossRef]
  8. Wang, J.; Simeonova, S.; Shahbazi, M. Orientation-and scale-invariant multi-vehicle detection and tracking from unmanned aerial videos. Remote Sens. 2019, 11, 2155. [Google Scholar] [CrossRef]
  9. Hossain, M.; Hossain, M.A.; Sunny, F.A. A UAV-Based Traffic Monitoring System for Smart Cities. In Proceedings of the 2019 International Conference on Sustainable Technologies for Industry 4.0 (STI), Dhaka, Bangladesh, 24–25 December 2019. [Google Scholar]
  10. Unal, G. Visual target detection and tracking based on Kalman filter. J. Aeronaut. Space Technol. 2021, 142, 251–259. [Google Scholar]
  11. Kiyak, E.; Gol, G.; Karakoc, T.H. Obstacle detection and collision avoidance using indoor quadrocopter. Int. J. Sustain. Aviat. 2017, 34, 297–311. [Google Scholar] [CrossRef]
  12. Feroz, S.; Abu Dabous, S. UAV-Based Remote Sensing Applications for Bridge Condition Assessment. Remote Sens. 2021, 13, 1809. [Google Scholar] [CrossRef]
  13. Li, X.; Levin, N.; Xie, J.; Li, D. Monitoring hourly night-time light by an unmanned aerial vehicle and its implications to satellite remote sensing. Remote Sens. Environ. 2020, 247, 111942. [Google Scholar] [CrossRef]
  14. Massetti, L.; Paterni, M.; Merlino, S. Monitoring Light Pollution with an Unmanned Aerial Vehicle: A Case Study Comparing RGB Images and Night Ground Brightness. Remote Sens. 2022, 14, 2052. [Google Scholar] [CrossRef]
  15. Vaaja, M.T.; Maksimainen, M.; Kurkela, M.; Virtanen, J.P.; Rantanen, T.; Hyyppä, H. Approaches for mapping night-time road environment lighting conditions. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 5, 199–205. [Google Scholar] [CrossRef]
  16. Pen, M. Design and Development of Intelligent Illumination Detection System for Sports Venues. Master’s Thesis, Xi'an University of Architecture and Technology, Xi’an, China, 2022. [Google Scholar]
  17. Jaskowski, P.; Tomczuk, P. Measurement systems used in measuring the illuminance of the road. In Proceedings of the 2019 Second Balkan Junior Conference on Lighting (Balkan Light Junior), Plovdiv, Bulgaria, 19–21 September 2019. [Google Scholar]
  18. Guyonneau, R.; Mercier, F.; Boucher, V. Robotic system for indoor illuminance map generation. J. Build. Eng. 2024, 86, 108800. [Google Scholar] [CrossRef]
  19. Chen, C.; Hsu, S.; Yang, T.; Sun, C. Design of an Equipped Vehicle for In Situ Road Lighting Measurement. Sustainability 2023, 15, 10478. [Google Scholar] [CrossRef]
  20. Tabaka, P. Pilot Measurement of Illuminance in the Context of Light Pollution Performed with an Unmanned Aerial Vehicle. Remote Sens. 2020, 12, 2124. [Google Scholar] [CrossRef]
  21. Jia, S.; Zou, N.; Xu, S.; Cheng, M. Applied Research of the UAV Illumination Measurement System in Sports Stadiums. Appl. Sci. 2023, 13, 6774. [Google Scholar] [CrossRef]
  22. Becerra, V.M. Autonomous Control of Unmanned Aerial Vehicles. Electronics 2019, 8, 452. [Google Scholar] [CrossRef]
  23. Fan, B.; Li, Y.; Zhang, R.; Fu, Q. Review on the technological development and application of UAV systems. Chin. J. Electron. 2020, 292, 199–207. [Google Scholar] [CrossRef]
  24. Kontogiannis, S.G.; Ekaterinaris, J.A. Design, performance evaluation and optimization of a UAV. Aerosp. Sci. Technol. 2013, 29, 339–350. [Google Scholar] [CrossRef]
  25. Zhu, Z.; Zhang, M.; Sun, M.; Sun, J. Distributed Illuminance Measurement System Based on TMS320F28335. In Proceedings of the 2019 IEEE 2nd International Conference on Information Communication and Signal Processing (ICICSP), Weihai, China, 28–30 September 2019. [Google Scholar]
  26. Bouroussis, C.A.; Topalis, F.V. Assessment of outdoor lighting installations and their impact on light pollution using unmanned aircraft systems—The concept of the drone-gonio-photometer. J. Quant. Spectrosc. Radiat. Transf. 2020, 253, 107155. [Google Scholar] [CrossRef]
  27. Gao, Y.; Zhang, H.; Zou, N.; Kang, Z. Design of a multifunctional illuminometer. Comput. Syst. Appl. 2012, 3, 252–255. [Google Scholar]
  28. Ye, L.; Gao, N.; Yang, Y.; Li, X. A High-Precision and Low-Cost Broadband LEO 3-Satellite Alternate Switching Ranging/INS Integrated Navigation and Positioning Algorithm. Drones 2022, 6, 241. [Google Scholar] [CrossRef]
  29. Zhang, L.; Deng, F.; Chen, J.; Bi, Y.; Phang, S.K.; Chen, X. Trajectory Planning for Improving Vision-Based Target Geolocation Performance Using a Quad-Rotor UAV. IEEE Trans. Aerosp. Electron. Syst. 2019, 55, 2382–2394. [Google Scholar] [CrossRef]
  30. Li, A. Design and innovative application of minimal single-chip microcomputer system. Sci. Technol. Innov. 2019, 7, 26–27. [Google Scholar]
  31. Setiawan, A.; Prastowo, A.T.; Darwis, D. Sistem Monitoring Keberadaan Posisi Mobil Berbasis GPS dan Penyadap Suara Menggunakan Smartphone [GPS-Based Vehicle Position Monitoring and Audio Surveillance System Using a Smartphone]. J. Tek. Dan. Sist. Komput. 2022, 3, 35–44. [Google Scholar] [CrossRef]
  32. Jacko, P.; Bereš, M.; Kováčová, I.; Molnár, J.; Vince, T.; Dziak, J.; Fecko, B.; Gans, Š.; Kováč, D. Remote IoT Education Laboratory for Microcontrollers Based on the STM32 Chips. Sensors 2022, 22, 1440. [Google Scholar] [CrossRef]
  33. Putri, S.F.M.; Mardiati, R.; Setiawan, A.E. The Prototype of Arm Robot for Object Mover Using Arduino Mega 2560. In Proceedings of the 2022 8th International Conference on Wireless and Telematics (ICWT), Yogyakarta, Indonesia, 21–22 July 2022. [Google Scholar]
  34. Zhang, C.L.; Mei, Y.P.; Wang, J. Design of ultrasonic rangefinder based on STC89C52 microcontroller. Ind. Technol. Innov. 2020, 7, 33–37. [Google Scholar]
  35. Mesquita, J.; Guimarães, D.; Pereira, C.; Santos, F.; Almeida, L. Assessing the ESP8266 WiFi module for the Internet of Things. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Torino, Italy, 4–7 September 2018. [Google Scholar]
  36. Barai, A.R.; Badhon, M.R.K.; Zhora, F.; Rahman, M.R. Comparison between noninvasive heart rate monitoring systems using GSM module and ESP8266 Wi-Fi module. In Proceedings of the 2019 3rd International Conference on Electrical, Computer & Telecommunication Engineering (ICECTE), Rajshahi, Bangladesh, 26–28 December 2019. [Google Scholar]
  37. Gladwin Antony, R.; Hariharan, S.; Hari Haran, D.; Clement Raj, C. Design of Solar Charging Case for Mobile Phones. J. Phys. Conf. Ser. 2021, 2040, 012031. [Google Scholar] [CrossRef]
  38. Long, Q.L.; Niu, D.X.; Lin, L.Y. Intelligent energy-saving control system based on OneNET Cloud platform and MQTT protocol of Internet of Things. Comput. Meas. Control 2021, 29, 127–130. [Google Scholar]
  39. Mahama, S.; Harbi, Y.J.; Burr, A.G.; Grace, D. Design and Convergence Analysis of an IIC-Based BICM-ID Receiver for FBMC-QAM Systems. IEEE Open J. Comm. Soc. 2020, 1, 563–577. [Google Scholar] [CrossRef]
  40. Gupta, A.K.; Raman, A.; Kumar, N.; Ranjan, R. Design and Implementation of High-Speed Universal Asynchronous Receiver and Transmitter (UART). In Proceedings of the 2020 7th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 27–28 February 2020. [Google Scholar]
  41. Salcedo, R.; Zhu, H.; Zhang, Z.; Wei, Z.; Chen, L.; Ozkan, E.; Falchieri, D. Foliar deposition and coverage on young apple trees with PWM-controlled spray systems. Comput. Electron. Agric. 2020, 178, 105794. [Google Scholar] [CrossRef]
  42. Sowmya, K.B.; Gomes, S.; Tadiparthi, V.R. Design of UART Module using ASMD Technique. In Proceedings of the 2020 5th International Conference on Communication and Electronics Systems (ICCES), Coimbatore, India, 10–12 June 2020. [Google Scholar]
  43. Zhao, T.; Yu, Z.; Han, J.; Zhang, M.; Wang, J. Design for Spatiotemporal Information Cloud Platform of Smart City Based on OSGi. J. Phys. Conf. Ser. 2021, 1732, 012018. [Google Scholar] [CrossRef]
  44. Merza, E.O.; Mohammed, N.J. Fast Ways to Detect Outliers. J. Tech. 2021, 3, 66–73. [Google Scholar] [CrossRef]
  45. Gong, X.; Zhang, F.; Lu, T.; You, W. Comparative Analysis of three outlier detection methods in univariate data sets. In Proceedings of the 2022 3rd International Conference on Electronic Communication and Artificial Intelligence (IWECAI), Zhuhai, China, 14–16 January 2022. [Google Scholar]
  46. Wu, X.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Assessment of individual tree detection and canopy cover estimation using unmanned aerial vehicle based light detection and ranging (UAV-LiDAR) data in planted forests. Remote Sens. 2019, 11, 908. [Google Scholar] [CrossRef]
  47. Prayudani, S.; Hizriadi, A.; Lase, Y.Y.; Fatmi, Y. Analysis accuracy of forecasting measurement technique on random K-nearest neighbor (RKNN) using MAPE and MSE. In Proceedings of the 1st International Conference of SNIKOM 2018, Medan, Indonesia, 23–24 November 2018. [Google Scholar]
  48. Maemura, T.; Nakura, K.; Suzuki, H.; Nakura, K.; Akizuki, Y.; Iwata, M.; Matsumoto, N. Preliminary study of illumination distribution measurement making use of quadcopter-examination of accuracy and drawing of illumination distribution. In Proceedings of the 11th Asian Forum on Graphic Science, Tokyo, Japan, 6–10 August 2017. [Google Scholar]
  49. Hodson, T.O. Root-mean-square error (RMSE) or mean absolute error (MAE): When to use them or not. Geosci. Model. Dev. 2022, 15, 5481–5487. [Google Scholar] [CrossRef]
  50. Hua, X.; Zhanlang, W.; Guancheng, W. Influence factors on illuminance distribution uniformity and energy saving of the indoor illumination control method. Appl. Opt. 2023, 62, 2531–2540. [Google Scholar]
Figure 1. STM32F103RCT6 as the core processing unit.
Figure 2. UAV illuminance acquisition system structure diagram.
Figure 3. The communication and structure diagram of the UAV illuminance measurement system.
Figure 4. Cloud platform construction process.
Figure 5. Communication programming.
Figure 6. Illuminance acquisition program design.
Figure 7. Obstacle avoidance programming.
Figure 8. Outlier handling process design.
Figure 9. System physical hardware.
Figure 10. Cloud platform visualization window.
Figure 11. The system performs single-point illuminance data measurements.
Figure 12. Illuminance data comparison.
Figure 13. Road lighting distribution diagram.
Figure 14. Schematic diagram of measuring points.
Figure 15. The horizontal illuminance distribution chart on the ground.
Figure 16. The vertical illuminance distribution chart at 1.5 m.
Figure 17. System measurement diagram.
Figure 18. Horizontal illumination distribution diagram. (a) Ground horizontal illumination, (b) 1 m horizontal illumination, (c) 1.5 m horizontal illumination, and (d) 2 m horizontal illumination.
Figure 19. Illuminance distribution above the vertical plane, showing (a) vertical plane 1, (b) vertical plane 2, and (c) vertical plane 3.
Figure 20. Illuminance distribution on the south side of the vertical plane, showing (a) vertical plane 1, (b) vertical plane 2, and (c) vertical plane 3.
Figure 21. Illuminance distribution on the east side of the vertical plane, showing (a) vertical plane 1, (b) vertical plane 2, and (c) vertical plane 3.
Figure 22. Illuminance distribution on the west side of the vertical plane, showing (a) vertical plane 1, (b) vertical plane 2, and (c) vertical plane 3.
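The outlier handling summarized in Figure 8 is based on the Z-score screening mentioned in the abstract, and the measurement accuracy is reported there via MAPE and RMSE. A minimal sketch of both steps is given below. The 2.5-sigma threshold and the replace-with-inlier-mean policy are illustrative assumptions for this sketch, not the authors' exact firmware or cloud-platform implementation.

```python
import math

def zscore_filter(samples, threshold=2.5):
    # Flag illuminance readings whose Z-score exceeds the threshold
    # and replace them with the mean of the remaining inlier readings.
    n = len(samples)
    mean = sum(samples) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)
    if std == 0.0:
        return list(samples)  # all readings identical: nothing to flag
    inliers = [x for x in samples if abs((x - mean) / std) <= threshold]
    repl = sum(inliers) / len(inliers)
    return [x if abs((x - mean) / std) <= threshold else repl
            for x in samples]

def mape(reference, measured):
    # Mean absolute percentage error against a reference illuminance meter.
    return 100.0 / len(reference) * sum(
        abs((r - m) / r) for r, m in zip(reference, measured))

def rmse(reference, measured):
    # Root-mean-square error in lx.
    return math.sqrt(sum((r - m) ** 2
                         for r, m in zip(reference, measured)) / len(reference))
```

For example, a burst of ten readings around 20 lx containing one spurious 250 lx spike (e.g., a stray headlight) would have the spike replaced by the inlier mean before MAPE/RMSE are computed against the handheld reference meter.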
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Xu, S.; Zou, N.; He, Q.; He, X.; Li, K.; Cheng, M.; Liu, K. Design and Application Research of a UAV-Based Road Illuminance Measurement System. Automation 2024, 5, 407-431. https://doi.org/10.3390/automation5030024