Article

Emergency Support Unmanned Aerial Vehicle for Forest Fire Surveillance

by Abdulla Al-Kaff *,†, Ángel Madridano †, Sergio Campos, Fernando García, David Martín and Arturo de la Escalera
Intelligent Systems Lab (LSI), Universidad Carlos III de Madrid—Avnda. de la Universidad 30, 28911 Madrid, Spain
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2020, 9(2), 260; https://doi.org/10.3390/electronics9020260
Submission received: 20 December 2019 / Revised: 17 January 2020 / Accepted: 26 January 2020 / Published: 4 February 2020
(This article belongs to the Section Systems & Control Engineering)

Abstract
The advances in autonomous technologies and microelectronics have increased the use of autonomous Unmanned Aerial Vehicles (UAVs) in critical applications, such as forest fire monitoring and fighting. In addition, surveillance methods that provide rich information about a fire are a valuable tool for Emergency Response Teams (ERTs). In this context, and in collaboration with Telefónica Digital España, Dronitec S.L., and Divisek Systems, this paper presents a fire monitoring system based on perception algorithms, implemented on a UAV, that performs surveillance of a specified area; several algorithms have been implemented to perform the tasks of autonomous take-off/landing, trajectory planning, and fire monitoring. The UAV is equipped with RGB and thermal cameras, temperature sensors, and communication modules in order to provide full information about the fire and the UAV itself, sending these data to the ground station in real time. The presented work is validated by performing several flights in a real environment, and the obtained results show the efficiency and robustness of the proposed system under different weather conditions.


1. Introduction

In recent years, the field of Unmanned Aerial Vehicles (UAVs) has received great attention in many applications, owing to advances in microelectronics that have improved characteristics such as weight, endurance, and payload. These advances have enabled the use of this type of vehicle in a large number of applications, such as those surveyed in [1], which complement human labor or, in some cases, replace it [2,3]. In addition, the ability to carry a certain load has resulted in advanced UAV solutions that create added value: through the use of different onboard sensors, a wide range of environments can be studied and analyzed, helping to reduce costs and personal risks and to monitor hard-to-reach areas effectively [4]. This set of features gives UAVs a high potential impact in fighting fires, which carry a high economic, social, and environmental cost in most parts of the world.
Forest fires are considered among the biggest environmental problems in the European Union. According to the last annual report published by the European Commission in 2017 [5], and as shown in Table 1 and Table 2, Spain has suffered some of the worst years of the last decade, with an increase of 10% in the number of fires and of 95% in the hectares affected.
In addition, Spain and Portugal have more than 50% of the number of fires and hectares affected in the five major Southern European countries (Portugal, Spain, France, Italy, and Greece). Figure 1 shows the contribution of each of the five countries in terms of burnt area and number of fires in 2017.
In this paper, a research project developed in collaboration with the companies Telefónica Digital España, Dronitec S.L., and Divisek Systems is presented, which uses tools available on the current market to build a forest fire monitoring system based on a UAV equipped with cameras and thermal sensors, providing useful information about fires and the surrounding environment to the emergency team. Moreover, the implemented algorithms allow the UAV to carry out its mission fully autonomously, and a graphical interface at the ground station allows monitoring the environment and controlling the UAV in real time.
The remainder of this paper is organized as follows: Section 2 reviews recent work related to the use of this technology in civil applications such as agriculture and firefighting. Section 3 describes the experimental platform designed and the main hardware elements mounted on it. Section 4 presents the software developed and its sub-elements. In Section 5, the experiments and the obtained results are discussed. Finally, Section 6 summarizes the conclusions and future work.

2. Related Work

Currently, forest fires are among the main environmental problems in society, in particular in Southern European countries such as Spain. Apart from the impact and damage caused to ecosystems and possible human losses, it is important to consider the large direct and indirect costs resulting from these disasters. As stated in [6], the fires in Galicia in October 2017 alone had an economic impact of 155.89 million Euros.
According to sources from the Ministry of Agriculture, Food and Environment, Spain suffered an annual average of 14,476 incidents affecting an area of 108,282.39 hectares, with economic losses exceeding 54 million Euros. Therefore, technological advances in the area of robotics can help both in reducing the number of incidents and the costs. From this point of view, this project aims to create an autonomous UAV that can perform the tasks of the supervision and monitoring of fires in rural environments.
During the last few years, there has been a current of research that seeks to incorporate autonomous vehicles, both ground and aerial, into the civil field. While in some cases the aim was to replace the functions of human operators, in many others it was to use this technology as a tool that helps perform the work more safely and efficiently, taking advantage of the ability of this type of vehicle to move autonomously [7,8,9].
One of the sectors that overlaps the most with firefighting is agriculture, where a series of research works of interest to this project has been conducted. In [10], a strategy was presented for the coverage and mapping of large areas with a swarm of UAVs, offering an efficient solution to the problem of monitoring and evaluating large areas of land, such as in rural fires. The work in [11] presented vision-based algorithms for the detection and classification of crops in images captured with a monocular camera installed on a UAV; although not directly related to fire detection, this is a useful tool for analyzing the rural environments in which aerial vehicles operate. Another work collecting image-processing methods applicable to this area was presented in [12], in which people were detected in forest areas through the analysis and processing of thermal camera images, a tool of great interest for cooperative work between aerial vehicles and operational personnel on the ground.
The use of UAVs with onboard vision-based systems to monitor and detect forest fires was detailed in [13,14]. Although the vision methods developed there would be useful in this work, that research focused on their development without covering the platform used or the navigation methods implemented.
In [15], a system formed by multiple UAVs was employed in a first phase to search for and detect fires and in a second phase to undertake fire-extinguishing missions. That work, developed on multiple platforms, represents an important advance and presents similarities to this work, but it reported results only in simulation environments, without implementing its methods and algorithms on real platforms.
Focusing on the development of a perception system for monitoring fires within teams formed by different vehicles, the work in [16] presented a set of UAVs in charge of capturing images of a forest fire in real time in order to analyze the evolution of the fire by integrating all the information.
Within the field of multi-UAV systems is the research carried out by [17], in which three different types of unmanned aerial vehicles were used to carry out patrol, confirmation, and fire observation tasks. Their performance was tested in laboratory conditions and showed that they could be used to help ground teams predict and respond to a fire threat. In line with the use of multi-robot systems for the prevention and conservation of ecosystems is the work in [18], in which a team consisting of autonomous vehicles, both land and air, worked cooperatively to perform maintenance tasks in forest areas in order to prevent possible environmental damage such as fires.
Works such as the one developed by [19] are closer to the aim of this project. It included the design and implementation of a UAV to undertake fire-extinguishing tasks autonomously. Unlike the work proposed in this article, the UAV implemented was a quadcopter with a light payload, and its tasks were oriented toward taking in water and discharging it at the focal point of the fire. Although at a small scale, projects such as this one show that work is currently being done to incorporate UAVs into fire-extinguishing efforts.
For its part, the work in [20] developed a UAV with fireproof materials that could perform firefighting missions and search for victims through the use of a fire extinguisher on board the aircraft and cameras that allowed a direct view of the environment. In this case, the UAV was a tool for fire brigades, since, through an operator who managed the UAV, it allowed knowing essential data of the environment and could help in the fire extinguishing efforts. Although the authors indicated the possibility of using this device outdoors, in fact, it was presented as a tool to be used in enclosed spaces.
Finally, related to the use of UAVs for fires, the work implemented by [21] presented a particle swarm optimization algorithm to assign multiple tasks to UAVs in dynamic environments such as forest fires.

3. Hardware Architecture

Different aspects were taken into account for the selection of the UAV, such as payload, energy consumption, maneuverability, and cost. For this work, a hexacopter structure was used, as shown in Figure 2, which provided more stability and robustness against meteorological conditions.
This UAV had a maximum takeoff mass of 8.2 kg, an endurance of 18 min, and a flying speed up to 32 km/h.
The flight controller used was PixHawk 2 Cube (Figure 3), the communication protocol of which is compatible with the Robot Operating System (ROS) framework, on which the algorithms of the trajectory generation and environmental analysis were developed. In addition, it was equipped with RTK GNSS and triple redundancy sensors (accelerometers and gyroscopes), which allowed obtaining more precise data of the speed, orientation, and gravitational forces.

3.1. Payload

Regarding the onboard sensors, the UAV was equipped with a set of subsystems, described as follows:
  • Optical sensors: The UAV was equipped with two different types of cameras, allowing the gathering of detailed information of the fire and the environment.
    RGB monocular camera: with a resolution of 1920 × 1080 at 60 fps in HD.
    Thermal camera FLIR AX5 series: the camera in charge of collecting thermal images of the fire (Figure 4), which provided information about the temperature at which the different focuses of interest were found. Its compact size, the wide range of resolutions and aspect ratios, and its compatibility with different software made it ideal for working on board UAVs.
  • Digital temperature sensors DS18B20: a set of five DS18B20 digital temperature sensors distributed along the UAV body, providing temperature data at different points of the UAV.
  • Onboard embedded unit: All the processing was performed onboard by an Intel NUC embedded computer, which had an Intel i7-7567U CPU at 3.5 GHz and 8 GB of RAM. The software was developed and integrated with ROS Kinetic, under the Ubuntu 16.04 LTS operating system.
  • Communications module: It was necessary to establish a communications system that allowed the transmission of the data between the UAV and the ground station. To avoid range limitations or interference due to occlusions, a 3G/4G modem was chosen to ensure communications in a wide range of situations.
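The paper does not describe the temperature-acquisition code, but on a Linux companion computer such as the Intel NUC, DS18B20 probes are commonly read through the kernel's 1-wire sysfs interface. The following is a generic sketch of that interface, not the authors' implementation (the function names are ours; the sysfs layout is the standard one exposed by the `w1_therm` driver):

```python
import glob

def parse_w1_slave(raw):
    """Parse a DS18B20 w1_slave report into degrees Celsius.

    The kernel driver prints two lines: the first ends in 'YES' when
    the CRC check passed, and the second ends in 't=<millidegrees C>'.
    Returns None on a failed CRC."""
    if "YES" not in raw:
        return None
    return int(raw.rsplit("t=", 1)[1]) / 1000.0

def read_all_sensors(base="/sys/bus/w1/devices"):
    """Poll every attached DS18B20 (device folders start with '28-')."""
    temps = {}
    for dev in glob.glob(base + "/28-*"):
        with open(dev + "/w1_slave") as f:
            t = parse_w1_slave(f.read())
        if t is not None:
            temps[dev.rsplit("/", 1)[1]] = t
    return temps
```

Since each sensor appears as its own `28-*` directory, the five body-mounted probes can be polled in a single pass and published to the ground station.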

3.2. Charging Base

Given the meteorological conditions of the operating environment, it was necessary to design a base with dimensions that allow the take-off and landing maneuvers and with the ability to charge the UAV batteries wirelessly. This base also needed to protect the UAV and each subsystem against weather conditions.
Therefore, taking into account the maximum dimensions of the UAV (1250 mm × 1250 mm × 730 mm) and the safety margins for operations, a base with dimensions of 2000 mm × 2000 mm × 1500 mm was constructed. After studying different designs, a base with an upper sliding door with a vertically moving bed that allowed performing the take-off and landing maneuvers was considered, as shown in Figure 5.
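As a quick arithmetic check of these dimensions (our own calculation, not from the paper), the chosen base leaves the following clearances around the UAV:

```python
uav = (1250, 1250, 730)    # maximum UAV dimensions (mm), from Section 3.2
base = (2000, 2000, 1500)  # charging base dimensions (mm)

# Total clearance on each axis; horizontally the margin is split
# between the two sides of the UAV.
margins = tuple(b - u for b, u in zip(base, uav))  # (750, 750, 770) mm
per_side = (margins[0] / 2, margins[1] / 2)        # 375 mm on each side
```

A clearance of 375 mm per side is the safety margin implied by the stated dimensions for the take-off and landing maneuvers.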

4. Software Architecture

In this section, the software architecture is described in detail. First, all the algorithms were implemented in the ROS framework, which provides a set of tools and libraries that allow rapid development of the control of, and communication with, the autopilot. In this architecture, the MAVROS package was used to establish a communication bridge between ROS and MAVLink (Micro Air Vehicle Link).
The main software implementations were divided into the algorithms to execute the mission completely autonomously and the development of a graphic interface that allowed a ground operator to supervise the tasks correctly and interact with the vehicle.

4.1. Autonomous Navigation

All the algorithms developed in this work aimed to provide an autonomous UAV that was able to sense, collect data, and analyze the environment. As shown in Figure 6, the system was divided into several phases as follows.
  • Fire alert: The UAV, in its standby position, receives an alert message with the position of the center of the fire in UTM coordinates. This alert is generated by a system composed of several thermal cameras (Figure 7b) installed at the top of a telecommunication tower, as shown in Figure 7a, which provides a 360° view of the environment in a short interval of time. This system is responsible for detecting fires within a radius of 3.5 km around the tower. In this phase, the software establishes the communication between the fire detection system and the UAV.
  • Take-off: Once the UAV receives the alert message, and based on its initial UTM coordinates (X_o, Y_o) and the fire location (X_d, Y_d), the path generation algorithm estimates the corresponding waypoints.
    As shown in Figure 8, the generated path starts with a safe take-off from point A = (X_o, Y_o) to point B, which is a point away from the base at altitude h (located in the collision-free area of the base). Once point B is reached, depending on the fire location, the algorithm generates safety points C_l and C_r. These points are calculated by taking a distance a from the furthest point of possible collision with respect to the base and multiplying it by a safety coefficient k, giving a·k. The algorithm then creates a circular path with a·k as its diameter.
    Once point C_l or C_r is reached, the algorithm generates the trajectory from this point to point D (the location of the center of the fire).
  • Generation of the path: After the take-off maneuver is accomplished, the algorithm generates a list of waypoints (x, y, z) based on several trajectories, as explained in Algorithm 1. These waypoints are in meters with respect to the initial position (location at take-off).
    Algorithm 1: Trajectory generation.
    As shown in Figure 9, the first trajectory is the path from the initial UAV position to a point located on the border of the orbit; this point is calculated as follows:
    x^2 + y^2 − (2 o_x × x) − (2 o_y × y) + (o_x^2 + o_y^2 − r^2) = 0
    y = m x + n
    where x, y are the point coordinates, o_x, o_y are the coordinates of the center of the orbit, and r is the radius of the orbit.
    Then, the first path from the initial position to the orbit is calculated as follows:
    P_{x,i+1} = P_{x,i} + ν cos(α)
    P_{y,i+1} = P_{y,i} + ν sin(α)
    where ν is the linear distance between trajectory points P_i and P_{i+1}, and α is the angle between the center of the fire and the initial position, as shown in Figure 10.
    The next step is to calculate the orbit path.
    P_{x,i} = o_x + r cos(β + λ i)
    P_{y,i} = o_y + r sin(β + λ i)
    where β is the conjugate angle of α, as shown in Figure 10, and λ is the step angle between the orbit path points.
    Finally, the last path generated is equal to the first path, but in the opposite direction.
  • Tracking: Finally, it is necessary to generate an algorithm that receives this list of waypoints and verifies whether the UAV reaches them. Algorithm 2 describes the waypoint-following process.
    Algorithm 2: Waypoint following.
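The path-generation and tracking steps above can be sketched in plain Python. This is an illustrative reconstruction, not the authors' code: the function names are ours, and the orbit entry point is obtained with a geometric shortcut (stepping back one radius r along the line toward the fire center) rather than by solving the circle–line system symbolically.

```python
import math

def orbit_entry_point(start, center, r):
    """Point on the orbit border along the line from `start` to the
    fire center `center` (the nearer solution of the circle-line
    system in the text)."""
    ox, oy = center
    sx, sy = start
    alpha = math.atan2(oy - sy, ox - sx)   # heading toward the fire
    d = math.hypot(ox - sx, oy - sy)       # distance to the center
    return (sx + (d - r) * math.cos(alpha),
            sy + (d - r) * math.sin(alpha))

def approach_segment(start, goal, nu):
    """Step from start toward goal in increments of length nu:
    P_{i+1} = P_i + nu * (cos(alpha), sin(alpha))."""
    alpha = math.atan2(goal[1] - start[1], goal[0] - start[0])
    pts, p = [], start
    while math.hypot(goal[0] - p[0], goal[1] - p[1]) > nu:
        p = (p[0] + nu * math.cos(alpha), p[1] + nu * math.sin(alpha))
        pts.append(p)
    pts.append(goal)
    return pts

def orbit_path(center, r, beta, lam, n):
    """Sample n orbit points: P_i = center + r*(cos, sin)(beta + lam*i)."""
    return [(center[0] + r * math.cos(beta + lam * i),
             center[1] + r * math.sin(beta + lam * i)) for i in range(n)]

def waypoint_reached(pos, wp, tol=1.0):
    """Acceptance test used by a follower in the spirit of Algorithm 2:
    a waypoint counts as reached within tol meters."""
    return math.hypot(wp[0] - pos[0], wp[1] - pos[1]) <= tol
```

Concatenating the approach segment, the sampled orbit, and the reversed approach yields the waypoint list that the follower consumes.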

4.2. Graphic User Interface

The last element in the developed software is the GUI. This GUI provides a tool for controlling the UAV, as well as visualizing the information about the fire and the environment.
The designed GUI is divided into two tabs: a tab used to configure and establish communications between the ground station and the UAV onboard computer and a second tab showing the information captured by the sensors, as shown in Figure 11.
Furthermore, the second tab is divided into four main groups:
  • Optical sensors: This consists of two displays, where the color and thermal images are shown. Both images are compressed so that they can be transmitted without delay.
  • Autopilot information: The second group illustrates the data from the autopilot, such as GPS position and altitude, providing the information about the status of the flight.
  • Positioning map: This group provides the UAV position in a satellite mode map. Moreover, navigation algorithms are implemented to allow the operator at the ground station to add new waypoints to the predefined generated path, if required.
  • Temperature sensors: The last group provides information about the temperature of different segments of the UAV, in order to monitor its operational conditions.
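The waypoint-addition feature of the positioning map can be illustrated with a minimal sketch (our own illustrative logic, not the authors' implementation): an operator-added waypoint is spliced into the path immediately after the nearest existing waypoint, so the UAV detours there and then resumes the predefined plan.

```python
import math

def insert_waypoint(path, new_wp):
    """Splice new_wp into path right after the nearest existing
    waypoint, keeping the rest of the plan intact."""
    if not path:
        return [new_wp]
    nearest = min(range(len(path)),
                  key=lambda i: math.hypot(path[i][0] - new_wp[0],
                                           path[i][1] - new_wp[1]))
    return path[:nearest + 1] + [new_wp] + path[nearest + 1:]
```

This is one simple policy; an implementation tied to the map widget could equally insert by segment projection or append to the end of the mission.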

5. Experimental Results

In order to validate the presented work, different tests were carried out in a real environment.

5.1. Scenario

The experiments were performed with real fires in a forest located in Matachines, in the north of the Community of Madrid, under different weather conditions such as wind, cold, and fog. The base, with dimensions of 50 m × 50 m, was surrounded by a forest environment, and in these experiments, the fires were located at a distance of 600–1000 m from the base.
Although the mission described below was carried out under good weather conditions, due to the location of the place of operations, throughout the project, different tests were carried out with adverse weather conditions such as wind, cold, or fog, as shown in Figure 12. Although the UAV could fly in these conditions, its usefulness was reduced since the on-board cameras would not be able to transmit useful information about the fire.
Figure 12 shows an example of the data obtained from the experiment, where Figure 12a illustrates the temperature information about the UAV, whilst Figure 12b and Figure 12c demonstrate the color and thermal images, respectively, in dense foggy conditions.

5.2. Mission and Results

The data obtained from the experiment are shown in Figure 13, where the UAV performed the complete path of a distance of 1.34 km at an altitude of 66 m from the ground level and a radius of orbit of 50 m, with an average speed of 6.4 km/h and maximum ascent and displacement speeds of 10.8 km/h and 22.7 km/h, respectively.
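As a back-of-the-envelope consistency check on these figures (our own arithmetic, not reported in the paper), the flight fits comfortably within the platform's 18 min endurance stated in Section 3:

```python
path_km = 1.34          # total path length reported
avg_speed_kmh = 6.4     # average speed reported
endurance_min = 18.0    # endurance from Section 3

flight_min = path_km / avg_speed_kmh * 60.0  # ~12.6 min of flight
margin_min = endurance_min - flight_min      # ~5.4 min of reserve
```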
During this experiment, the temperature sensors maintained constant values (10 °C) throughout the flight.
Figure 14a,b shows how during the autonomous mode (off-board), the UAV was able to reach the appropriate orientation (red line) toward the points sent by the controller (green line), while flying at a speed of 22.7 km/h. These figures show how the UAV was able to follow the predefined waypoints, with a minimum error in position. Furthermore, it is important to mention that it would be necessary to adjust the PID gains so that the overall movement of the platform during the mission was fast and accurate.

6. Conclusions

In this work, an autonomous UAV was presented, equipped with optical and thermal sensors, communication modules, and processing units, for monitoring and data acquisition of forest fires.
This work presented methods and algorithms implemented for UAVs to perform the tasks of trajectory planning and fire monitoring, in collaboration with Telefónica Digital España, Dronitec S.L, and Divisek Systems, setting up a tool, based on a UAV, capable of supervising and monitoring fires in forest areas by transmitting information from the vehicle to a ground station in real time.
The presented work was validated by performing several flights in a real environment, and the obtained results showed the efficiency and robustness of the proposed system, against different weather conditions.
The future work will be focused on implementing algorithms for recognizing and avoiding dynamic obstacles in the environment and the use of other sensors like a stereo camera or LiDAR, which will provide the UAV with rich information. Furthermore, the UAV’s hardware will be improved in order to obtain greater endurance.

Author Contributions

A.A.-K., Á.M. and S.C. conceived and designed the experiments; A.A.-K. and Á.M. performed the experiments; S.C., F.G. and D.M. analyzed the data; A.d.l.E. contributed reagents/materials/analysis tools; A.A.-K. and Á.M. wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the Comunidad de Madrid Government through the Industrial Doctorates Grants (Grant No. IND2017/TIC-7834).

Acknowledgments

The Intelligent Systems Laboratory thanks Telefonica Digital España, S.L.U., for its help and funding through the project: “Sistemas de UAV Autónomo para Supervisión de Incendios Forestales.”

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pajares, G. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–330.
  2. Erdos, D.; Erdos, A.; Watkins, S.E. An experimental UAV system for search and rescue challenge. IEEE Aerosp. Electron. Syst. Mag. 2013, 28, 32–37.
  3. Al-Kaff, A.; Gómez-Silva, M.J.; Moreno, F.M.; de la Escalera, A.; Armingol, J.M. An appearance-based tracking algorithm for aerial search and rescue purposes. Sensors 2019, 19, 652.
  4. Toriz, A.; Raygoza, M.; Martínez, D. Modelo de inclusión tecnológica UAV para la prevención de trabajos de alto riesgo, en industrias de la construcción basado en la metodología IVAS. Revista Iberoamericana de Automática e Informática Industrial 2017, 14, 94–103.
  5. San-Miguel-Ayanz, J.; Durrant, T.; Boca, R.; Libertà, G.; Branco, A.; de Rigo, D.; Ferrari, D.; Maianti, P.; Artes, T.; Costa, H.; et al. Forest Fires in Europe, Middle East and North Africa 2017; Joint Research Centre Technical Report; Publications Office of the European Union: Luxembourg, 2018.
  6. Loureiro, M.L.; Allo, M. Los Incendios Forestales Y Su Impacto Económico: Propuesta Para Una Agenda Investigadora. Revista Galega de Economía 2018, 27, 129–142.
  7. Wang, N.; Deng, Q.; Xie, G.; Pan, X. Hybrid finite-time trajectory tracking control of a quadrotor. ISA Trans. 2019, 90, 278–286.
  8. Wang, N.; Su, S.F.; Han, M.; Chen, W.H. Backpropagating constraints-based trajectory tracking control of a quadrotor with constrained actuator dynamics and complex unknowns. IEEE Trans. Syst. Man Cybern. Syst. 2018, 49, 1322–1337.
  9. Al-Kaff, A.; de La Escalera, A.; Armingol, J.M. Homography-based navigation system for unmanned aerial vehicles. In Proceedings of the International Conference on Advanced Concepts for Intelligent Vision Systems, Antwerp, Belgium, 18–21 September 2017; Springer: Berlin, Germany, 2017; pp. 288–300.
  10. Albani, D.; Nardi, D.; Trianni, V. Field coverage and weed mapping by UAV swarms. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 4319–4325.
  11. Lottes, P.; Khanna, R.; Pfeifer, J.; Siegwart, R.; Stachniss, C. UAV-based crop and weed classification for smart farming. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3024–3031.
  12. Ostovar, A.; Hellström, T.; Ringdahl, O. Human detection based on infrared images in forestry environments. In Proceedings of the International Conference on Image Analysis and Recognition, Póvoa de Varzim, Portugal, 13–15 July 2016; pp. 175–182.
  13. Yuan, C.; Liu, Z.; Zhang, Y. Vision-based forest fire detection in aerial images for firefighting using UAVs. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Washington, DC, USA, 7–10 June 2016; pp. 1200–1205.
  14. Yuan, C.; Liu, Z.; Zhang, Y. Aerial images-based forest fire detection for firefighting using optical remote sensing techniques and unmanned aerial vehicles. J. Intell. Robot. Syst. 2017, 88, 635–654.
  15. Harikumar, K.; Senthilnath, J.; Sundaram, S. Multi-UAV Oxyrrhis Marina-Inspired Search and Dynamic Formation Control for Forest Firefighting. IEEE Trans. Autom. Sci. Eng. 2018, 16, 863–873.
  16. Merino, L.; Caballero, F.; de Dios, J.R.M.; Maza, I.; Ollero, A. Automatic forest fire monitoring and measurement using unmanned aerial vehicles. In Proceedings of the 6th International Congress on Forest Fire Research, Coimbra, Portugal, 15–18 November 2010.
  17. Zharikova, M.; Sherstjuk, V. Forest Firefighting Monitoring System Based on UAV Team and Remote Sensing. In Automated Systems in the Aviation and Aerospace Industries; IGI Global: Pennsylvania, PA, USA, 2019; pp. 220–241.
  18. Couceiro, M.S.; Portugal, D.; Ferreira, J.F.; Rocha, R.P. SEMFIRE: Towards a new generation of forestry maintenance multi-robot systems. In Proceedings of the 2019 IEEE/SICE International Symposium on System Integration (SII), Paris, France, 14–16 January 2019; pp. 270–276.
  19. Qin, H.; Cui, J.Q.; Li, J.; Bi, Y.; Lan, M.; Shan, M.; Liu, W.; Wang, K.; Lin, F.; Zhang, Y.; et al. Design and implementation of an unmanned aerial vehicle for autonomous firefighting missions. In Proceedings of the 2016 12th IEEE International Conference on Control and Automation (ICCA), Kathmandu, Nepal, 1–3 June 2016; pp. 62–67.
  20. Imdoukh, A.; Shaker, A.; Al-Toukhy, A.; Kablaoui, D.; El-Abd, M. Semi-autonomous indoor firefighting UAV. In Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, 10 July 2017; pp. 310–315.
  21. Chen, K.; Sun, Q.; Zhou, A.; Wang, S. Adaptive Multiple Task Assignments for UAVs Using Discrete Particle Swarm Optimization. In Proceedings of the International Conference on Internet of Vehicles, Paris, France, 6–9 November 2018; pp. 220–229.
  22. Instituto Geográfico Nacional de España. Iberpix 4. 2018. Available online: https://www.ign.es/iberpix2/visor/ (accessed on 13 September 2019).
Figure 1. Share of the total burnt area (a) and the total number of fires (b) in each of the Southern Member States for 2017 [5].
Figure 2. Emergency Support (ES)-UAV.
Figure 3. PixHawk autopilot.
Figure 4. Thermal image.
Figure 5. Charging base.
Figure 6. System overview.
Figure 7. Surveillance system located in a telecommunication tower (Telefónica Digital España).
Figure 8. Safety take-off path [22].
Figure 9. Generated trajectory.
Figure 10. Generated trajectory.
Figure 11. User interface.
Figure 12. Experimental tests in adverse meteorological conditions.
Figure 13. Real trajectory.
Figure 14. Trajectory following.
Table 1. Number of fires in 2017 compared with the 10 year average [5].

                          Average 2007–2016    2017
Number of fires < 1 ha    8228                 8705
Number of fires ≥ 1 ha    4135                 5088
Total                     12,363               13,793
Table 2. Burnt area in 2017 compared with the 10 year average [5].

                                        Average 2007–2016    2017
Burnt area of other wooded land (ha)    27,226.41            66,839.02
Burnt area of forest (ha)               91,846.74            178,233.93

