Unmanned Aerial Vehicle Systems for Remote Estimation of Flooded Areas Based on Complex Image Processing
Abstract
1. Introduction
2. Instruments and Methods
2.1. UAV System
2.2. Trajectory Control
2.3. Image-Based Flood Detection System
- In the IPU, ortho-rectified images are created and then combined into a single image, without overlaps and without gaps (the orthophotoplan).
- From the orthophotoplan, adjacent cropped images of dimension 6000 × 4000 pixels are investigated for flood evaluation.
- A non-overlapping box decomposition of the tested image is made. A grid of boxes is thus created, and its dimension represents the resolution of the flood segmentation. Thus, if the image dimension is M × N pixels and the box dimension is d × d pixels, then the resolution of segmentation (BMP dimension) is (M/d) × (N/d); for the 6000 × 4000 cropped images and 50 × 50 boxes used here, this is 120 × 80 (a minimal sketch of this decomposition is given after this list).
- The flood segmentation is made by patch classification into two regions of interest (flood—F and non-flood—NF), taking into account the patch signatures and class representatives, which contain information about color and texture. As mentioned earlier, the process has two phases: the learning phase (for feature selection and parameter adjustment) and the mission phase (for flood detection, segmentation and evaluation). Flood evaluation is made for each cropped image and, finally, the sum of the partial results is calculated.
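A minimal Python sketch of the box decomposition, assuming NumPy arrays and illustrative names (not the authors' code): a 6000 × 4000 cropped image tiled into 50 × 50 boxes yields the 120 × 80 grid, i.e., 9600 patches per image, consistent with the patch counts reported in Section 3.

```python
import numpy as np

def decompose_into_boxes(image, box=50):
    """Non-overlapping box decomposition of an H x W (x C) image:
    yields grid coordinates (r, c) and the corresponding box."""
    h, w = image.shape[:2]
    rows, cols = h // box, w // box   # segmentation resolution (BMP dimension)
    for r in range(rows):
        for c in range(cols):
            yield r, c, image[r * box:(r + 1) * box, c * box:(c + 1) * box]

# A 6000 x 4000 cropped image (4000 rows x 6000 columns) gives an
# 80 x 120 grid, i.e., 9600 patches of 50 x 50 pixels each.
image = np.zeros((4000, 6000, 3), dtype=np.uint8)     # placeholder image
bmp = np.zeros((4000 // 50, 6000 // 50), dtype=bool)  # binary flood map
for r, c, patch in decompose_into_boxes(image):
    bmp[r, c] = False  # to be filled by the patch classifier (F/NF)
```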
2.3.1. Learning Phase
- Ti is calculated for all the learning patches from PF and the confidence interval Ii = [mi − 3σi, mi + 3σi] is determined, where mi and σi represent, respectively, the mean and the standard deviation of Ti over these patches.
- Similarly, Ti is calculated for all the learning patches from PNF and the resulting set of values is denoted NFi.
- A confidence indicator for the feature Ti is created (11): CIi = 1 − ni/N, where ni is the number of values from NFi falling inside Ii and N is the total number of non-flood learning patches.
- The features Ti with the greatest CIi are selected, in decreasing order, until the fixed number of features imposed for the flood signature is reached. For example, in Section 3, a signature T with six elements is considered (12): T = {T1, …, T6} = {ImR, HomHH, ConHH, EnHS, DmG, LR}.
- As a consequence of the signature T, a set of confidence intervals is created (13): I = {I1, …, I6}. This set, RF = I, will be the representative of the class F (14).
- For each selected feature Ti a weight wi is calculated as follows. Another set of 100 patches (50 flood and 50 non-flood) is considered, and the confusion matrix CMi for the feature Ti is calculated based on a preliminary classification criterion: the patch B is assigned to F if Ti(B) ∈ Ii, and to NF otherwise. The weight wi, Equations (15)–(17), is in essence the fraction of these 100 patches correctly classified by Ti alone (a sketch of these computations is given after this list).
- Obviously, wi = 1 (an error-free confusion matrix) represents an ideal situation and such values are not encountered in practice.
- If CMi = CMj, then Ti and Tj are redundant and one of them can be eliminated.
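The learning-phase statistics can be summarized in a short sketch. This is a reconstruction under the assumptions above (3σ confidence intervals, CIi = 1 − ni/N, and accuracy-based weights); the function names are illustrative, not the authors' implementation.

```python
import numpy as np

def confidence_interval(f_values, k=3.0):
    """Ii = [mi - 3*sigma_i, mi + 3*sigma_i] over the flood patches PF."""
    m, s = float(np.mean(f_values)), float(np.std(f_values))
    return m - k * s, m + k * s

def confidence_indicator(nf_values, interval):
    """CIi = 1 - ni/N, where ni counts non-flood values falling inside Ii."""
    lo, hi = interval
    nf = np.asarray(nf_values)
    ni = int(np.sum((nf >= lo) & (nf <= hi)))
    return 1.0 - ni / len(nf)

def weight(f_values, nf_values, interval):
    """wi as the accuracy of the single-feature criterion Ti(B) in Ii
    on the second learning set (50 flood + 50 non-flood patches)."""
    lo, hi = interval
    f, nf = np.asarray(f_values), np.asarray(nf_values)
    tp = np.sum((f >= lo) & (f <= hi))   # flood patches inside Ii
    tn = np.sum((nf < lo) | (nf > hi))   # non-flood patches outside Ii
    return float(tp + tn) / (len(f) + len(nf))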
2.3.2. Mission Phase
2.3.3. Algorithm for Flood Detection
Algorithm 1: Learning Phase

For each patch of the first set:
1. Image decomposition on color channels (R, G, B, H, S, V) of patches;
2. Reject noise with a local median filter;
3. Calculate the features Im, Con, En, Hom, Ent, Var, Dm and L on the color channels;
4. Until end of set 1;
5. Calculate the confidence indicator CIi for each feature based on Equation (11);
6. Feature selection: Ti, i = 1, …, 6;
7. Determine the intervals for the flood class representative RF, Equations (13) and (14);
For each Ti:
8. Calculate the confusion matrix CMi from set 2;
9. Calculate the weights wi, i = 1, …, 6, Equations (15)–(17);
10. Return {Ti, wi}.
Algorithm 2: Classification Phase

For each image Ii:
1. Image decomposition into small non-overlapping patches (50 × 50 pixels);
For each patch B:
2. Calculate the selected features ImR, ConHH, EnHS, HomHH, DmG and LR;
3. Calculate Di(B);
4. Patch classification based on the voting scheme (18);
5. Until end of patches from image Ii;
6. Create the matrix of patches for each feature;
7. Noise rejection based on a local median filter in the matrices of patches;
8. Create the final matrix of patches based on the voting scheme;
9. Create the segmented image;
10. Calculate the percentage of flooded area in the image with Equation (19);
11. Until end of images to be analyzed;
12. Return the segmented images and the percentage of flooded area.
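The classification steps above can be condensed into a short sketch. This is a reconstruction under the assumptions used throughout this section — each selected feature votes with its weight wi when Ti(B) falls inside Ii, D(B) is the weighted vote sum, and a patch is labeled F when D(B) ≥ 0.8 Σwi, the 0.8 × 5.59 threshold used in the experimental tables; function names are illustrative.

```python
import numpy as np

def classify_patch(feature_values, intervals, weights, ratio=0.8):
    """Voting scheme (18): D(B) = sum of wi over the features with Ti(B)
    inside Ii; the patch is flood (F) if D(B) >= ratio * sum(wi)."""
    d = sum(w for t, (lo, hi), w in zip(feature_values, intervals, weights)
            if lo <= t <= hi)
    label = "F" if d >= ratio * sum(weights) else "NF"
    return label, d

def flood_percent(bmp):
    """Equation (19) as the percentage of flooded patches in the
    binary map of patches."""
    bmp = np.asarray(bmp)
    return 100.0 * bmp.sum() / bmp.size

# With the six weights above (sum 5.59), the threshold is 0.8 * 5.59 = 4.47,
# so a patch with only partial votes (e.g., D(B) = 2.84) is classified NF.
```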
3. Experimental Results
4. Discussion
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A. The Pseudo Code (Matlab) for the Proposed Algorithm for Computing CCM between Spectral Components A and B
BEGIN Compute(A, B, xdimension, ydimension, n)
  A = normalize(A, n);
  B = normalize(B, n);
  initialize result as an n × n array of 0;
  bExtended = extendMatrix(B, xdimension, ydimension);
  offset = calculateOffset([xdimension, ydimension]);
  for i = 0 to n
    for j = 0 to n
      positions = searchAppearences(A, i);
      if positions != null
        for k = 0 to positions[0].length
          bRow = positions[0][k] + offset[1] + xdimension;
          bCol = positions[1][k] + offset[2] + ydimension;
          if bExtended[bRow][bCol] == j
            result[i][j]++;
  return result;
END

BEGIN calculateOffset(offsetIn)
  for i = 1 to offsetIn.length
    if offsetIn(i) >= 0
      offsetOut(i) = 0;
    else
      offsetOut(i) = abs(offsetIn(i));
  return offsetOut;
END

BEGIN searchAppearences(A, x)
  count = countAppearences(A, x);
  if count == 0
    return null;
  initialize positions as a 2 × count array;
  k = 0;
  for i = 0 to A.length
    for j = 0 to A[0].length
      if A[i][j] == x
        positions[0][k] = i;
        positions[1][k] = j;
        k++;
  return positions;
END

BEGIN countAppearences(A, x)
  count = 0;
  for i = 0 to A.length
    for j = 0 to A[0].length
      if A[i][j] == x
        count++;
  return count;
END

BEGIN extendMatrix(A, noRows, noCols)
  initialize result as an array (A.length + abs(noRows)) × (A[0].length + abs(noCols));
  offset = calculateOffset([noRows, noCols]);
  for i = 0 + offset[1] to A.length + offset[1]
    for j = 0 + offset[2] to A[0].length + offset[2]
      result[i][j] = A[i − offset[1]][j − offset[2]];
  for i = A.length to A.length + noRows
    for k = 0 to A[0].length + noCols
      result[i][k] = −1;
  for i = A[0].length to A[0].length + noCols
    for k = 0 to A.length + noRows
      result[k][i] = −1;
  return result;
END
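For comparison, a vectorized NumPy sketch of the same computation (my reading of the pseudocode above; the function name and the valid-region restriction, which replaces the −1 padding, are my own choices):

```python
import numpy as np

def ccm(a, b, dx, dy, n):
    """Chromatic co-occurrence matrix between spectral components a and b:
    ccm[i, j] counts pixels where a == i and b, displaced by (dx, dy), == j.
    a and b are 2-D arrays already quantized to levels 0..n-1."""
    h, w = a.shape
    result = np.zeros((n, n), dtype=np.int64)
    # Restrict to the region where the displaced pixel stays inside b
    # (equivalent to padding b with -1, which can never match a level j).
    r0, r1 = max(0, -dx), min(h, h - dx)
    c0, c1 = max(0, -dy), min(w, w - dy)
    src = a[r0:r1, c0:c1]
    dst = b[r0 + dx:r1 + dx, c0 + dy:c1 + dy]
    np.add.at(result, (src.ravel(), dst.ravel()), 1)
    return result
```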
Abbreviation/Module Name | Function |
---|---|
FMCU Flight and Mission Control Unit | -Coordinates the flight mission; -Provides the platform's stability and a quick response to disturbances that may deflect the drone from its pre-defined route or push it out of the flight envelope; -Allows for manual piloting by an operator on the ground; -Implements the automated low-level control loops which assure path tracking. |
AHRS Attitude and Heading Reference System | -Provides information for an autonomous flight; -Contains the sensor subsystem composed of static and dynamic pressure sensors for speed measurement (ADXL352), accelerometer (ASDXRRX005PD2A5), magnetometer (HMC5983), altimeter (MPL3115A2) and gyroscope (ADXRS450); -Data provided by AHRS are used by FMCU. |
SU Safety Unit | -Assures the permanent monitoring of the signals sent by other units and interprets the error signals received; -Taking into account the fault-tree and the reported error, the SU may decide the future functioning of the UAV. Thus, it can decide to continue the mission, to return to the launching point or the designated retrieval point, or, as a last step, to deploy the parachute. |
PU Power Unit | -Assures the electrical power to the other components of the UAV, especially to the propulsion motor; -Contains power sources and a storage balance sensor used to equilibrate the energy consumed. |
VD Video Datalink | -Sends video data from the camera (PS) to the ground (via the GDT, to the GCS). It contains a modem RF (TXR) and a power amplifier RFA. |
TD Telemetric Datalink | -Assures a duplex communication for both transmission and reception of telemetry data. It has a structure similar to the VD. |
Payload Working load (payload) | -Has a dedicated CPU for the retractable device; -Provides high-resolution imagery or HD video; -Based on a gyro-stabilized mechanism. |
GDT Ground Data Terminal | -Antenna based tracking system; -The operational range is extended by using multiple ground data terminals; -Radio and Internet connections. |
GCS C Ground Control Station Coordinator | -Is the main component of the system; -Has a friendly user interface for operational purposes; -Internet connection with GDTs and IPU. |
GCS L Local Ground Control Station | -Optional; -Transfers the control of each UAV to the operational field. |
CSU Control for Servomotor Unit | -Ensures the control of the electric actuators; -Provides a feedback on their state. |
CRU Control Radio Unit | -Ensures the radio data transmission to and from GDT: telemetry, video/images and control. |
DESP Data Exchange & Signal Processing | -Data exchange between GCS and UAV via GDT; -Encoding/decoding of video data; -Interface with Ethernet IP (ETH). |
SPTU Servo Pan Tilt Unit | -Transmission of control to the payload servomotors. |
PFCT PC for Flight Control and Telemetry | -Is the main module of GCS and is based on a CPU. |
ETH Switch Ethernet | -Ensures data transmission over long distances. |
RC Radio Control | -Ensures the control transmission to the GDT. |
LL Launcher Link | -Ensures the interface of GCS with the launcher; -Transmits the launch command. |
SL Safety Launcher Module | -Assures the start of UAV propulsion, if the speed launch is correct. |
IPU Image Processing Unit | -Processes the images for flood detection; -Estimates the size of flooded areas. |
ORT Ortho-rectified module | -Creates the ortho-rectified images. |
PLAN Ortho-photoplan module | -Creates the ortho-photoplan. |
LP Learning module | -Establishes the patches for feature selection; -Establishes the class representatives and features for patch signatures. |
CP Classification module | -Divides the image into patches; -Classifies the patches as flood or non-flood. |
DE Flood detection and estimation module | -Creates the segmented images; -Estimates the flooded area (in percent). |
WiFi Module for WiFi communication | -Assures WiFi communication. |
Feature | Definition |
---|---|
Energy | En = Σi,j p(i,j)² |
Contrast | Con = Σi,j (i − j)² p(i,j) |
Entropy | Ent = −Σi,j p(i,j) log p(i,j) |
Correlation | Cor = Σi,j (i − μi)(j − μj) p(i,j)/(σi σj) |
Homogeneity | Hom = Σi,j p(i,j)/(1 + |i − j|) |
Mean intensity | Im = mean of the channel values over the patch |
Variance | Var = Σi,j (i − μi)² p(i,j) |
LBP Histogram | histogram of local binary patterns over the patch |
Mass fractal dimension | Dm, estimated by differential box counting |
Lacunarity | L, estimated by the gliding-box method |

Here p(i, j) denotes the normalized (chromatic) co-occurrence matrix entry, and μi, μj, σi, σj its marginal means and standard deviations (the standard Haralick definitions).
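For concreteness, the co-occurrence-based features in the table can be computed from a count matrix as in the following sketch (standard Haralick definitions; the NumPy names are illustrative, and the LBP, fractal-dimension and lacunarity features are omitted):

```python
import numpy as np

def haralick_features(ccm):
    """Standard GLCM/CCM statistics; ccm is a square count matrix."""
    p = ccm / ccm.sum()                 # joint probability p(i, j)
    i, j = np.indices(p.shape)
    mu_i, mu_j = np.sum(i * p), np.sum(j * p)
    si = np.sqrt(np.sum((i - mu_i) ** 2 * p))
    sj = np.sqrt(np.sum((j - mu_j) ** 2 * p))
    eps = np.finfo(float).eps           # guards log(0) and zero variance
    return {
        "energy":      np.sum(p ** 2),
        "contrast":    np.sum((i - j) ** 2 * p),
        "entropy":     -np.sum(p * np.log2(p + eps)),
        "correlation": np.sum((i - mu_i) * (j - mu_j) * p) / (si * sj + eps),
        "homogeneity": np.sum(p / (1.0 + np.abs(i - j))),
        "variance":    np.sum((i - mu_i) ** 2 * p),
    }
```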
Characteristics | Technical Specifications |
---|---|
Propulsion | Electric |
Weight | 15 kg |
Wingspan | 4 m |
Endurance | 120 min |
Operating range | 15 km in classical regime and 30 km in autopilot regime |
Navigation support | GIS |
Navigation | manual/automatic |
Communication | antenna tracking system |
Payload | retractable and gyro-stabilized |
Mission | Planning software |
Recovery system | Parachute |
Maximum speed | 120 km/h |
Cruise speed | 70 km/h |
Maximum altitude | 3000 m |
Maximum camera weight | 1 kg |
Camera type | Sony Nex7, objective 50 mm, 24.3 megapixels, 10 fps |
Parameters for flood detection | Flight speed of 70 km/h and flight level 300 m |
Typical applications | Monitoring of critical infrastructures, reconnaissance missions over the areas affected by calamities (floods, earthquakes, fires, accidents, etc.), camera tracking, photography and cartography |
Patch | ImR | HomHH | ConHH | EnHS | DmG | LR |
---|---|---|---|---|---|---|
P1_F | 0.460 | 0.999 | 0.001 | 0.916 | 2.667 | 0.445 |
P2_F | 0.472 | 0.997 | 0.003 | 0.921 | 2.690 | 0.432 |
P3_F | 0.484 | 0.998 | 0.001 | 0.911 | 2.665 | 0.387 |
P4_F | 0.504 | 0.998 | 0.007 | 0.932 | 2.641 | 0.455 |
P5_F | 0.478 | 0.999 | 0.007 | 0.926 | 2.668 | 0.485 |
P6_F | 0.488 | 0.996 | 0.001 | 0.919 | 2.643 | 0.415 |
P7_F | 0.475 | 0.996 | 0.008 | 0.915 | 2.639 | 0.395 |
P8_F | 0.485 | 0.997 | 0.002 | 0.912 | 2.635 | 0.401 |
P9_F | 0.506 | 0.999 | 0.001 | 0.928 | 2.664 | 0.413 |
P10_F | 0.443 | 0.998 | 0.001 | 0.926 | 2.671 | 0.398 |
P11_F | 0.433 | 0.995 | 0.001 | 0.934 | 2.648 | 0.446 |
P12_F | 0.486 | 0.997 | 0.002 | 0.924 | 2.685 | 0.432 |
P13_F | 0.479 | 0.999 | 0.003 | 0.909 | 2.645 | 0.457 |
P14_F | 0.502 | 0.996 | 0.003 | 0.914 | 2.654 | 0.395 |
P15_F | 0.477 | 0.996 | 0.001 | 0.921 | 2.675 | 0.438 |
P16_F | 0.491 | 0.997 | 0.002 | 0.929 | 2.632 | 0.442 |
P17_F | 0.465 | 0.999 | 0.007 | 0.941 | 2.643 | 0.413 |
P18_F | 0.451 | 0.998 | 0.005 | 0.937 | 2.642 | 0.428 |
P19_F | 0.462 | 1.000 | 0.006 | 0.938 | 2.685 | 0.391 |
P20_F | 0.498 | 0.999 | 0.004 | 0.917 | 2.650 | 0.394 |
Mean mi | 0.476 | 0.997 | 0.003 | 0.923 | 2.635 | 0.423 |
Interval Ii | [0.418; 0.535] | [0.994; 1.002] | [−0.004; 0.011] | [0.896; 0.951] | [2.605; 2.709] | [0.344; 0.502] |
P1_NF | 0.161 | 0.195 | 0.392 | 0.415 | 2.601 | 0.177 |
P2_NF | 0.302 | 0.176 | 0.591 | 0.580 | 2.581 | 0.182 |
P3_NF | 0.226 | 0.187 | 0.560 | 0.602 | 2.592 | 0.164 |
P4_NF | 0.201 | 0.588 | 0.621 | 0.604 | 2.557 | 0.161 |
P5_NF | 0.241 | 0.576 | 0.399 | 0.424 | 2.569 | 0.345 * |
P6_NF | 0.151 | 0.192 | 0.581 | 0.522 | 2.590 | 0.194 |
P7_NF | 0.160 | 0.184 | 0.395 | 0.589 | 2.583 | 0.176 |
P8_NF | 0.215 | 0.177 | 0.581 | 0.449 | 2.596 | 0.167 |
P9_NF | 0.210 | 0.583 | 0.632 | 0.608 | 2.562 | 0.155 |
P10_NF | 0.151 | 0.593 | 0.481 | 0.625 | 2.568 | 0.174 |
P11_NF | 0.356 | 0.192 | 0.492 | 0.519 | 2.656 * | 0.255 |
P12_NF | 0.152 | 0.201 | 0.353 | 0.450 | 2.592 | 0.162 |
P13_NF | 0.169 | 0.171 | 0.372 | 0.561 | 2.590 | 0.175 |
P14_NF | 0.211 | 0.581 | 0.367 | 0.382 | 2.577 | 0.145 |
P15_NF | 0.205 | 0.544 | 0.624 | 0.613 | 2.573 | 0.198 |
P16_NF | 0.174 | 0.193 | 0.368 | 0.402 | 2.590 | 0.207 |
P17_NF | 0.195 | 0.576 | 0.634 | 0.634 | 2.562 | 0.184 |
P18_NF | 0.382 | 0.476 | 0.587 | 0.596 | 2.606 * | 0.195 |
P19_NF | 0.421 * | 0.425 | 0.456 | 0.545 | 2.584 | 0.198 |
P20_NF | 0.203 | 0.543 | 0.429 | 0.512 | 2.597 | 0.178 |
ni (NF values inside Ii) | 1 | 0 | 0 | 0 | 2 | 1 |
N (NF patches) | 20 | 20 | 20 | 20 | 20 | 20 |
CI | 0.95 | 1 | 1 | 1 | 0.90 | 0.95 |

* Value falling inside the corresponding flood confidence interval Ii.
ImR = T1 | HomHH = T2 | ConHH = T3 | EnHS = T4 | DmG = T5 | LR = T6 |
---|---|---|---|---|---|
w1 = 0.91 | w2 = 0.93 | w3 = 0.96 | w4 = 0.97 | w5 = 0.88 | w6 = 0.94 |
Patch (Actual) | ImR/D1(B) | HomHH/D2(B) | ConHH/D3(B) | EnHS/D4(B) | DmG/D5(B) | LR/D6(B) | D(B)/F,NF (T = 0.8 × 5.59 = 4.47) |
---|---|---|---|---|---|---|---|
B1_F | 0.494/0.91 | 0.996/0.93 | 0.001/0.96 | 0.942/0.97 | 2.661/0.88 | 0.372/0.94 | 5.59/F |
B2_F | 0.506/0.91 | 0.998/0.93 | 0.003/0.96 | 0.934/0.97 | 2.637/0.88 | 0.421/0.94 | 5.59/F |
B3_F | 0.457/0.91 | 0.999/0.93 | 0.006/0.96 | 0.961/0.97 | 2.643/0.88 | 0.446/0.94 | 5.59/F |
B4_F | 0.464/0.91 | 0.999/0.93 | 0.005/0.96 | 0.916/0.97 | 2.701/0.88 | 0.497/0.94 | 5.59/F |
B5_F | 0.515/0.91 | 0.997/0.93 | 0.004/0.96 | 0.952/0.97 | 2.621/0.88 | 0.480/0.94 | 5.59/F |
B6_F | 0.398/0 | 0.995/0.93 | 0.021/0 | 0.899/0.97 | 2.587/0 | 0.346/0.94 | 2.84/NF |
B7_F | 0.437/0.91 | 0.998/0.93 | 0.003/0.96 | 0.919/0.97 | 2.678/0.88 | 0.405/0.94 | 5.59/F |
B8_F | 0.493/0.91 | 0.997/0.93 | 0.004/0.96 | 0.931/0.97 | 2.671/0.88 | 0.417/0.94 | 5.59/F |
B9_F | 0.476/0.91 | 0.995/0.93 | 0.003/0.96 | 0.915/0.97 | 2.682/0.88 | 0.482/0.94 | 5.59/F |
B10_F | 0.350/0 | 0.992/0 | 0.013/0 | 0.850/0 | 2.623/0.88 | 0.321/0 | 0.88/NF |
B1_NF | 0.172/0 | 0.204/0 | 0.387/0 | 0.423/0 | 2.599/0 | 0.167/0 | 0/NF |
B2_NF | 0.137/0 | 0.189/0 | 0.582/0 | 0.502/0 | 2.579/0 | 0.202/0 | 0/NF |
B3_NF | 0.224/0 | 0.526/0 | 0.353/0 | 0.412/0 | 2.564/0 | 0.327/0 | 0/NF |
B4_NF | 0.198/0 | 0.537/0 | 0.624/0 | 0.623/0 | 2.538/0 | 0.211/0 | 0/NF |
B5_NF | 0.249/0 | 0.592/0 | 0.617/0 | 0.589/0 | 2.521/0 | 0.149/0 | 0/NF |
B6_NF | 0.335/0 | 0.213/0 | 0.457/0 | 0.501/0 | 2.599/0 | 0.268/0 | 0/NF |
B7_NF | 0.186/0 | 0.555/0 | 0.602/0 | 0.654/0 | 2.556/0 | 0.172/0 | 0/NF |
B8_NF | 0.139/0 | 0.185/0 | 0.366/0 | 0.573/0 | 2.572/0 | 0.161/0 | 0/NF |
B9_NF | 0.231/0 | 0.593/0 | 0.401/0 | 0.438/0 | 2.569/0 | 0.339/0 | 0/NF |
B10_NF | 0.391/0 | 0.821/0 | 0.009/0.96 | 0.722/0 | 2.651/0.88 | 0.311/0 | 1.84/NF |
TP | TN | FP | FN | Sensitivity | Specificity | Accuracy |
---|---|---|---|---|---|---|
486 | 495 | 5 | 14 | 97.2% | 99% | 98.1% |
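These metrics follow directly from the counts over the 1000 test patches: sensitivity = TP/(TP + FN) = 486/(486 + 14) = 97.2%; specificity = TN/(TN + FP) = 495/(495 + 5) = 99%; accuracy = (TP + TN)/1000 = 981/1000 = 98.1%.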
Images | IS1 | IS2 | IS3 | IS4 | IS5 | IS6 |
---|---|---|---|---|---|---|
Percent | 32.88 | 32.79 | 16.85 | 28.07 | 21.57 | 2.44 |
No. of flooded patches | 3156 | 3148 | 1617 | 2695 | 2071 | 234 |