Deep Green Diagnostics: Urban Green Space Analysis Using Deep Learning and Drone Images
Abstract
1. Introduction
2. Related Work
3. Deep Green Diagnostics (DGD)
- Geotagged image dataset. Users fly a UAV about 30 m above ground level to capture images of the terrain to be analyzed; each image carries its georeference in its metadata, and together they constitute the dataset used to train the Deep Neural Network. See Section 3.1 for more details.
- Deep Neural Network. Users feed the previously obtained dataset into the system, which processes it using the architecture described in Section 3.2. As output, the system classifies each image into one of the eight classes shown in Figure 3.
- Visualization and retraining. In this stage, users can visualize the classification results and, if desired, retrain the pre-loaded DNN with a new dataset. Details are presented in Section 3.3.
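The three stages above can be sketched as a simple pipeline. This is a hypothetical illustration, not the authors' code: `run_pipeline` and `classify` are made-up names, and the stub classifier stands in for the trained DNN; only the eight class labels come from the paper.

```python
# Hypothetical sketch of the DGD workflow: (1) collect geotagged images,
# (2) classify each one with a pre-trained network, (3) tally per-class
# counts for visualization. The classify() argument stands in for the
# real DNN described in Section 3.2.
from collections import Counter
from typing import Callable, Dict, List, Tuple

# The eight classes used by the paper's classifier (Figure 3).
CLASSES = ["H", "D", "UNH", "NV", "HC", "DC", "UNHC", "HVC"]

GeoImage = Tuple[float, float, bytes]  # (latitude, longitude, raw pixels)

def run_pipeline(images: List[GeoImage],
                 classify: Callable[[bytes], str]) -> Dict[str, int]:
    """Classify every geotagged image and return per-class counts,
    which later feed the map visualization stage."""
    counts: Counter = Counter()
    for _lat, _lon, pixels in images:
        label = classify(pixels)
        if label not in CLASSES:
            raise ValueError(f"unknown class: {label}")
        counts[label] += 1
    return dict(counts)

# Illustration only: two dummy geotagged images and a constant classifier.
demo = [(19.50, -99.14, b"..."), (19.51, -99.15, b"...")]
print(run_pipeline(demo, classify=lambda _pixels: "H"))  # {'H': 2}
```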
3.1. Geotagged Image Dataset
3.2. Deep Neural Network
Model Training and Validation
3.3. Visualization and Retraining
3.4. Full Map Display
4. Discussion
4.1. Model’s Performance
4.2. Software
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A. Full Maps
References
- O’Mara, F.P. The role of grasslands in food security and climate change. Ann. Bot. 2012, 110, 1263–1270. [Google Scholar] [CrossRef]
- Jacobs, S.W.; Whalley, R.; Wheeler, D.J. Grasses of New South Wales; University of New England Botany: Armidale, Australia, 2008. [Google Scholar]
- Angold, P.; Sadler, J.P.; Hill, M.O.; Pullin, A.; Rushton, S.; Austin, K.; Small, E.; Wood, B.; Wadsworth, R.; Sanderson, R.; et al. Biodiversity in urban habitat patches. Sci. Total Environ. 2006, 360, 196–204. [Google Scholar] [CrossRef]
- Wolch, J.R.; Byrne, J.; Newell, J.P. Urban green space, public health, and environmental justice: The challenge of making cities ‘just green enough’. Landscape Urban Plann. 2014, 125, 234–244. [Google Scholar] [CrossRef]
- Abraham, A.; Sommerhalder, K.; Abel, T. Landscape and well-being: A scoping study on the health-promoting impact of outdoor environments. Int. J. Publ. Health 2010, 55, 59–69. [Google Scholar] [CrossRef] [PubMed]
- Kondo, M.; Fluehr, J.; McKeon, T.; Branas, C. Urban green space and its impact on human health. Int. J. Environ. Res. Publ. Health 2018, 15, 445. [Google Scholar] [CrossRef] [PubMed]
- James, P.; Hart, J.E.; Banay, R.F.; Laden, F. Exposure to greenness and mortality in a nationwide prospective cohort study of women. Environ. Health Perspect. 2016, 124, 1344–1352. [Google Scholar] [CrossRef] [PubMed]
- Dadvand, P.; Nieuwenhuijsen, M.J.; Esnaola, M.; Forns, J.; Basagaña, X.; Alvarez-Pedrerol, M.; Rivas, I.; López-Vicente, M.; Pascual, M.D.C.; Su, J.; et al. Green spaces and cognitive development in primary schoolchildren. Proc. Natl. Acad. Sci. USA 2015, 112, 7937–7942. [Google Scholar] [CrossRef]
- Crouse, D.L.; Pinault, L.; Balram, A.; Hystad, P.; Peters, P.A.; Chen, H.; van Donkelaar, A.; Martin, R.V.; Ménard, R.; Robichaud, A.; et al. Urban greenness and mortality in Canada’s largest cities: A national cohort study. Lancet Planet. Health 2017, 1, e289–e297. [Google Scholar] [CrossRef]
- Paquet, C.; Coffee, N.T.; Haren, M.T.; Howard, N.J.; Adams, R.J.; Taylor, A.W.; Daniel, M. Food environment, walkability, and public open spaces are associated with incident development of cardio-metabolic risk factors in a biomedical cohort. Health Place 2014, 28, 173–176. [Google Scholar] [CrossRef]
- Abbasi, M.; El Hanandeh, A. Forecasting municipal solid waste generation using artificial intelligence modelling approaches. Waste Manag. 2016, 56, 13–22. [Google Scholar] [CrossRef]
- Souza, J.T.D.; Francisco, A.C.D.; Piekarski, C.M.; Prado, G.F.D. Data mining and machine learning to promote smart cities: A systematic review from 2000 to 2018. Sustainability 2019, 11, 1077. [Google Scholar] [CrossRef]
- Rodriguez Lopez, J.M.; Rosso, P.; Scheffran, J.; Delgado Ramos, G.C. Remote sensing of sustainable rural-urban land use in Mexico City: A qualitative analysis for reliability and validity. Interdisciplina 2015, 3, 321–342. [Google Scholar] [CrossRef]
- Trimble. eCognition Software. 2019. Available online: http://www.ecognition.com/ (accessed on 9 February 2019).
- PAOT de la CDMX. Sistema de Información del Patrimonio Ambiental y Urbano de la CDMX. 2014. Available online: http://200.38.34.15:8008/mapguide/sig/siginterno.php (accessed on 9 February 2019).
- Ševo, I.; Avramović, A. Convolutional neural network based automatic object detection on aerial images. IEEE Geosci. Remote Sens. Lett. 2016, 13, 740–744. [Google Scholar] [CrossRef]
- Lu, H.; Fu, X.; Liu, C.; Li, L.G.; He, Y.X.; Li, N.W. Cultivated land information extraction in UAV imagery based on deep convolutional neural network and transfer learning. J. Mt. Sci. 2017, 14, 731–741. [Google Scholar] [CrossRef]
- Do, D.; Pham, F.; Raheja, A.; Bhandari, S. Machine learning techniques for the assessment of citrus plant health using UAV-based digital images. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III, Orlando, FL, USA, 15–19 April 2018; p. 106640O. [Google Scholar]
- Phan, C.; Raheja, A.; Bhandari, S.; Green, R.L.; Do, D. A predictive model for turfgrass color and quality evaluation using deep learning and UAV imageries. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II, Anaheim, CA, USA, 9–13 April 2017; p. 102180H. [Google Scholar]
- Ding, K.; Raheja, A.; Bhandari, S.; Green, R.L. Application of machine learning for the evaluation of turfgrass plots using aerial images. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, Baltimore, MD, USA, 17–21 April 2016; p. 98660I. [Google Scholar]
- Feng, Q.; Liu, J.; Gong, J. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef]
- Dos Santos Ferreira, A.; Freitas, D.M.; da Silva, G.G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324. [Google Scholar] [CrossRef]
- DJI. Phantom 4. 2019. Available online: https://www.dji.com/mx/phantom-4 (accessed on 28 August 2019).
- Vargas, I. Deep Green Diagnostics, Mendeley Data. 2019. Available online: http://dx.doi.org/10.17632/dn8rj26kzm.4 (accessed on 26 March 2019).
- LeCun, Y.; Bengio, Y. Convolutional networks for images, speech, and time series. Handb. Brain Theor. Neural Netw. 1995, 3361, 1995. [Google Scholar]
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
- Sokolova, M.; Lapalme, G. A systematic analysis of performance measures for classification tasks. Inf. Process. Manag. 2009, 45, 427–437. [Google Scholar] [CrossRef]
- Lopez, A. Deep Green Diagnostics, Github. 2019. Available online: https://github.com/AnayantzinPao/DeepGreenDiagnostics (accessed on 10 September 2019).
- Moreno-Armendariz, M.A. Deep Green Diagnostics Video, Youtube. 2019. Available online: https://youtu.be/OnOQ8g0cAfc (accessed on 10 September 2019).
- Xu, G.; Zhu, X.; Fu, D.; Dong, J.; Xiao, X. Automatic land cover classification of geo-tagged field photos by deep learning. Environ. Modell. Softw. 2017, 91, 127–134. [Google Scholar] [CrossRef]
- Zhang, F.; Zhou, B.; Liu, L.; Liu, Y.; Fung, H.H.; Lin, H.; Ratti, C. Measuring human perceptions of a large-scale urban region using machine learning. Landscape Urban Plann. 2018, 180, 148–160. [Google Scholar] [CrossRef]
- Bengio, Y. Deep learning of representations for unsupervised and transfer learning. In Proceedings of the 2011 International Conference on Unsupervised and Transfer Learning Workshop, Bellevue, WA, USA, 2 July 2011; pp. 17–36. [Google Scholar]
- Ren, R.; Hung, G.G.; Tan, K.C. A Generic Deep-Learning-Based Approach for Automated Surface Inspection. IEEE Trans. Cybern. 2018, 48, 929–940. [Google Scholar] [CrossRef] [PubMed]
- González-Jaramillo, V.; Fries, A.; Bendix, J. AGB Estimation in a Tropical Mountain Forest (TMF) by Means of RGB and Multispectral Images Using an Unmanned Aerial Vehicle (UAV). Remote Sens. 2019, 11, 1413. [Google Scholar] [CrossRef]
- Xu, G.; Zhu, X.; Tapper, N.; Bechtel, B. Urban climate zone classification using convolutional neural network and ground-level images. Prog. Phys. Geogr. Earth Environ. 2019. [Google Scholar] [CrossRef]
- Kitano, B.T.; Mendes, C.C.; Geus, A.R.; Oliveira, H.C.; Souza, J.R. Corn Plant Counting Using Deep Learning and UAV Images. IEEE Geosci. Remote Sens. Lett. 2019. [Google Scholar] [CrossRef]
- Kinaneva, D.; Hristov, G.; Raychev, J.; Zahariev, P. Early Forest Fire Detection Using Drones and Artificial Intelligence. In Proceedings of the 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019; pp. 1060–1065. [Google Scholar]
- Ojeda, L.; Borenstein, J.; Witus, G.; Karlsen, R. Terrain characterization and classification with a mobile robot. J. Field Rob. 2006, 23, 103–122. [Google Scholar] [CrossRef]
- Pellenz, J.; Lang, D.; Neuhaus, F.; Paulus, D. Real-time 3d mapping of rough terrain: A field report from disaster city. In Proceedings of the 2010 IEEE Safety Security and Rescue Robotics, Bremen, Germany, 26–30 July 2010; pp. 1–6. [Google Scholar]
- Jin, Y.Q.; Wang, D. Automatic detection of terrain surface changes after Wenchuan earthquake, May 2008, from ALOS SAR images using 2EM-MRF method. IEEE Geosci. Remote Sens. Lett. 2009, 6, 344–348. [Google Scholar]
- Sumfleth, K.; Duttmann, R. Prediction of soil property distribution in paddy soil landscapes using terrain data and satellite information as indicators. Ecol. Indic. 2008, 8, 485–501. [Google Scholar] [CrossRef]
- Middel, A.; Lukasczyk, J.; Zakrzewski, S.; Arnold, M.; Maciejewski, R. Urban form and composition of street canyons: A human-centric big data and deep learning approach. Landscape Urban Plann. 2019, 183, 122–132. [Google Scholar] [CrossRef]
- Atfarm. Precise Fertilisation Made Simple. 2019. Available online: https://www.at.farm/ (accessed on 10 September 2019).
| General Parameters | Values |
|---|---|
| Batch size | 20 |
| Images for training | 9000 |
| Images for testing | 901 |
| Number of iterations | 200 |
| Activation function | ReLU |
| Loss function | Cross entropy |
| Optimization method | Adam |
| Regularization | None |
| Learning rate | |
| **Parameters of CNN** | |
| Pooling layer | Max pooling |
| Convolution type | Basic |
| **Parameters of MLP** | |
| Layer type | Fully connected |
| Output layer transfer function | Linear (purelin) |
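The table's settings can be sketched as a small PyTorch model. Only the batch size of 20, ReLU activations, max pooling, cross-entropy loss, Adam optimizer, and the fully connected output layer with a linear transfer function come from the table; the number of convolutional blocks, filter counts, kernel sizes, and the 64×64 input resolution are illustrative assumptions, not the paper's architecture.

```python
# Sketch of a CNN training setup matching the hyperparameter table.
# Layer depth, filter counts, and input size are assumptions; batch
# size, ReLU, max pooling, cross entropy, Adam, and the linear fully
# connected output come from the table.
import torch
import torch.nn as nn

class DGDNet(nn.Module):
    def __init__(self, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected output with linear ("purelin") transfer:
        # raw logits, no activation, as expected by CrossEntropyLoss.
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = DGDNet()
criterion = nn.CrossEntropyLoss()               # loss function from the table
optimizer = torch.optim.Adam(model.parameters())  # optimizer from the table

x = torch.randn(20, 3, 64, 64)  # one batch of 20 images (assumed 64x64 RGB)
logits = model(x)               # shape: (20, 8), one score per class
```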
| Site | H | D | UNH | NV | HC | DC | UNHC | HVC | Total |
|---|---|---|---|---|---|---|---|---|---|
| Bicentenario Park | 2677 | 7917 | 376 | 5270 | 5 | 162 | 247 | 146 | 16,800 |
| ESCOM-IPN | 5011 | 5241 | 204 | 10,829 | 14 | 76 | 134 | 91 | 21,600 |
| Bonito Ecatepec | 321 | 106 | 41 | 5350 | 2 | 18 | 74 | 88 | 6000 |
| Forested area | 2258 | 1429 | 259 | 1008 | 6 | 55 | 242 | 143 | 5400 |
| Site | H | D | UNH | NV | HC | DC | UNHC | HVC | Total |
|---|---|---|---|---|---|---|---|---|---|
| Bicentenario Park | 18,364.22 | 54,310.60 | 2579.36 | 36,152.20 | 34.30 | 1111.32 | 1879.64 | 1007.45 | 115,439.10 |
| ESCOM-IPN | 34,376.46 | 35,953.26 | 1399.44 | 74,287.94 | 96.04 | 522.36 | 919.24 | 624.26 | 148,179.00 |
| Bonito Ecatepec | 2202.06 | 727.16 | 281.26 | 36,701.00 | 13.72 | 123.48 | 507.64 | 603.68 | 41,160.00 |
| Forested area | 15,489.88 | 9803.00 | 1776.74 | 6914.88 | 41.16 | 377.30 | 1660.12 | 981.00 | 37,044.08 |
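Dividing most area entries by the corresponding image counts in the preceding table yields a constant factor of about 6.86 m² per classified image, so the area table can largely be reproduced from the counts. That factor is an inference from the two tables, not a figure quoted in this excerpt, and a few cells deviate slightly from it:

```python
# Per-class surface areas appear to be image counts scaled by a constant
# tile footprint of 6.86 m^2 (inferred by dividing the area table by the
# count table; not a figure stated in this excerpt).
TILE_AREA_M2 = 6.86

counts = {  # image counts for Bicentenario Park (first table)
    "H": 2677, "D": 7917, "UNH": 376, "NV": 5270,
    "HC": 5, "DC": 162, "UNHC": 247, "HVC": 146,
}

areas = {cls: round(n * TILE_AREA_M2, 2) for cls, n in counts.items()}
print(areas["H"])   # 18364.22, matching the area table
print(areas["HC"])  # 34.3, matching the area table's 34.30
```

Most cells reproduce the area table to within rounding; the Bicentenario Park UNHC and HVC entries do not, which may indicate a transcription inconsistency between the two tables.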
| Site | Biomass (%) | Contamination (%) |
|---|---|---|
| Bicentenario Park | 67.76 | 3.64 |
| ESCOM-IPN | 47.34 | 1.64 |
| Bonito Ecatepec | 9.37 | 16.73 |
| Forested area | 78.69 | 7.13 |
In the table below, the UAV, Terrain (Vehicle), and Satellite columns indicate the image source; Health through Disasters indicate the classification criterion or objective; NDVI marks approaches that use NDVI; and the last column gives the reported test accuracy.

| Authors | UAV | Terrain (Vehicle) | Satellite | Health | Contamination | Texture | Counting | Disasters | NDVI | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| Ours | X | - | - | X | X | - | - | - | - | 72% |
| [30] | - | - | X | - | - | X | - | - | - | 73.61% |
| [35] | - | X | - | - | - | X | - | - | - | 69.6% |
| [34] | X | - | - | X | - | - | - | - | - | - |
| [36] | X | - | - | - | - | - | X | - | - | - |
| [37] | X | - | - | - | - | - | - | X | X | - |
| [38] | - | X | - | - | - | X | - | - | - | 80% |
| [39] | - | X | - | - | - | - | - | X | - | - |
| [40] | - | - | X | - | - | - | - | X | - | - |
| [41] | - | X | X | X | - | - | - | - | X | - |
| [42] | - | - | - | - | - | X | - | - | - | 95.63% |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Moreno-Armendáriz, M.A.; Calvo, H.; Duchanoy, C.A.; López-Juárez, A.P.; Vargas-Monroy, I.A.; Suarez-Castañon, M.S. Deep Green Diagnostics: Urban Green Space Analysis Using Deep Learning and Drone Images. Sensors 2019, 19, 5287. https://doi.org/10.3390/s19235287