Deriving Agricultural Field Boundaries for Crop Management from Satellite Images Using Semantic Feature Pyramid Network
Abstract
1. Introduction
2. Study Areas and Available Datasets
3. Methodology
3.1. Agricultural Land (Parcels) Detection with a Fully Convolutional Network
3.1.1. Network Architecture: Semantic FPN (ResNet50 Backbone)
3.1.2. Deep Supervision and Loss Function
3.2. Delineation of Field Boundary and Inner Non-Planting Region
3.2.1. Boundary Delineation
3.2.2. Non-Planting Area Detection
3.3. Performance Evaluation
3.3.1. Semantic Segmentation Performance Metrics
3.3.2. Attained Field Boundaries Evaluation
4. Results
4.1. Experimental Set Up (Training Details)
4.2. Proposed Method Performance Comparison
4.2.1. Pixel Classification Evaluation
4.2.2. Attained Contour Verification on Different Sites
Application in Study Site 1
Application in Study Site 2
Application in Study Site 3
5. Conclusions
- ① Semantic convolutional neural network (CNN) models are highly effective for agricultural (planting) parcel extraction: the attained IoU (around 0.90) and F1 score (around 0.94) remain close across FPN, Link-Net, U-Net, and U-Net++, with or without the proposed post-processing procedure, whereas precision and recall differ considerably between models (a minimal sketch of these metric computations follows this list).
- ② Agricultural field boundaries could be delineated in study sites with very different planting modes (average field areas of 0.11 ha, 1.39 ha, and 2.24 ha). In addition, internal non-planting areas, such as electric poles and walking or water paths, can greatly affect the field boundary result (especially slender paths inside a field).
- ③ Applicable field boundary delineation is strongly affected by both the semantic model and the post-processing method. A sharp decrease in inapplicable and redundant field boundaries was observed in every study site after post-processing: in study site 1, the number of inapplicable boundaries from FPN dropped from 60 to 5 and the number of redundant boundaries from Link-Net shrank from 7359 to 244; in study site 2, the number of inapplicable boundaries from PSP-Net dropped from 25 to 2 and the number of redundant boundaries from Link-Net shrank from 435 to 58; and in study site 3, the number of inapplicable boundaries from PSP-Net dropped from 49 to 10 and the number of redundant boundaries from Link-Net shrank from 36 to 3.
- ④ The number of applicable boundaries and the total and average planting areas obtained with the proposed methodology (semantic FPN with post-processing) generally remain closest to the reference values across the three study sites compared with the other methods. Moreover, the numbers of inapplicable, redundant, and missed field boundaries remain the lowest, which helps avoid wasting management and planning time on machinery operations.
- ⑤ Besides the extraction model, the planting mode also strongly affects boundary extraction: small-scale, small-gap planting weakens field boundary delineation performance.
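The IoU, F1 score, precision, and recall values quoted in point ① and tabulated below follow the standard pixel-wise definitions used for semantic segmentation (Section 3.3.1). The sketch below is our own minimal illustration of those definitions with NumPy, not code from the paper; the function name and the toy masks are placeholders for demonstration.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Pixel-wise IoU, F1 score, precision, and recall for binary masks.

    `pred` and `truth` are arrays of identical shape in which 1 marks
    parcel (planting) pixels and 0 marks background.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)

    tp = np.logical_and(pred, truth).sum()    # true positives
    fp = np.logical_and(pred, ~truth).sum()   # false positives
    fn = np.logical_and(~pred, truth).sum()   # false negatives

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0

    return {"IoU": iou, "F1": f1, "Precision": precision, "Recall": recall}

# Toy example: one false-positive pixel out of three predicted parcel pixels.
pred = np.array([[1, 1], [0, 1]])
truth = np.array([[1, 0], [0, 1]])
print(segmentation_metrics(pred, truth))  # IoU 0.667, F1 0.8, Precision 0.667, Recall 1.0
```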
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Method | IoU | F1 Score | Precision | Recall |
---|---|---|---|---|
FPN | 0.8949 | 0.9339 | 0.9648 | 0.9274 |
Link-Net | 0.9011 | 0.9419 | 0.9183 | 0.9813 |
PSP-Net | 0.8637 | 0.9149 | 0.9543 | 0.9032 |
U-Net | 0.9099 | 0.9472 | 0.9234 | 0.9852 |
U-Net++ | 0.9043 | 0.9433 | 0.9288 | 0.9735 |
Method | IoU | F1 Score | Precision | Recall |
---|---|---|---|---|
FPN | 0.8933 | 0.9329 | 0.9652 | 0.9255 |
Link-Net | 0.8978 | 0.9399 | 0.9203 | 0.9757 |
PSP-Net | 0.8621 | 0.9139 | 0.9549 | 0.9008 |
U-Net | 0.9094 | 0.9470 | 0.9243 | 0.9837 |
U-Net++ | 0.9028 | 0.9423 | 0.9296 | 0.9711 |
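The per-site tables that follow count applicable, inapplicable, redundant, and missed boundary contours before and after post-processing (Section 4.2.2). As a rough illustration of how such contours can be obtained from a predicted parcel mask, the sketch below uses OpenCV's border-following contour extraction to trace outer field boundaries and inner non-planting holes (e.g., poles or paths, Section 3.2), then discards fragments below a minimum field size. The ground sampling distance, the area threshold, and the function name are assumptions for demonstration, not the authors' exact post-processing procedure.

```python
import cv2
import numpy as np

# Assumed ground sampling distance of the imagery (metres per pixel) and an
# assumed minimum field size; the paper's actual values may differ.
GSD_M = 2.0
MIN_FIELD_AREA_HA = 0.05

def extract_field_boundaries(parcel_mask: np.ndarray):
    """Illustrative post-processing sketch: trace outer field boundaries and
    inner non-planting holes from a binary parcel mask, then drop contours
    smaller than a minimum field size.

    `parcel_mask` is a uint8 array where 255 marks predicted parcel pixels.
    Returns (field_contours, hole_contours, field_areas_ha).
    """
    # RETR_CCOMP yields a two-level hierarchy: outer boundaries and the holes
    # inside them (OpenCV 4.x returns (contours, hierarchy)).
    contours, hierarchy = cv2.findContours(
        parcel_mask, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE
    )
    if hierarchy is None:
        return [], [], []

    pixel_area_ha = (GSD_M ** 2) / 10_000.0  # m^2 per pixel -> hectares
    fields, holes, areas_ha = [], [], []

    for cnt, info in zip(contours, hierarchy[0]):
        area_ha = cv2.contourArea(cnt) * pixel_area_ha
        if info[3] == -1:                      # no parent -> outer field boundary
            if area_ha >= MIN_FIELD_AREA_HA:   # keep applicable fields only
                fields.append(cnt)
                areas_ha.append(area_ha)
        else:                                  # has a parent -> inner non-planting hole
            holes.append(cnt)

    return fields, holes, areas_ha

# Usage sketch: binarize a model's probability map, then trace and filter contours.
prob = np.random.rand(512, 512).astype(np.float32)   # stand-in for a model output
mask = (prob > 0.5).astype(np.uint8) * 255
fields, holes, areas = extract_field_boundaries(mask)
print(len(fields), "fields,", len(holes), "inner holes,", round(sum(areas), 2), "ha total")
```

In practice the area threshold would be tuned to the planting mode of each site, which point ⑤ above identifies as the main factor limiting delineation in small-scale, small-gap plantings.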
Field boundary delineation results for study site 1; rows marked ① use the proposed post-processing.

Method | Applicable | Inapplicable | Redundant | Missed | Reference | Attained Total Area (ha) | Reference Total Area (ha) | Attained Average Area (ha) | Reference Average Area (ha)
---|---|---|---|---|---|---|---|---|---
FPN | 276 | 60 | 293 | 0 | 317 | 447.83 | 440.02 | 0.71 | 1.39
Link-Net | 237 | 35 | 7359 | 0 | 317 | 517.02 | 440.02 | 0.07 | 1.39
PSP-Net | 316 | 21 | 872 | 0 | 317 | 462.30 | 440.02 | 0.38 | 1.39
U-Net | 289 | 16 | 1118 | 0 | 317 | 495.95 | 440.02 | 0.35 | 1.39
U-Net++ | 264 | 32 | 1533 | 0 | 317 | 498.34 | 440.02 | 0.27 | 1.39
FPN ① | 326 | 5 | 50 | 1 | 317 | 459.29 | 440.02 | 1.21 | 1.39
Link-Net ① | 245 | 17 | 244 | 0 | 317 | 518.66 | 440.02 | 1.03 | 1.39
PSP-Net ① | 297 | 5 | 251 | 1 | 317 | 471.68 | 440.02 | 0.85 | 1.39
U-Net ① | 277 | 7 | 144 | 0 | 317 | 507.53 | 440.02 | 1.19 | 1.39
U-Net++ ① | 264 | 22 | 142 | 0 | 317 | 508.18 | 440.02 | 1.19 | 1.39
Field boundary delineation results for study site 2; rows marked ① use the proposed post-processing.

Method | Applicable | Inapplicable | Redundant | Missed | Reference | Attained Total Area (ha) | Reference Total Area (ha) | Attained Average Area (ha) | Reference Average Area (ha)
---|---|---|---|---|---|---|---|---|---
FPN | 200 | 15 | 77 | 0 | 214 | 468.17 | 480.18 | 1.60 | 2.24
Link-Net | 195 | 10 | 435 | 0 | 214 | 504.21 | 480.18 | 0.79 | 2.24
PSP-Net | 195 | 25 | 104 | 0 | 214 | 460.22 | 480.18 | 1.42 | 2.24
U-Net | 198 | 8 | 206 | 0 | 214 | 505.63 | 480.18 | 1.23 | 2.24
U-Net++ | 202 | 7 | 232 | 0 | 214 | 495.10 | 480.18 | 1.12 | 2.24
FPN ① | 224 | 1 | 11 | 1 | 214 | 484.95 | 480.18 | 2.05 | 2.24
Link-Net ① | 202 | 5 | 58 | 0 | 214 | 521.28 | 480.18 | 1.97 | 2.24
PSP-Net ① | 215 | 2 | 32 | 1 | 214 | 481.22 | 480.18 | 1.93 | 2.24
U-Net ① | 203 | 4 | 54 | 0 | 214 | 523.81 | 480.18 | 2.01 | 2.24
U-Net++ ① | 206 | 3 | 51 | 0 | 214 | 512.98 | 480.18 | 1.97 | 2.24
Field boundary delineation results for study site 3; rows marked ① use the proposed post-processing.

Method | Applicable | Inapplicable | Redundant | Missed | Reference | Attained Total Area (ha) | Reference Total Area (ha) | Attained Average Area (ha) | Reference Average Area (ha)
---|---|---|---|---|---|---|---|---|---
FPN | 221 | 17 | 8 | 1 | 272 | 27.44 | 29.94 | 0.11 | 0.11
Link-Net | 18 | 23 | 36 | 0 | 272 | 34.22 | 29.94 | 0.44 | 0.11
PSP-Net | 113 | 49 | 1 | 3 | 272 | 23.71 | 29.94 | 0.15 | 0.11
U-Net | 23 | 31 | 0 | 5 | 272 | 33.76 | 29.94 | 0.63 | 0.11
U-Net++ | 2 | 14 | 26 | 1 | 272 | 35.82 | 29.94 | 0.85 | 0.11
FPN ① | 220 | 3 | 0 | 5 | 272 | 30.53 | 29.94 | 0.14 | 0.11
Link-Net ① | 27 | 2 | 3 | 1 | 272 | 37.03 | 29.94 | 1.16 | 0.11
PSP-Net ① | 206 | 10 | 1 | 8 | 272 | 25.80 | 29.94 | 0.12 | 0.11
U-Net ① | 28 | 3 | 1 | 1 | 272 | 36.34 | 29.94 | 1.14 | 0.11
U-Net++ ① | 16 | 4 | 4 | 2 | 272 | 37.03 | 29.94 | 1.54 | 0.11