Article
Peer-Review Record

Remote Sensing Identification and Rapid Yield Estimation of Pitaya Plants in Different Karst Mountainous Complex Habitats

by Zhongfa Zhou 1,2, Ruiwen Peng 1,2,*, Ruoshuang Li 3, Yiqiu Li 1,2, Denghong Huang 1,2 and Meng Zhu 1,2
Reviewer 1:
Reviewer 2:
Reviewer 3:
Reviewer 4: Anonymous
Reviewer 5:
Agriculture 2023, 13(9), 1742; https://doi.org/10.3390/agriculture13091742
Submission received: 5 July 2023 / Revised: 29 August 2023 / Accepted: 30 August 2023 / Published: 1 September 2023

Round 1

Reviewer 1 Report

In the introduction and in other instances, the authors state "can" when referring to their proposed methods; it would be best to tone down the firmness of these statements. Although the problem is elaborated in the text, it only becomes clear if the audience reads through the whole text. This may be attributed to the English. The problem statements may be further improved.


Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice on the following pages.

1) In the introduction and in other instances, the authors state "can" when referring to their proposed methods; it would be best to tone down the firmness of these statements. Although the problem is elaborated in the text, it only becomes clear if the audience reads through the whole text. This may be attributed to the English. The problem statements may be further improved.

The statements in the manuscript have been revised according to your suggestion.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Author Response File: Author Response.pdf

Reviewer 2 Report

The manuscript entitled "Remote Sensing Identification and Rapid Yield Estimation of Pitaya plants in Karst Mountainous Complex Habitat" presents an interesting study on the application of UAV remote sensing in agriculture.

The manuscript contains some drawbacks:

1) The word "plants" should start with a capital letter in the title. Please follow the instructions for authors. The manuscript was probably not prepared using the template of the journal, because there are no line numbers.

2) The email address is wrong, because it is missing the "@" (19030090027.gzun.edu.cn).

3) The aim of the study, as described in the Introduction, is not clear. Please be more specific.

4) The flight height was at an altitude of 60 meters. Was this the distance from the starting point or over the total area? The terrain is quite variable, so how was the same distance maintained? Was a digital terrain model prepared before the flight?

5) What was the pixel size?

6) It would be good if an example UAV image showing individual plants and fruits were included in the manuscript.

7) What units of R, G and B were used for the calculation of CCVI according to Equation (1)? Were they normalized values or DN (digital numbers), and how many bits?

8) In Section 2.2.1 there is the statement "Four batches of sample plant data were collected". How many plants were evaluated? Please provide more details. Is it not possible to evaluate the number of fruits using UAV images?

9) The title of the manuscript announces a study on "Rapid Yield Estimation", while the method is quite complicated. One time-consuming step is the manual evaluation of the number of fruits and their size. It would be good if you optimized this step and evaluated the minimum sample size (number of plants). The accuracy of the yield estimation depends on this step, and more attention should be paid to the evaluation of the mean yield per plant.

10) In the Introduction there is a lot of information about vegetation indices based on multispectral images, but in this study only RGB images were used. Please be consistent and adjust the Introduction to the methods which were applied.

11) Please provide the specification of the UAV sensors. Were only RGB cameras used, or were multispectral sensors used too?

12) It would be good if in the Conclusions you recommended one method which not only has high accuracy but is also simple to perform.

 

 

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice one by one on the following pages.

1) The word "plants" should start with a capital letter in the title. Please follow the instructions for authors. The manuscript was probably not prepared using the template of the journal, because there are no line numbers.

We changed "plants" to "Plants" as suggested. Line numbers were added and the article was rearranged according to the instructions for authors.

2) The email address is wrong, because it is missing the "@" (19030090027.gzun.edu.cn).

We corrected the email address.

3) The aim of the study, as described in the Introduction, is not clear. Please be more specific.

A description of the importance of rapid yield estimation of pitaya fruit was added to the Introduction to highlight the purpose of the article.

4) The flight height was at an altitude of 60 meters. Was this the distance from the starting point or over the total area? The terrain is quite variable, so how was the same distance maintained? Was a digital terrain model prepared before the flight?

The flight altitude is the distance from the starting point. Areas with large terrain changes were segmented separately, and the canopy height model was used for the experiments. A digital terrain model (1 m), obtained from the Guizhou Bureau of Surveying and Mapping, was prepared before the flight.

5) What was the pixel size?

The pixel size is 1.48 × 1.48 cm; this has been added to the article (Section 2.2.1, Materials).

6) It would be good if an example UAV image showing individual plants and fruits were included in the manuscript.

We added images of individual plants and fruits taken by the UAV to the revised Figure 2, as suggested.

7) What units of R, G and B were used for the calculation of CCVI according to Equation (1)? Were they normalized values or DN (digital numbers), and how many bits?

They were DN (digital number) values, 8-bit, ranging from 0 to 255.

8) In Section 2.2.1 there is the statement "Four batches of sample plant data were collected". How many plants were evaluated? Please provide more details. Is it not possible to evaluate the number of fruits using UAV images?

In each of the four study areas, the monthly yield of 200 sample plants was used to build the yield estimation model, and the total regional output was collected to validate the estimated yields (this information has been added to the article). Pitaya plants fruit continuously and some fruits may be hidden by the leaves, so the number of fruits cannot be judged directly from the drone images. Combining UAV-based plant surveys with manual fruit surveys to construct an empirical model enables rapid yield estimation.
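To illustrate the workflow described in this response, here is a minimal sketch of how such an empirical estimate can be assembled, assuming a simple mean-yield-per-plant model; the variable names and numeric values are hypothetical placeholders and do not reproduce the paper's actual regression.

```python
import numpy as np

# Hypothetical monthly yields (kg per plant) from the manual survey of the
# 200 sample plants in one study area; placeholder values only.
sample_yields_kg = np.array([2.1, 1.8, 2.4, 2.0, 1.9])

# Empirical mean yield per plant derived from the sample survey.
mean_yield_per_plant = sample_yields_kg.mean()

# Plant count extracted from the UAV imagery for the whole study area,
# e.g. via CCVI thresholding, CHM segmentation or U-Net detection.
plant_count = 12_500  # placeholder value

# Rapid regional estimate: detected plants x mean yield per plant,
# which would then be validated against the collected regional output.
estimated_regional_yield_kg = plant_count * mean_yield_per_plant
print(f"Estimated regional yield: {estimated_regional_yield_kg:.0f} kg")
```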

9) The title of the manuscript announces a study on "Rapid Yield Estimation", while the method is quite complicated. One time-consuming step is the manual evaluation of the number of fruits and their size. It would be good if you optimized this step and evaluated the minimum sample size (number of plants). The accuracy of the yield estimation depends on this step, and more attention should be paid to the evaluation of the mean yield per plant.

Manual assessment of the number of fruits and their size is an essential step in building the rapid yield estimation model for pitaya. However, once the model has been constructed, it can be combined with the classification and extraction results from the drone images for rapid estimation; subsequent estimates no longer require manual assessment of fruit number and size and can rely on the empirical model instead.

10) In the Introduction there is a lot of information about vegetation indices based on multispectral images, but in this study only RGB images were used. Please be consistent and adjust the Introduction to the methods which were applied.

An RGB image is a type of multispectral image. Compared with other sensors, an RGB sensor is lighter and more convenient, and achieves high accuracy in the extraction of crops and vegetation. We cite several references in the paper to support this point [15,18,20-21].

11) Please provide the specification of the UAV sensors. Were only RGB cameras used, or were multispectral sensors used too?

The UAV is equipped with an RGB camera sensor and a point cloud generation system. We added a table describing the UAV specifications (Table 1).

12) It would be good if in the Conclusions you recommended one method which not only has high accuracy but is also simple to perform.

After the yield estimation model has been constructed, we only need to classify the planting scene and extract the number of pitaya plants with the method appropriate to each scene type, so that the yield can be estimated quickly and accurately. In future studies, we will focus on improving the efficiency of scene classification and pitaya plant extraction in order to simplify the rapid yield estimation method.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Author Response File: Author Response.pdf

Reviewer 3 Report

In the Discussion, the use of CCVI in this research is not yet well validated, because it looks like simple pattern recognition rather than time-series vegetation index usage. In addition, when using a UAV, orthorectification with a DSM/DEM of appropriate spatial resolution is also key, but there is no good explanation of that. The sun elevation angle is also a very important factor, but it is not mentioned. Regarding the yield model, the model verification with this UAV observation data is not good enough.

no comment

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice one by one on the following pages.

1) In the Discussion, the use of CCVI in this research is not yet well validated, because it looks like simple pattern recognition rather than time-series vegetation index usage.

We improved the presentation of the Introduction. Previous studies of CCVI have shown good performance and accuracy [30], so it can be used in this study.

2) In addition, when using a UAV, orthorectification with a DSM/DEM of appropriate spatial resolution is also key, but there is no good explanation of that. The sun elevation angle is also a very important factor, but it is not mentioned.

Orthorectification is performed automatically by the image processing software, and we have added this to the experimental workflow (lines 216-218). Because the study areas are all karst plateau areas and the weather was cloudy during image acquisition, the imagery is little affected by the sun elevation angle.

3) Regarding the yield model, the model verification with this UAV observation data is not good enough.

We have improved this by refining the yield estimation process.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Author Response File: Author Response.pdf

Reviewer 4 Report

The paper focuses on utilizing UAV (Unmanned Aerial Vehicle) remote sensing technology for the identification and yield estimation of Pitaya plants in complex karst mountainous areas. By applying various identification methods and remote sensing software, the study claims to accurately identify individual Pitaya plants and constructs a yield estimation model with high accuracy across different complex scenes. 

 

I found the paper difficult to follow in general. I wonder whether a flow chart would help the reader understand the different components of the methodology.

 

I found the results section to be long and difficult to follow.  I wonder if the paper would benefit from headline results in the main document, and the rest of the results in supplementary materials?

 

The English writing is mostly understandable throughout the document, but many of the sentences are poorly expressed or awkwardly written and need to be revised.

 

Figure 1 - Some of the maps only have a single axis tick mark which makes it difficult to gauge the size and extent of the area.  Also, please label subplots.  It seems like you labeled (a) and (b), but not the rest.  The caption should include a description of the subplots.

 

2.1. Study Area - In the coordinates near the start of this section, there is a comma after 105°36'30′′-105°46'30′′E - I'm wondering if there is something missing from here?

 

How many images did you use to train your U-Net Deep Learning model?

 

Figure 5 is barely legible it's so small.

 

Figure 6 has subplots (a) and (b), but no (c) and there are 3 plots. The figure caption should indicate what each subplot shows.

 

Figure 7 - Please label the images and add a caption that explains what is in the subplots.

 

Equation 6 - What does the asterisk represent here? Please explain. Looking at Equation 7, it seems to me that this is intended to be multiplication. If this is the case, then please replace it with a multiplication sign in both equations.

 

Equation 2 - Calculating the canopy height in this way seems like it might be prone to measurement errors.  You mention that you measure the plant height, but I don't see any discussion of how accurate this equation is at determining the canopy height.  I think this is an important detail.

 

Figure 8 - The subplots need to be labeled.  The caption should include information about the subplots. It's not immediately clear to me what the difference is between the 3 different types of images.

 

Figure 9 - The subplots need to be labeled.  The caption should include information about the subplots.

 

Figure 10 - The caption should contain information about what is in the subplots.

 

Figure 12 - The caption should contain information about what is in the subplots.

 

Section 2.2.2 - "...and the steps were mainly screening valid aerials, high-precision processing, air-triple encryption and data generation."  What exactly does this mean - I think it requires further explanation.

 

page 5, just before Figure 2.  "... data was validity" is one of many mistakes in the text.  The paper needs to be checked to ensure the English makes sense and to remove careless mistakes, of which there are many.  Unfortunately I could not feedback all of them due to the lack of line numbers in the submitted manuscript.

 

U-net or U-Net, pitaya or Pitaya - please be consistent throughout the document.

 

The bibliography should follow the MDPI style for every reference.

 

 

Requires revision to remove awkwardly constructed sentences and some phrasing that just doesn't make sense.

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice one by one on the following pages.

1)I found the paper difficult to follow in general. I wonder whether a flow chart would help the reader understand the different components of the methodology.

A flow chart was drawn and added to the article (Figure 6) to help readers understand the relationships between the different components of the methodology.

2)  I found the results section to be long and difficult to follow. I wonder if the paper would benefit from headline results in the main document, and the rest of the results in supplementary materials?

We have simplified the results section.

3) The English writing is mostly understandable throughout the document, but many of the sentences are poor expressed or awkwardly written and need to be revised.

We have revised the whole manuscript to improve the language.

4) Figure 1 - Some of the maps only have a single axis tick mark which makes it difficult to gauge the size and extent of the area.  Also, please label subplots.  It seems like you labeled (a) and (b), but not the rest.  The caption should include a description of the subplots.

We replotted Figure 1, added subplot labels, and added descriptions of the subplots to the caption.

5) 2.1. Study Area - In the coordinates near the start of this section, there is a comma after 105°36'30′′-105°46'30′′E - I'm wondering if there is something missing from here?

We adjusted the content and removed the redundant symbols.

6) How many images did you use to train your U-Net Deep Learning model?

For training, we divided the regional sample set into 224 × 224 sample slices and selected 100 slices containing pitaya plants to construct the training sample data (lines 422-428).
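As an illustration of the slicing step described above, the following minimal sketch cuts an orthomosaic and its label mask into 224 × 224 slices and keeps only those containing pitaya pixels; the function and its selection rule are hypothetical stand-ins, not the authors' actual preprocessing code.

```python
import numpy as np

def make_training_slices(image: np.ndarray, mask: np.ndarray, size: int = 224):
    """Cut an orthomosaic and its pitaya label mask into size x size slices.

    Hypothetical helper illustrating the slicing described in the response;
    only slices that contain labeled pitaya pixels are kept, mirroring the
    selection of the 100 plant-containing training slices.
    """
    slices = []
    height, width = image.shape[:2]
    for y in range(0, height - size + 1, size):
        for x in range(0, width - size + 1, size):
            img_tile = image[y:y + size, x:x + size]
            mask_tile = mask[y:y + size, x:x + size]
            if mask_tile.any():  # keep only slices with pitaya plants
                slices.append((img_tile, mask_tile))
    return slices
```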

7)  Figure 5 is barely legible it's so small.

We revised Figure 6 (original Figure 5) to make it clearer.

8) Figure 6 has subplots (a) and (b), but no (c) and there are 3 plots. The figure caption should indicate what each subplot shows.

We added (c) to Figure 7 (original Figure 6) and described each subplot in the caption.

9)Figure 7 - Please label the images and add a caption that explains what is in the subplots.

We added labels and descriptions for the subplots in Figure 8 (original Figure 7).

10) Equation 6 - What does the asterisk represent here? Please explain. Looking at Equation 7, it seems to me that this is intended to be multiplication. If this is the case, then please replace it with a multiplication sign in both equations.

We replaced the notation with multiplication signs in both equations.

11) Equation 2 - Calculating the canopy height in this way seems like it might be prone to measurement errors.  You mention that you measure the plant height, but I don't see any discussion of how accurate this equation is at determining the canopy height.  I think this is an important detail.

In references [31-32], previous investigators demonstrated the accuracy of this equation in determining canopy height. We cite these articles in the section on method selection and application.
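For readers of this exchange, the canopy height computation discussed here (Equation 2 of the manuscript) is, under the standard convention used in studies such as [31-32], the per-pixel difference between the digital surface model and the digital terrain model; this restatement is an assumption based on that convention, not a quotation of the manuscript's equation.

```latex
\mathrm{CHM}(x, y) = \mathrm{DSM}(x, y) - \mathrm{DTM}(x, y)
```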

12) Figure 8 - The subplots need to be labeled.  The caption should include information about the subplots. It's not immediately clear to me what the difference is between the 3 different types of images.

In Figure 9 (original Figure 8), the first column shows the original image of the example pitaya plants, the second column shows the corresponding digital surface model image, and the third column shows the corresponding UAV point cloud image. We added this correspondence to the caption.

13) Figure 9 - The subplots need to be labeled.  The caption should include information about the subplots.

Figure 10 (original Figure 9) was revised to label the subfigures.

14) Figure 10 - The caption should contain information about what is in the subplots.

The caption of Figure 11 (original Figure 10) was modified and now includes information about the subplots.

15) Figure 12 - The caption should contain information about what is in the subplots.

The caption of Figure 12 (original Figure 11) was modified and now includes information about the subplots.

16) Section 2.2.2 - "...and the steps were mainly screening valid aerials, high-precision processing, air-triple encryption and data generation."  What exactly does this mean - I think it requires further explanation.

We rewrote that part of the text. The main steps are: 1) image screening: eliminating blurry or redundant aerial images; 2) high-precision mosaicking: stitching the images according to automatically generated control points; 3) aerial triangulation densification: using the flight point-cloud control points to derive, in post-processing, the elevation and planimetric position of the densified points and to generate the digital surface model accordingly. This content has been added to the article (lines 215-225).

17) page 5, just before Figure 2.  "... data was validity" is one of many mistakes in the text.  The paper needs to be checked to ensure the English makes sense and to remove careless mistakes, of which there are many.  Unfortunately, I could not feedback all of them due to the lack of line numbers in the submitted manuscript.

We have revised the whole manuscript to remove careless mistakes. 

18) U-net or U-Net, pitaya or Pitaya - please be consistent throughout the document.

We modified the content to keep it consistent. 

19) The bibliography should follow the MDPI style for every reference.

We reformatted the references to follow the MDPI style.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Author Response File: Author Response.pdf

Reviewer 5 Report

This paper proposes using UAV remote sensing images to identify and rapidly estimate the characteristic crops in the complex karst habitat.

The scope of this paper is the use of UAV remote sensing images for single pitaya plant identification and rapid yield estimation. I think that this scope is too narrow, since only the pitaya plant is considered in this paper.

The computational load of your method needs to be analyzed in detail. Is it possible to run your method in real time?

In Section 3, ENVI software is used. Can you provide references for this software?

There are many papers which use UAVs for crop estimation [16-24]. The contributions of this paper compared to these papers are not clear to me.

ok

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice one by one on the following pages.

1) The scope of this paper is the use of UAV remote sensing images for single pitaya plant identification and rapid yield estimation. I think that this scope is too narrow, since only the pitaya plant is considered in this paper.

This paper aims to develop a rapid yield estimation method for pitaya that suits many planting scenarios. The pitaya industry is one of the characteristic fruit industries of Guizhou, an advantageous mountain industry of the province and a key industry for consolidating the achievements of poverty alleviation; it plays an important role in ecological civilization construction, agricultural supply-side structural reform, the growth of farmers' income, and comprehensive industrial development. In fact, this yield estimation method can also serve as a procedural reference for other crops grown in the same area, not just for pitaya itself.

2) The computational load of your method needs to be analyzed in detail. Is it possible to run your method in real time?

Manual assessment of the number of fruits and their size is an essential step in building the rapid estimation model for pitaya. However, once the model has been constructed, it can be combined with the classification and extraction results from the drone images for rapid estimation; subsequent estimates no longer require manual assessment of fruit number and size and can rely on the empirical model instead.

3) In Section 3, ENVI software is used. Can you provide references for this software?

ENVI (The Environment for Visualizing Images) is the flagship product of Exelis Visual Information Solutions. It is powerful remote sensing image processing software developed by remote sensing scientists using the Interactive Data Language (IDL) and has been widely used in studies both in China and abroad [7,14,29-30]. In this study, ENVI was mainly used for image enhancement, correction, orthorectification, mosaicking, data fusion, information extraction, and image classification. References for this software have been added to the article.

4) There are many papers which use UAVs for crop estimation [16-24]. The contributions of this paper compared to these papers are not clear to me.

From the above studies, we find that there are many studies on UAV image recognition but few on the further application of the recognition results; most address the identification and estimation of crops in a single scene, and few consider multiple scenes. This study fills these gaps: it identifies pitaya plants in multiple scenarios and further applies the identification results to yield estimation.

 

All modifications are highlighted in the revised article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

The manuscript was improved according to all my comments.

In Table 1, the specification of the UAV is presented, for example the maximum speed. I would rather expect the flight speed during the study, not the maximum. Please present flight parameters and camera parameters rather than such a general specification of the UAV.

The references are still not formatted according to the guidelines for authors. For example, in many places there is "et al.", but all the authors should be listed.

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice on the following pages.

1) In Table 1, the specification of the UAV is presented, for example the maximum speed. I would rather expect the flight speed during the study, not the maximum. Please present flight parameters and camera parameters rather than such a general specification of the UAV.

We did not change the standard sensor specification, but we added some flight parameters of the experiments to Table 1, such as the experimental flight height, the image pixel size obtained in the experiment, and the flight plan completion speed.

 

2) The references are still not formatted according to the guidelines for authors. For example, in many places there is "et al.", but all the authors should be listed.

We revised the references according to your suggestion and the "agriculture-template".

 

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Reviewer 3 Report

It is good to have a detailed explanation, but I still have a difficult time understanding how the CCVI information is used in your proposed research. Your revised paper still only mentions single-shot data recognition. I strongly believe the value of CCVI lies in time-series analysis. If you only need one day of data, why do you have to use CCVI? For this kind of pattern recognition, I believe that RGB data is enough.

 

In addition, I could not understand the relationship between Figure 5 and Figure 7.

 

You never answered my point about the incident angle effect in my previous comments.

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice on the following pages.

1) It is good to have a detailed explanation, but I still have a difficult time understanding how the CCVI information is used in your proposed research. Your revised paper still only mentions single-shot data recognition. I strongly believe the value of CCVI lies in time-series analysis. If you only need one day of data, why do you have to use CCVI? For this kind of pattern recognition, I believe that RGB data is enough.

CCVI is a color index calculated from the band information of RGB images, and it aims to eliminate the influence of weeds on pitaya plant extraction in the images. It is used in complex scenes with similar colors: because weeds and pitaya leaves have similar colors, we need to calculate CCVI and threshold the result to obtain the correct number of pitaya plants.
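A minimal sketch of the thresholding workflow described here, assuming 8-bit RGB digital numbers as stated earlier in this record; the stand-in index below is an excess-green-style placeholder because the actual CCVI formula (Equation 1 of the manuscript) is not reproduced in this response.

```python
import numpy as np

def placeholder_color_index(rgb: np.ndarray) -> np.ndarray:
    """Stand-in color index computed from 8-bit RGB digital numbers (0-255).

    This is NOT the CCVI of Equation (1); it is an excess-green-like
    placeholder used only to illustrate the thresholding workflow.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return 2.0 * g - r - b

def pitaya_mask(rgb: np.ndarray, threshold: float) -> np.ndarray:
    """Separate pitaya canopy from similarly colored weeds by thresholding."""
    return placeholder_color_index(rgb) > threshold
```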

 

2) In addition, I could not understand the relationship between Figure 5 and Figure 7.

Figure 4 (original Figure 5) is a complete image of a study area. A complete study area contains different complex scenes, which are segmented separately, and different methods are used to identify the pitaya in each. Figure 7 shows the results of the CCVI calculation in the similar-color scenes among the segmented complex scenes.

 

3) You never answered my point about the incident angle effect in my previous comments.

All images were taken between 1 and 3 p.m. under cloudy conditions, so illumination changes caused by the sun's incident angle had little impact on the UAV imaging.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Reviewer 4 Report

Regarding comment 1) The word "modle" appears three times in the flowchart - I presume this is meant to say "model"?  It's still not completely clear how the three models, CCVI, CHM and U-Net are combined or used to count the number of Pitaya plants given the flowchart.

On line 248, Section 2.3.1, there is a stray "]".  The text here should reference the flow chart.

Figure 1 appears after Figure 5 in the document.  Is this meant to be Figure 6?  And the rest of the figures need to be renumbered?

I still don't find the paper straightforward to follow unfortunately.

There are two figures with the title Figure 8.

In my first review, I raised the issue that in Figure 1 some of the maps only have a single axis tick mark, but this wasn't addressed in the revision.  Also, I can only see a very short, 6 word, figure description in the figure caption.  Each of the subplots should be referenced and described.

I'm not sure what is happening with the supplementary information file.  This looks like the main document?

Figure 6 although revised, at 100%, I still can't read all of the text.

Figure 8 is still confusing.  Subplots are labeled as a-1, a-2, a-3, ..., c-3.  In the text, eg. l378-379 and l380 the text uses "Figure b-3" and "Figure c-3" to reference what I presume are subplots of Figure 8, but this isn't clear, since there is no mention of Figure 8.  Why not just label a-i and reference as Figure 8a-8i?

Figure 9 - The colourscale and mapscale are unreadable at 100%.

Regarding comment 18), I can still see a mixture of pitaya/Pitaya being used throughout the document.

Regarding comment 19), the bibliography still doesn't follow the MDPI style.  For example, journal names should use the abbreviated form of the journal title.

 

I'm still finding the paper littered with basic grammatical and spelling errors.  Writing is still awkwardly expressed throughout the document. eg. l.369 "...was used to identificated of Pitaya plants...".  This must be addressed before publication.

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript. We are very grateful for your valuable opinion and thank you heartily.

The problems you have pointed out are weak spots of the manuscript, and we have corrected them according to your suggestions. We list our detailed responses to your advice on the following pages.

1) The word "modle" appears three times in the flowchart - I presume this is meant to say "model"? It's still not completely clear how the three models, CCVI, CHM and U-Net are combined or used to count the number of Pitaya plants given the flowchart.

We changed "modle" in the flowchart to "model". Each complete study area contains different complex scenes, which are segmented separately into three types: complex scenes with similar colors, scenes with complex terrain variation, and complex scenes where multiple crop species coexist. In complex scenes with similar colors, the main problem is the influence of weeds whose color is similar to pitaya, so CCVI is calculated to separate weeds from pitaya, and the number of pitaya plants is identified from the separation result. In scenes with complex terrain variation, the influence of terrain must be eliminated, so the CHM model is used to identify the number of pitaya plants from the canopy height. Complex scenes where multiple crop species coexist contain different crop types and are more complicated, so we build pitaya sample slices and use the U-Net model to identify the number of pitaya plants.
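To make the three-way division above concrete, here is a hedged sketch of how a segmented scene could be routed to the appropriate counting method; the scene types follow the response, but count_by_ccvi, count_by_chm and count_by_unet are hypothetical stubs, not the authors' implementations.

```python
from enum import Enum, auto

class SceneType(Enum):
    SIMILAR_COLOR = auto()      # weeds with colors similar to pitaya
    TERRAIN_VARIATION = auto()  # large elevation changes
    MIXED_CROPS = auto()        # multiple crop species coexisting

def count_by_ccvi(scene) -> int:
    # Placeholder: threshold the CCVI result and count canopy regions.
    return 0

def count_by_chm(scene) -> int:
    # Placeholder: segment the canopy height model (DSM minus DTM) into plants.
    return 0

def count_by_unet(scene) -> int:
    # Placeholder: run U-Net on 224 x 224 slices and count detected plants.
    return 0

def count_pitaya(scene, scene_type: SceneType) -> int:
    """Route a segmented scene to the counting method described above."""
    if scene_type is SceneType.SIMILAR_COLOR:
        return count_by_ccvi(scene)
    if scene_type is SceneType.TERRAIN_VARIATION:
        return count_by_chm(scene)
    return count_by_unet(scene)
```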

2)On line 248, Section 2.3.1, there is a stray "]".  The text here should reference the flow chart.

We fixed the issue you mentioned and modified the text to be consistent with the flow chart.

3)Figure 1 appears after Figure 5 in the document.  Is this meant to be Figure 6?  And the rest of the figures need to be renumbered?

We have renumbered the figure numbers.

4)There are two figures with the title Figure 8.

We have fixed the problem according to your suggestion.

5)In my first review, I raised the issue that in Figure 1 some of the maps only have a single axis tick mark, but this wasn't addressed in the revision.  Also, I can only see a very short, 6 word, figure description in the figure caption.  Each of the subplots should be referenced and described.

We have added axis tick marks in Figure 1(b), and each of the subplots is now referenced and described (lines 386-394, for example).

6)I'm not sure what is happening with the supplementary information file.  This looks like the main document?

We have uploaded two versions to help you understand the changes we have made. The main file is the marked revised manuscript, and the supplementary information is the revised version without marking.

7)Figure 6 although revised, at 100%, I still can't read all of the text.

We have changed Figure 6 so that it is clearer at 100%.

8)Figure 8 is still confusing.  Subplots are labeled as a-1, a-2, a-3, ..., c-3.  In the text, eg. l378-379 and l380 the text uses "Figure b-3" and "Figure c-3" to reference what I presume are subplots of Figure 8, but this isn't clear, since there is no mention of Figure 8.  Why not just label a-i and reference as Figure 8a-8i?

We have modified it according to your suggestion: the subplots are labeled a-i and referenced as Figure 9a-9i.

9)Figure 9 - The colourscale and mapscale are unreadable at 100%.

We changed Figure 9 so that it is clearer at 100%.

10) I can still see a mixture of pitaya/Pitaya being used throughout the document.

We unified the usage as "Pitaya".

11) The bibliography still doesn't follow the MDPI style. For example, journal names should use the abbreviated form of the journal title.

We revised the references according to your suggestion and the "agriculture-template", but some journals do not have abbreviated names.

 

All modifications are highlighted in the article.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng

Reviewer 5 Report

I am satisfied with the revision.

ok

Author Response

Dear Reviewer,

 

It is a great honor to have your precious time and expertise to review our manuscript.

Please allow us to express our sincerest thanks once again.

 

Yours truly,

Ruiwen Peng
