Article
Peer-Review Record

Recognition of Effective Co-Teaching Practices by Interdisciplinary Pre-Service Candidates

Trends High. Educ. 2024, 3(4), 960-977; https://doi.org/10.3390/higheredu3040056
by Shawnee Wakeman *, Holly N. Johnson, Khadija Ouedraogo and Kristin Sinclair
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 16 September 2024 / Revised: 29 October 2024 / Accepted: 13 November 2024 / Published: 15 November 2024

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The introductory part of the paper is thorough and well organized. Several references are new and bring new insights into this field. In addition, the theoretical background provides a solid basis for this research and its necessity.

 

Your study has one research question: To what degree do pre-service interdisciplinary groups of undergraduate education candidates recognize co-teaching skills and behaviors in practice? (lines 171-173) It is clear and concise, and it articulates the conducted research well.

 

Chapter 2, Materials and Methods, introduces the participants, setting, measure, design, data collection, researcher positionalities, and analysis. I am not sure this order is the best possible, and I suggest you reconsider it. Perhaps you could introduce the context and design first, then the participants and data collection methods, then the measures for data gathering and the codes for analysis. I am not certain this is the best way to structure the section, but the current order is difficult to follow. At minimum, the analysis must be introduced so that the reader understands right away that you conduct two analyses: first the quantitative one, in which students' observations are compared with the researchers' observations of the videos, and second the analysis of the conversations the students had while trying to reach a consensus. I did not fully understand why the goal was to reach a consensus.

 

There are only 33 participants in the study, and most of them are women, but they represent several fields of education and nationalities and are divided into meaningful small groups for the duration of the study. The research setting is university coursework spanning two different courses, and the measure the students used is the Co-teaching Core Competencies Observation Checklist. The participants watched two videos and marked their observations on the checklist, where some of the indicators related to events seen and some to events heard. The conversations these small groups had were recorded, and the student summaries were assessed as part of their course performance. All the indicators the students used are listed in Table 1.

 

The data collection process is carefully documented, as are the quantitative and qualitative analyses. The only challenge, in my view, is the tables. I am not sure Tables 1 and 2 are necessary in the methods section. Perhaps you could include only brief lists of the indicators and codes here and present the complete tables later, in the results section, where the reader really needs them.

For example, in the quantitative analysis of the results section, the reader must flip back and forth to work out which indicator you mention, such as indicator 11 on line 351. What was it?

Moreover, Table 3 is really difficult for the reader to understand, and its informational value is weak in my opinion. However, if you move Table 1 here and add a column that also contains these percentages, Table 3 could perhaps be excluded and replaced with this new version of Table 1.

 

I also suggest that you move Table 2 to the qualitative analysis part of the results section. That would help the reader keep this critical information in mind. Perhaps you could then add a column that summarizes the main finding for each of the nine codes.

I also wondered why the subtitles in the results section differ from those codes. For example, code 5, Couldn't evaluate the practice (didn't want to be too harsh), has changed into the subtitle Didn't feel comfortable accurately rating the video.

 

Moreover, although the research arrangements are clear, I would have liked to read some kind of brief description of the videos. What kinds of teaching and learning episodes were they? At first I did not even understand that the videos differed from each other, with one being a good example and the other not so good. Was that the case? Could you elaborate on this a bit more?

 

In the results section there are some sentences describing the analysis that could be moved to the analysis section.

 

The conversations you have included in the results section are nice and informative, though some of them are quite long; perhaps you could shorten them a bit. I was also a little confused that you seemed to want to make comparisons between the groups. If you want to compare, for example, the effect of group composition on the findings, then the reader must know a bit more about the groups.

 

The discussion section was clear, and I enjoyed your suggestions for policymakers, practitioners, and researchers.

 

 

Author Response

Author Responses to Reviewers’ Feedback on the Manuscript   

Recognition of Effective Co-Teaching Practices by Interdisciplinary Pre-Service Candidates

 

We would like to thank the reviewers for their thoughtful comments and suggestions. We have addressed the reviewers’ comments and outlined our responses below.

 

Reviewers’ Comments

Response

Pages

Reviewer 1:

   
  • Chapter 2, Materials and Methods, introduces the participants, setting, measure, design, data collection, researcher positionalities, and analysis. I am not sure this order is the best possible, and I suggest you reconsider it. Perhaps you could introduce the context and design first, then the participants and data collection methods, then the measures for data gathering and the codes for analysis. I am not certain this is the best way to structure the section, but the current order is difficult to follow. At minimum, the analysis must be introduced so that the reader understands right away that you conduct two analyses: first the quantitative one, in which students' observations are compared with the researchers' observations of the videos, and second the analysis of the conversations the students had while trying to reach a consensus. I did not fully understand why the goal was to reach a consensus.

Per formatting guidelines, content in the Materials and Methods section follows a common sequence, presenting information in chronological order. Additionally, clarifying language has been added to both the Measure and Design sections to address the additional feedback provided.

pp. 4-6

  • The only challenge, in my view, is the tables. I am not sure Tables 1 and 2 are necessary in the methods section. Perhaps you could include only brief lists of the indicators and codes here and present the complete tables later, in the results section, where the reader really needs them. For example, in the quantitative analysis of the results section, the reader must flip back and forth to work out which indicator you mention, such as indicator 11 on line 351. What was it? Moreover, Table 3 is really difficult for the reader to understand, and its informational value is weak in my opinion. However, if you move Table 1 here and add a column that also contains these percentages, Table 3 could perhaps be excluded and replaced with this new version of Table 1.

Thank you so much for providing this feedback. We agree that changes could and should be made to improve clarity and add value to the information being presented. Changes have therefore been made to Table 3 to reflect the recommended revisions.

p. 10 Table 3

  • I also suggest that you move Table 2 to the qualitative analysis part of the results section. That would help the reader keep this critical information in mind. Perhaps you could then add a column that summarizes the main finding for each of the nine codes.

I also wondered why the subtitles in the results section differ from those codes. For example, code 5, Couldn't evaluate the practice (didn't want to be too harsh), has changed into the subtitle Didn't feel comfortable accurately rating the video.

To ensure clarity in our reporting, we kept Table 2 and included titles of codes and corresponding subtitles as headings in the Results section.

pp. 13-17

  • Moreover, although the research arrangements are clear, I would have liked to read some kind of brief description of the videos. What kinds of teaching and learning episodes were they? At first I did not even understand that the videos differed from each other, with one being a good example and the other not so good. Was that the case? Could you elaborate on this a bit more?

Additional information has been included in the narrative to describe and elaborate on the two videos used in the collaborative assignment examining co-teaching practices.

p. 5

  • In the results section there are some sentences describing the analysis that could be moved to the analysis section.

The sentences at the beginning of the Qualitative section were included to provide context for the information that follows.

p. 13

  • The conversations you have included in the results section are nice and informative, though some of them are quite long; perhaps you could shorten them a bit. I was also a little confused that you seemed to want to make comparisons between the groups. If you want to compare, for example, the effect of group composition on the findings, then the reader must know a bit more about the groups.

Thank you for this feedback. The authors believe the length of the quotes is necessary to provide the reader with context for each identified code and sub-code.


Additionally, descriptions of the groups were included under the Participants heading in the Materials and Methods section. Please see p. 4 for this information.

pp. 13-17; p. 4

Reviewer 2: 

   
  • Introduction: On page 3, in the paragraph beginning on line 117, you mention that there are few programs for pre-service teachers that include direct training and support in co-teaching with an in-service teacher. I encourage you to name a few studies that are exceptions to this, or possibly to include an entire paragraph summarizing previous research that has provided this type of support. Research to consult could include Gallo-Fox & Scantlebury (2015, 2016); Gallo-Fox & Stegeman (2020); Guise, Hegg, Hoellwarth, & O'Shea (2022); Scantlebury, Gallo-Fox, & Wassell (2008); and Soslau et al. (2018). I think adding this research would situate your study in the work that has already been done, and you may be able to connect some of your implications to these previous studies.

We appreciate the example studies. Some of the studies provided involved student teachers in general education (science or early childhood) working with their clinical educator, a point we made on page 3 of the introduction. We used some of the references provided by the reviewer to reiterate the point about the limited co-teaching experiences available to pre-service interdisciplinary candidates.

pp. 3, 23

  • Results, Quantitative: As I read, I found the two paragraphs (Pass rates by group and Pass rates by indicator) difficult to follow. I found myself re-reading and was still confused; all of the different numbers (e.g., groups, indicators, percentages) made it difficult to understand these findings. Furthermore, Table 3 was difficult to understand. I also wondered how the scores on each indicator presented in Table 3 compared to the experts' scores. I'd encourage you to consider revising, possibly using a format other than a table (perhaps a figure) to display your quantitative data to help with readability.

Please see the comments above regarding the edits made to Table 3 to reflect these suggested changes.

p. 10 Table 3

  • Editing recommendation: When presenting the qualitative data, consider adding, in parentheses, a brief description of the indicator to remind your reader of what it was when it is not clear from the transcript itself. For example, on page 12, line 351, indicator 11 is mentioned, but it is not clear from the transcript what the focus of indicator 11 was. Page 13, line 416 may be another place where a reminder of what indicator 9 is might be helpful.

Additionally, codes have now been included in parentheses where the narrative about each code is presented.

pp. 13-17

 

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

Thank you for sharing your work and for highlighting the importance of teacher preparation programs creating opportunities for collaboration between pre-service general and special education teachers. I appreciate your article's focus on implementing intentional co-teaching and collaboration experiences across courses and programs not only to improve skills in inclusive and equitable practices but also to hopefully see a positive impact on teacher retention and job satisfaction. Your article is well written and provides helpful details regarding your study's methodology and your positioning as researchers. 

Below are a few recommendations for revisions:

1. Introduction: On page 3, in the paragraph beginning on line 117, you mention that there are few programs for pre-service teachers that include direct training and support in co-teaching with an in-service teacher. I encourage you to name a few studies that are exceptions to this, or possibly to include an entire paragraph summarizing previous research that has provided this type of support. Research to consult could include Gallo-Fox & Scantlebury (2015, 2016); Gallo-Fox & Stegeman (2020); Guise, Hegg, Hoellwarth, & O'Shea (2022); Scantlebury, Gallo-Fox, & Wassell (2008); and Soslau et al. (2018). I think adding this research would situate your study in the work that has already been done, and you may be able to connect some of your implications to these previous studies.

2. Results, Quantitative: As I read, I found the two paragraphs (Pass rates by group and Pass rates by indicator) difficult to follow. I found myself re-reading and was still confused; all of the different numbers (e.g., groups, indicators, percentages) made it difficult to understand these findings. Furthermore, Table 3 was difficult to understand. I also wondered how the scores on each indicator presented in Table 3 compared to the experts' scores. I'd encourage you to consider revising, possibly using a format other than a table (perhaps a figure) to display your quantitative data to help with readability.

Editing recommendation: When presenting the qualitative data, consider adding, in parentheses, a brief description of the indicator to remind your reader of what it was when it is not clear from the transcript itself. For example, on page 12, line 351, indicator 11 is mentioned, but it is not clear from the transcript what the focus of indicator 11 was. Page 13, line 416 may be another place where a reminder of what indicator 9 is might be helpful.

Author Response


Author Response File: Author Response.pdf
