Article
Peer-Review Record

Examining Master’s Students’ Success at a Hispanic-Serving Institution

Trends High. Educ. 2025, 4(1), 5; https://doi.org/10.3390/higheredu4010005
by Kenneth John Tobin 1,*, Jacinto De La Cruz Hernandez 1, José R. Palma 2, Marvin Bennett 1 and Nandita Chaudhuri 3
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 2 August 2024 / Revised: 7 January 2025 / Accepted: 9 January 2025 / Published: 15 January 2025

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

Overview

The article examines factors that affect master's students' success. The research focuses on how predictors of student success changed as learning design was altered by the COVID-19 pandemic and as several educational programs transitioned to an accelerated format.

General comments

1. The main concerns are about the correctness of the statistical analysis regarding the impact of the factors "COVID-19" and "Transition to Accelerated Programs" on student success:

1.1. Tables 3-5 present the results of applying Student's t-test to four samples. The results seem unreliable because, first, the initial samples are divided into subgroups (for example, by discipline) and, second, several hypotheses are tested simultaneously regarding differences in the variables "GPA," "Term Count," "Course Load," and "Course Drop Rate." This design raises the issue of multiple hypothesis testing, yet the authors neither mention the problem nor state whether they applied any correction for multiple testing.

1.2. From lines 215-235, we can conclude that the authors fitted logistic regression only once for each of the four groups of students. The observed differences in the statistical significance of several predictors may not be replicated in models fitted on new data. It is important to address this issue or, at the very least, mention it as a limitation of the study in the Discussion section.

1.3. Unlike the pre-accelerated phase, the accelerated online phase includes the COVID-19 pandemic period. Thus, the impact of the factor 'Transition to Accelerated Programs' on student success cannot be estimated in isolation. I recommend either changing the experimental design or providing a rationale for why the 'COVID-19' factor is not relevant in this case.

2. The article would benefit from moving the small literature reviews (lines 289-304, 311-315, 323-329, 342-350) from the Discussion section to the Introduction. This will provide the reader with a better understanding of the context of the study.

Specific comments:

1. For tables 3-5, it should be clarified what values are recorded in the columns. Are these the average values of the variables and the number of students in the corresponding samples?

2. In Conclusion, it is stated that the decline in GPA for one of the educational programs is caused by admitting students without undergraduate degrees to the program. However, the article does not provide justifications for this claim. This statement should be reframed as a hypothesis, one of the possible reasons for the decline in GPA.

Author Response

General comments

1. The main concerns are about the correctness of the statistical analysis regarding the impact of the factors "COVID-19" and "Transition to Accelerated Programs" on student success:

 

1.1. Tables 3-5 present the results of applying Student's t-test to four samples. The results seem unreliable because, first, the initial samples are divided into subgroups (for example, by discipline) and, second, several hypotheses are tested simultaneously regarding differences in the variables "GPA," "Term Count," "Course Load," and "Course Drop Rate." This design raises the issue of multiple hypothesis testing, yet the authors neither mention the problem nor state whether they applied any correction for multiple testing.

Response: We have applied the Bonferroni correction when performing multiple t-tests on the same populations, to account for multiple hypothesis testing. Furthermore, we modified our methodology to use less disaggregation, which keeps group sizes large enough for reliable results and makes the analysis more focused.
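As a rough illustration of the kind of adjustment described, here is a minimal Python sketch (hypothetical data structures and variable names; the authors' actual implementation is not part of this record):

```python
# A minimal sketch, with hypothetical names, of one Student's t-test per
# outcome variable with the family-wise error rate controlled by a
# Bonferroni adjustment.
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

def compare_phases(pre, post, variables, alpha=0.05):
    """pre/post map each variable name to its per-student values in that phase."""
    pvals = [ttest_ind(pre[v], post[v]).pvalue for v in variables]
    # Bonferroni: reject only where p < alpha / len(variables)
    reject, p_adj, _, _ = multipletests(pvals, alpha=alpha, method="bonferroni")
    return dict(zip(variables, zip(reject, p_adj)))

# compare_phases(pre_accelerated, accelerated_online,
#                ["GPA", "Term Count", "Course Load", "Course Drop Rate"])
```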

 

1.2. From lines 215-235, we can conclude that the authors fitted logistic regression only once for each of the four groups of students. The observed differences in the statistical significance of several predictors may not be replicated in models fitted on new data. It is important to address this issue or, at the very least, mention it as a limitation of the study in the Discussion section.

Response: A disclaimer for this limitation has been included in the discussion on lines 520 to 528. The logistic regression model is exploratory for the purposes of identifying relevant predictors. We believe that our adjusted t-tests bear out the changes in the variable predictors over the modality transitions.
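For readers unfamiliar with this setup, a minimal sketch of fitting one exploratory logistic regression per program group might look as follows (column and variable names are assumptions, not taken from the manuscript):

```python
# A minimal sketch of one exploratory logistic regression per program group:
# the model flags candidate predictors of completion rather than serving as
# a confirmatory or predictive model.
import pandas as pd
import statsmodels.api as sm

PREDICTORS = ["gpa", "term_count", "course_load", "course_drop_rate"]

def fit_group_model(group: pd.DataFrame):
    X = sm.add_constant(group[PREDICTORS])
    y = group["completed"]  # 1 = graduated, 0 = discontinued
    return sm.Logit(y, X).fit(disp=False)

# One model per program type (e.g., pre-accelerated vs. accelerated online):
# for name, group in students.groupby("program_type"):
#     print(name, fit_group_model(group).pvalues.round(3))
```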

 

1.3. Unlike the pre-accelerated phase, the accelerated online phase includes the COVID-19 pandemic period. Thus, the impact of the factor 'Transition to Accelerated Programs' on student success cannot be estimated in isolation. I recommend either changing the experimental design or providing a rationale for why the 'COVID-19' factor is not relevant in this case.

Response: A rationale has been added to the revised manuscript on lines 142 to 144. While the pandemic massively impacted every sector of society, it did not change how accelerated online classes were delivered. Our experimental design focuses on the modality changes that define the four program types examined, not on COVID-19 as an isolated factor.

 

2. The article would benefit from moving the small literature reviews (lines 289-304, 311-315, 323-329, 342-350) from the Discussion section to the Introduction. This will provide the reader with a better understanding of the context of the study.

Response: With respect, we disagree with this suggestion, as this literature puts the specific findings into context. The rewritten introduction places the study into an overall context, and the discussion focuses more on specifics. We believe that, as framed, the revised manuscript provides the best possible treatment of the topic.

 

Specific comments:

1. For tables 3-5, it should be clarified what values are recorded in the columns. Are these the average values of the variables and the number of students in the corresponding samples?

Response: The data tables have been extensively revised, and it is now made clear that the values shown are variable averages. In addition, the number of students in each grouping is now indicated.

 

2. In Conclusion, it is stated that the decline in GPA for one of the educational programs is caused by admitting students without undergraduate degrees to the program. However, the article does not provide justifications for this claim. This statement should be reframed as a hypothesis, one of the possible reasons for the decline in GPA.

Response: In the revised manuscript, we no longer make such a claim about the GPA of the Business Administration (BA) program, but we do find that BA students without undergraduate degrees in BA have lower GPAs than students with such degrees (lines 484 to 500).

Reviewer 2 Report

Comments and Suggestions for Authors

Thank you for the opportunity to review your manuscript! Your work provides valuable insights into graduate student persistence, particularly in the context of an HSI. I found your research to be quite timely and relevant, especially considering the ongoing effects of the COVID-19 pandemic and the increasing reliance on online learning opportunities.

 

There are just a few areas, however, where I believe some improvements could strengthen the paper even further.

The author(s) did a wonderful job positioning their research within the broader literature on student persistence; however, including information on the unique challenges faced by HSIs would really help the contextualization of the study (e.g., more specific educational frameworks or policies relevant to HSIs).

The research design is clear, and the research questions align well with the design. However, there are some concerns that should be addressed:

The hypotheses should be framed more clearly. While the author(s) do describe the expected outcomes, explicitly stating the hypotheses upfront would make the direction of the study much clearer. Clearly laying out the null and alternative hypotheses, especially for logistic regression models, would also add more depth and rigor to the methodology.

 It would be helpful to provide more clarity on how decisions were made about which variables to include or exclude. This would make the study more transparent and credible, helping readers understand why certain variables were chosen and how they might influence the results.

Even though multicollinearity checks are mentioned, there's not enough detail on the model's assumptions or how they were validated. It’s important to explain this to make sure the model is reliable and the results are accurate.

The discussion is well-structured and aligns with the flow of the research design and findings. The interpretation of the logistic regression results is clear. There are, however, some areas of concern that should be addressed:

The discussion around pandemic-related grade inflation could use a bit more depth. While the increase in GPA for discontinued students is mentioned, it would be helpful to consider other possible explanations, like student resilience or changes in grading criteria during the pandemic.

The practical recommendations from the findings should be given more focus. The author(s) mention some implications for program design, but expanding this with more actionable recommendations for HSIs facing similar transitions would be really useful.

The conclusions should be tied more closely to broader trends in higher education, such as the growing reliance on online programs and the challenges of supporting non-traditional students. This would help place the findings in a larger context.

Overall, the conclusions mostly align with the study's results and focus on addressing the two main research questions. However, there are a few areas that could be improved:

It would be helpful to offer more concrete recommendations for HSIs, especially those dealing with similar challenges in online program delivery or post-pandemic academic environments. This could include exploring whether better support systems are needed for students transitioning to online or accelerated learning formats, with a particular focus on students from underrepresented backgrounds.

It would also be valuable to tie the findings more clearly back to the persistence theories discussed in the literature review. This would show how the results either confirm, challenge, or expand on existing theories about student persistence, particularly for non-traditional graduate programs.

 

 

Author Response

The author(s) did a wonderful job positioning their research within the broader literature on student persistence; however, including information on the unique challenges faced by HSIs would really help the contextualization of the study (e.g., more specific educational frameworks or policies relevant to HSIs).

The research design is clear, and the research questions align well with the design. However, there are some concerns that should be addressed:

 

1. The hypotheses should be framed more clearly. While the author(s) do describe the expected outcomes, explicitly stating the hypotheses upfront would make the direction of the study much clearer. Clearly laying out the null and alternative hypotheses, especially for logistic regression models, would also add more depth and rigor to the methodology.

Response: In the revised manuscript this has been done both at the end of the introduction (lines 73 to 83) and in the methods section (lines 206 to 211 and 235 to 238).
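For context, the hypothesis pair the reviewer requests typically takes this generic form for each logistic regression coefficient (the notation below is illustrative, not quoted from the manuscript):

```latex
% For each predictor x_i (e.g., GPA) in the completion model:
% H0 says the predictor has no association with the log-odds of completion.
H_0\colon \beta_i = 0 \quad \text{vs.} \quad H_1\colon \beta_i \neq 0,
\qquad \text{where } \log\frac{p}{1-p} = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k
```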

 

2. It would be helpful to provide more clarity on how decisions were made about which variables to include or exclude. This would make the study more transparent and credible, helping readers understand why certain variables were chosen and how they might influence the results.

Response: Section 4.2 (lines 160 to 196) describes in detail the variables used in this study. All of the variables selected have backing in the literature. We did our best to select the variables most relevant for tracking master's student success, since master's populations have characteristics that differ from undergraduate populations.

 

3. Even though multicollinearity checks are mentioned, there's not enough detail on the model's assumptions or how they were validated. It's important to explain this to make sure the model is reliable and the results are accurate.

Response: The revised version now features further discussion of the assumptions of the logistic regression model in the methods (lines 216 to 221) and results (lines 306 to 311) sections. This includes further discussion of multicollinearity and independence of observations. Discussion of linearity between the variables and the logit is omitted, as the model is exploratory and the focus of the paper is on the variable comparisons.
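As an illustration of the kind of multicollinearity check mentioned, a minimal variance-inflation-factor sketch in Python could look like this (column names are assumptions):

```python
# A minimal sketch of a variance-inflation-factor (VIF) check for
# multicollinearity among the candidate predictors.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(df: pd.DataFrame, predictors: list) -> pd.Series:
    X = sm.add_constant(df[predictors])
    # Rules of thumb vary; VIFs above roughly 5-10 flag problematic collinearity.
    return pd.Series({col: variance_inflation_factor(X.values, i)
                      for i, col in enumerate(X.columns) if col != "const"})

# vif_table(students, ["gpa", "term_count", "course_load", "course_drop_rate"])
```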

 

The discussion is well-structured and aligns with the flow of the research design and findings. The interpretation of the logistic regression results is clear. There are, however, some areas of concern that should be addressed:

 

1. The discussion around pandemic-related grade inflation could use a bit more depth. While the increase in GPA for discontinued students is mentioned, it would be helpful to consider other possible explanations, like student resilience or changes in grading criteria during the pandemic.

Response: While the focus of this work is explicitly on academic measures of student success, we have added some information along the lines suggested by the reviewer to this paragraph (lines 444 to 460).

 

2. The practical recommendations from the findings should be given more focus. The author(s) mention some implications for program design, but expanding this with more actionable recommendations for HSIs facing similar transitions would be really useful.

Response: The revised work now includes HSI-focused recommendations (lines 537 to 546).

 

3. The conclusions should be tied more closely to broader trends in higher education, such as the growing reliance on online programs and the challenges of supporting non-traditional students. This would help place the findings in a larger context.

Response: The implications of the conclusions now allude to broader trends in education in terms of the impact of accelerated online programs on duration of study and course load (lines 569 to 574). We also discuss GPA and course drop rate in relation to the concern that accelerated online programs may be sacrificing educational quality (lines 574 to 578).

 

Overall, the conclusions mostly align with the study's results and focus on addressing the two main research questions. However, there are a few areas that could be improved:

1. It would be helpful to offer more concrete recommendations for HSIs, especially those dealing with similar challenges in online program delivery or post-pandemic academic environments. This could include exploring whether better support systems are needed for students transitioning to online or accelerated learning formats, with a particular focus on students from underrepresented backgrounds.

Response: An expanded discussion of HSIs has been added throughout the revised manuscript. Specific recommendations for HSIs are included on lines 537 to 546.

 

2. It would also be valuable to tie the findings more clearly back to the persistence theories discussed in the literature review. This would show how the results either confirm, challenge, or expand on existing theories about student persistence, particularly for non-traditional graduate programs.

Response: The revised conclusions stress the importance of academic variables which aligns with the limited literature focused on master’s student success. All theoretical frameworks stress the importance of GPA as a predictor of persistence. The revised conclusions underscore two examples where this variable was particularly significant.

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

The authors have made significant improvements to their statistical analysis, resulting in findings that appear more rational. They have also addressed the majority of my comments comprehensively. However, I would like to highlight my concern regarding comment 1.3.

In experimental design, it is crucial to identify and account for all potential sources of variation affecting the response variable. If certain identified factors cannot be controlled, it is essential to acknowledge this limitation within the study.

If I have accurately interpreted the authors' response, they do not consider COVID-19 to be a potential source of variation for GPA, Term Count, Course Load, and Course Drop Rate, asserting that the pandemic did not impact the delivery of accelerated online classes. I must respectfully disagree with such an approach. The Course Drop Rate may indeed be influenced by pandemic-related factors, not solely through changes in learning conditions but also via motivational aspects affecting student success, the emergence of additional organizational challenges, and ultimately, health issues. Therefore, I believe it is crucial to emphasize that it is not feasible to evaluate the influence of the treatment in isolation from the pandemic factor. This represents a limitation in the applicability of the study's results and should be acknowledged as such.

Comments on the Quality of English Language

I recommend refining the English language, especially in the newly written text.

Author Response

1.3. Unlike the pre-accelerated phase, the accelerated online phase includes the COVID-19 pandemic period. Thus, the impact of the factor 'Transition to Accelerated Programs' on student success cannot be estimated in isolation. I recommend either changing the experimental design or providing a rationale for why the 'COVID-19' factor is not relevant in this case.

Response: We have added new data including a new table, results, and discussion to address this concern.
